It was on this day twenty years ago that Tim Berners-Lee, a physicist at the CERN laboratory in Switzerland, created the first-ever website and, with it, the first web server. Its subject? The WWW Project, which eventually grew to encompass the web as we know it today.
The WWW Project was a set of materials explaining what hypertext was and how it worked, how to create a webpage, how to search the web for information, and other details relevant to the project. The whole collection lived on the CERN server at info.cern.ch (domain names like that one pre-date the rise of the WWW by a fair margin).
The first web server was a NeXT computer used by Tim Berners-Lee. NeXT was the company Steve Jobs founded after leaving Apple; when he returned later on, the NeXTSTEP operating system the machine ran became the foundation of Apple’s modern OS X.
You can see a copy of the page as it was in 1992; the web really took off the next year (1993), when the Mosaic web browser was released and gained widespread popularity. Much like the web itself, Berners-Lee has gone on to bigger and better things. In 1994 he moved to MIT and founded the W3C, a consortium dedicated to setting and protecting the standards that govern how the web and websites are created and developed.
It’s been said that Berners-Lee regrets the “http://” found at the beginning of every URL, as it tends to confuse the layperson. Modern browsers have obviated the need to type it into the address bar; some, such as Google Chrome, decline even to show it.
These days, the web’s inventor is working on the next iteration of networked systems, a project he calls the Semantic Web. This model makes the data the web contains more easily read, understood and manipulated by machines, letting them accomplish tasks that currently require human intervention.
In 1999’s Weaving the Web, which recounts the creation of the WWW, Berners-Lee wrote:

“I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. A ‘Semantic Web’, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The ‘intelligent agents’ people have touted for ages will finally materialize.”