The Internet Wasn’t Born Yesterday—A Brief History

Last Updated: Tuesday, July 4, 2023

Kids these days don’t know what “the Internet” is; they just use it.

But there was a time, not so long ago, when computers and the internet were shorthand for “I am a nerd,” and the phrase “I'm surfing the net” could be uttered... without irony.

The always-on, ubiquitous, and mainstream-accepted Internet of today is the product of a long and winding road. The writings of a World War II-era visionary, Cold War military research, and a physics laboratory in Switzerland all played their part in creating it.

1945 and the dream of the Internet

In July 1945, as World War II was nearing its end, The Atlantic published an article about the future of technology, and, more specifically, a hypothetical information database called a “memex”—a mechanized private file archive and library. It was written by American science administrator and engineer Vannevar Bush, and it was titled “As We May Think.”

During the war, Bush had led the USA’s Office of Scientific Research and Development (OSRD), a centralized research organization designed to make innovation happen faster. Bush effectively led wartime scientific research in America and consequently became a significant public figure (in 1944, he even appeared on the cover of TIME).

“As We May Think” was read by many, and sparked many imaginations, both at the time of its publication and in subsequent decades.

Bush’s memex concept envisioned a future record-keeping system that mimics the associative processes of the human mind, yet unlike these processes is available as a permanent record. Such a system would allow the user to navigate information by building trails, and then leave that trail available to review and consult later. The enormous, ever-expanding amount of information in the world would thus be made manageable.

Sounds a bit like... hypertext, and the internet more generally, doesn’t it? The major difference, of course, is that Bush was thinking mechanically, so his essay envisioned what ended up being an anachronistic future dependent on moving parts (rather than hidden circuits).

Post-war, Bush’s ideas started to take root in American research culture. Beginning in the 1950s, stateside computer scientists worked on building a distributed network that could link computer terminals without the need for telephone-operator-style circuit switching.

1960s-1980s: Building the first Internet

In 1962, young and plucky MIT Ph.D. student Leonard Kleinrock published his thesis, "Information Flow in Large Communication Nets." In it, he provided a mathematical theory for packet switching, essentially a form of traffic control in which data is broken into small, independently routed packets that travel from point A to point B and are reassembled at their destination. The idea was foundational for Internet research.
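To make the idea concrete, here is a minimal, purely illustrative sketch in Python (the function names and the tiny eight-character packet size are invented for this example): a message is chopped into numbered packets, the packets may arrive out of order after taking different routes, and the receiver puts them back together by sequence number.

```python
import random

def to_packets(message: str, size: int = 8):
    """Split a message into (sequence number, chunk) packets."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return list(enumerate(chunks))

def reassemble(packets):
    """Sort packets by sequence number and rejoin the data."""
    return "".join(chunk for _, chunk in sorted(packets))

message = "Information flows in large communication nets."
packets = to_packets(message)

random.shuffle(packets)                 # packets arrive out of order...
assert reassemble(packets) == message   # ...but order is recoverable
print(reassemble(packets))
```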

The next year, U.S. government-funded Internet research took off in the form of the Advanced Research Projects Agency Network (ARPANET). ARPANET was designed to ensure reliable communications for the government and military during the Cold War, so that the overall network could keep functioning even if one of its nodes was knocked out.

The initiative was spearheaded by J.C.R. Licklider (the first director of ARPA’s Information Processing Techniques Office), with key support from Ivan Sutherland, Bob Taylor, and Paul Baran of the RAND Corporation.

In 1969, the first version of ARPANET was launched, and the Internet as we (sort of) know it was born. To inaugurate the groundbreaking service, an online message was sent from UCLA to the Stanford Research Institute. It was meant to be “login,” but the network crashed after the sender typed two letters. The incomplete message arrived anyway, meaning that technically the first internet transmission ever sent was “lo.”

Lo and behold, a new era had begun. In 1971, Ray Tomlinson invented networked electronic mail, a.k.a. email, allowing users to send messages to addresses on other machines (75% of ARPANET’s traffic was made up of email by 1975).

Before ARPANET, each long-distance computer connection required its own dedicated terminal. To reach machines at both MIT and UC Berkeley, for example, an office needed two separate terminals, one for each connection.

In the office, you’d have to wheel your chair over to the right terminal to send a message. Basically, you’d be jumping back and forth between long-distance texting and doing your other work—God forbid you wanted to have a three-way conversation.

To get around this limitation, ARPANET used a host-to-host protocol called Network Control Protocol (NCP).

This technology was improved upon in 1978, when Robert Kahn and Vint Cerf of the U.S. Defense Advanced Research Projects Agency (DARPA) developed the Transmission Control Protocol/Internet Protocol (TCP/IP).

By 1982, the US Department of Defense had adopted TCP/IP for all military-related computer networking. Their networks were further improved by Paul Mockapetris’s invention of the Domain Name System (DNS) in 1983, a distributed naming system that made it easy to look up the IP address behind a human-readable name.
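To get a feel for what these two layers still do today, here is a small sketch using Python’s standard socket module (example.com is just a stand-in hostname): DNS resolves a human-readable name to an IP address, and TCP/IP then opens a reliable connection to that address.

```python
import socket

host = "example.com"  # any public hostname will do

# DNS: resolve the human-readable name to an IP address.
ip_address = socket.gethostbyname(host)
print(f"{host} resolves to {ip_address}")

# TCP/IP: open a reliable byte-stream connection to that address on port 80.
with socket.create_connection((ip_address, 80), timeout=5) as conn:
    print("Connected to", conn.getpeername())
```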

The rest of the 1980s saw TCP/IP adoption spread from academic and government use into the business world; a few notable early adopters were Digital Equipment Corporation (now a part of HP), IBM, and AT&T.

Almost four decades later, TCP/IP remains the standard protocol underlying all internet traffic.

Local networks: the lost civilization of BBS

Yet as TCP/IP snaked its way from the military-industrial complex to corporations and academic institutions, another form of modem-based networking was gaining popularity among the common folk.

This was the dial-up Bulletin Board System (BBS), the original form of computer-based social networking.

BBS functioned a lot like the bulletin board at the Student Union Building or a small-town coffee shop. You stuck a message or piece of information there, and another user came along and checked it out later.

All that was required to access a BBS was a modem, a computer, and a phone line.

The original serial modems used for BBS, like the popular Hayes Smartmodem introduced in 1981, were super-slow at 300 baud, while the last generation offered a 'whopping' 56 kbit/s transfer speed. There was no complex, layered networking, just the simple transfer of bits over a telephone line from point A to point B.

Theoretically one could connect to anyone with a phone number anywhere, but prohibitive long-distance charges generally ruled that out, so BBS communities tended to cluster by area code.

In the 1980s and early 1990s, there were tens of thousands of them.

BBS hosts, or 'SysOps,' dedicated what was often their only home computer to the cause of facilitating interactions between random strangers. These were hardcore days of one user per computer. Operating systems like DOS could not multitask; if you launched an application, whatever else you were running had to shut down first.

Improbably, a grand total of 497 BBS continued to survive as of 2019, although only a tiny portion still use old-school dial-up modems. For present-day hosts, the raison d'être is nostalgia and a desire to preserve internet history. BBS was a weird, wild, and gentler internet, after all: a place with no corporate presence, rife with chance encounters of a kind seldom seen online today.

The birth of the WWW and the mainstreaming of the Internet

“I just had to take the hypertext idea and connect it to the TCP and DNS ideas and ta-da! the World Wide Web.”

— Tim Berners-Lee

For the better part of a decade, the worlds of BBS and TCP/IP networks coexisted, not unlike Neanderthals and humans 45,000ish years ago. But an “extinction event” was on its way for BBS.

Its progenitor was a thirty-something British computer scientist named Tim Berners-Lee. Working at the European Organization for Nuclear Research (CERN) during the 1980s, he developed an internal data system for scientists there to share information. It initially went by the whimsical and quintessentially English name ENQUIRE, a nod to the Victorian handbook “Enquire Within upon Everything.”

CERN was (and is) a highly international organization, and such cosmopolitanism comes with its own set of challenges. When Berners-Lee arrived there in the early ’80s, he found people from different countries working on wildly different platforms that couldn’t communicate with one another: Macs, PCs, medium-sized computers running UNIX, big mainframes—it was a recipe for dysfunction.

One might have to learn new software, or write conversion programs to make files readable across devices. In the end, the most effective option might be to just walk over and ask someone what they’re working on.

Berners-Lee was deeply annoyed by this status quo and decided to solve the problem once and for all. To do so, he Frankensteined a new network out of a bunch of existing parts.

He took TCP/IP, the Domain Name System (DNS), and email, all of which had been around for quite some time, and unified everything with hypertext, which made it possible to click links from one document to another.

Hypertext had, up until this point, never been used outside of one computer.

Ted Nelson had philosophized hypertext’s theory of interconnectivity back in the early 1960s but hadn’t linked it to the Internet. Douglas Engelbart’s 1968 “mother of all demos” demonstrated hypertext and hypermedia, but only in the context of a single computer.

Berners-Lee’s idea of sticking all this stuff together was also, he acknowledges, heavily influenced by Vannevar Bush’s all-in-one memex.

And so it was that the World Wide Web came into existence in 1989, run off of a server hosted on Berners-Lee’s NeXT Computer. In 1991, the World Wide Web was made available to global users outside of the CERN research program that developed it.
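Purely as an illustration of how those pieces fit together (this is a modern sketch using Python’s built-in http.server, not the software Berners-Lee wrote for that NeXT machine), the snippet below runs a tiny web server offering two hypertext pages that link to each other.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Two tiny hypertext documents that link to one another.
PAGES = {
    "/": b'<h1>Home</h1><p>Jump to the <a href="/physics">physics page</a>.</p>',
    "/physics": b'<h1>Physics</h1><p>Back to the <a href="/">home page</a>.</p>',
}

class HypertextHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            self.send_error(404, "No such document")
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Visit http://localhost:8000/ in a browser and follow the links.
    HTTPServer(("localhost", 8000), HypertextHandler).serve_forever()
```

Point a browser at http://localhost:8000/ and you can click back and forth between the two pages: in miniature, the loop of request, response, and clickable link that the Web is built on.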

Two years after that, in 1993, CERN made the underlying code of the World Wide Web available royalty-free, sparking the first big wave of general-audience adoption.

Mosaic takes the net to the masses

Finally, not only was the Internet a fully-fledged thing, but everyone (at least in theory) could be a part of it. The problem was that most people were busy doing other stuff in the early 90s, and didn’t really care. The Internet needed users if it was going to survive.

Thankfully, a software engineer named Marc Andreessen was working at the National Center for Supercomputing Applications (NCSA) on a product to bring the WWW to the mainstream. With money provided by the High Performance Computing Act (also known as the Gore Bill), he began binge-working on a new multimedia web browser with programmer friend Eric Bina.

It was called Mosaic.

Mosaic wasn’t the first browser with a graphical user interface (GUI). Erwise, created by four Finnish college students, gets to claim that honor. But Mosaic had enough user-experience-oriented innovations to set it apart from what came before.

Perhaps most crucially, Mosaic allowed images and text to be displayed on the same page. It was no longer necessary to click on a link to load an image on a separate page. Mosaic also allowed the user to click on in-text links, which were helpfully underlined in blue, and jump from page to page.

With this innovation, you could start on one page and keep clicking on stuff until you ended up far, far away. While one can still capture this sensation by going down a wormhole of related Wikipedia topics, back in 1993 this ability to “surf the web” to some unknown place was the exciting premise of the internet.

In January 1993, the Mosaic web browser was released: first for UNIX, and later in the year for Microsoft Windows and Mac. 80% of users were on Windows back then and had been left in the dark by previous technically complex, niche-targeted browsers. Now they could get online with an easy install.

In December, Mosaic got a splashy front-page piece in the Business section of The New York Times. The Internet was on its way to becoming “a thing.”

The Browser Wars begin

Following his entrepreneurial instincts, Andreessen left NCSA the same year Mosaic was released. He teamed up with Jim Clark, the founder of Silicon Graphics (SGI), and the pair set to work developing a new commercial browser called Netscape Navigator.

Released in 1994, it was a hit and quickly claimed an 80% market share.

The Internet was still only being used by about 1% of the world at this point, but it was quickly becoming part of the cultural zeitgeist. In 1995, Sandra Bullock starred in Internet hacker thriller The Net. Meanwhile, MTV did a painfully Gen X feature about getting “jacked into the internet.”

Then in August 1995, Microsoft’s Windows 95 operating system was released to enormous fanfare, backed by a huge $300 million marketing campaign featuring The Rolling Stones, Jay Leno, and cast members of Friends. It sold seven million copies in the first five weeks alone.

Crucially, it also came preloaded with Microsoft’s own free browser, Internet Explorer.

All of a sudden, Netscape had some serious competition to contend with. The so-called ‘First Browser War’ had begun.

Netscape saw its once-dominant market share shrink year after year as Microsoft’s free, proprietary browser got better and better. In November 1998, Netscape was acquired by online service provider AOL, and the browser never recovered its footing. Microsoft had won. By 2002, Internet Explorer held a whopping 96% share of the web browser market.

Epilogue: the post-90s internet and what’s to come

By 2000, 412.8 million people were online. By 2010, the number of users was just shy of 2 billion. As of 2019, there are 4.39 billion internet users spread across the globe, with a soaring number of new users in South Asia, East Asia, and Africa.

The Internet today is fast, cheap, and everywhere—piped into our phones by 3G, 4G, and very shortly 5G wireless networks, and readily available in buses, trains, airplanes, and deadbeat friends’ apartments.

What’s more, we have become completely reliant on it.

We need it to get a job, pay bills, keep in touch with people, find events, book restaurants… basically, we need it to participate in society in any meaningful way. Fittingly, the UN now declares Internet access to be a “human right.”

Most of us humans have a way of not thinking too hard about what’s around us day-to-day. That mentality extends into the realm of technology.

We fall into routines and see the current status quo as something fixed, when in fact all the gadgets and infrastructure that came before are irretrievably gone, and all the products of today are slowly but surely being pushed toward obsolescence.

As we can see, the Internet’s place in our lives was very different not too long ago. The integrated, seamless (well, more or less) cloud of apps that we take for granted is not old. After all, smartphones have only been around for 12-ish years, with the launch of the original iPhone back in June 2007.

While today’s Internet is more useful than ever, it does have its downsides.

Tim Berners-Lee, for one, is pretty choked at what’s become of his creation in recent years. He was deeply alarmed by the 2018 revelation that Facebook had exposed data from 87 million users, without consent, to the political consulting firm Cambridge Analytica. The repeal of net neutrality in the US that same year set him off further.

Berners-Lee is now fighting to make the web decentralized again, and by doing so address some of the contemporary internet’s major concerns, namely privacy and personal data ownership. To that end, he’s working on a nifty little project at MIT called Solid.

Can Berners-Lee succeed in reviving the good things about the old Internet, namely its freedom, in the context of the advanced modern Internet? Only time will tell, of course, but if there’s any lesson in the history of the Internet, it’s that everything can change again, quickly and unexpectedly.

I want an internet where content businesses grow according to their quality, not their ability to pay to ride in the fast lane. I want an internet where ideas spread because they’re inspiring, not because they chime with the views of telecoms executives. I want an internet where consumers decide what succeeds online…

— Tim Berners-Lee
