Computing Evolves. Part V: Jacked in (1990-1999)

Last Updated: Sunday, May 28, 2023

In 1984, author William Gibson released his cyberpunk opus, Neuromancer. The book predicted a world in which computers proliferated into every nook and cranny of society. It also introduced the term ‘cyberspace.’

In the 90s, Gibson’s blueprint of the future began to be filled in (albeit clumsily). 1991 was the year punk broke, and the Internet too. Irreverence and tech would be the order of the decade.

In part five of our seven-part series on the history of computers, we look at that edgiest era of all—the 90s. 

(If you missed Part IV, you can check it out here.)

Computing ‘goes mental’ (1990-1999) 

“Are you jacked into the Internet; are you one of those computer guys?”

—Some guy from MTV, interviewing David Bowie (circa 1995).

The 90s were arguably the most important decade in computing history. Over this ten-year span, computers dramatically matured in capabilities and gained ubiquitous mainstream acceptance. They finally did true multimedia, from music to video to (primitive) streaming.

At the same time, this was a deeply experimental, all-over-the-place era.

While today one can be forgiven for thinking there are basically only 4 or 5 companies building computers, in the 90s, the industry was more crammed than the clowniest clown car. Motivated by juicy profit margins on hardware sales, dozens of companies threw their hats in the ring: CompUSA, Gateway, NEC, Packard Bell, Quantex, Zenith...

In 1999, a company called Patriot even released a Hot Wheels computer. Somehow, it failed.

Mix-and-match PC rigs with aftermarket add-ons were all across the ‘computer rooms’ of America. People installed pricey CD burners in their tower computers, then stored the resulting bootleg mixes in CD binders. Young gamers experimented with ‘overclocking’ their CPUs to get more speed and installed the world’s first 3D graphics cards.

Open-source programs like Linux gained significant cult followings, and remotely distributed teams of developers, united by the Internet, built innovative software, got into flame wars on forums and modded their favorite games. 

There were a crazy number of computer magazines, and there were even ‘computer shows’—a lot like dog shows but with a lot more dad jeans and bootleg software. 

One thing that was ‘standard’ in this time was the general public’s choice of operating system. Microsoft Windows absolutely dominated the market. 

In the background of all this computer mayhem, and inarguably fueling it, was the rise of the Internet. In 1991, the World Wide Web was made available to global users outside of the CERN research program that developed it. Two years later, in 1993, CERN made the Web’s underlying code available royalty-free, sparking the first big wave of general-audience adoption across the Western world. 

Witness this incredible Gen X zeitgeist nugget of an interview featuring Coolio, David Bowie, and Moby weighing in on the state of the Internet in 1995. At one point, the edged-out MTV host describes websites as a “proliferation of special interest truck stops.” 

This was the decade that e-commerce giants Amazon and eBay got their start (in 1994 and 1995, respectively), convincing everyone that it’s totally fine to use a credit card online. Pets.com, Webvan.com, eToys.com, and hundreds of other so-called ‘dot-com’ companies were founded and mobbed by investors. 

The 90s also saw the first tentative steps toward truly mobile computing: consumer laptops arrived in the early part of the decade, and the internet came to phones at its close, when Japanese telecom giant NTT DoCoMo launched i-mode, the first mobile internet services platform, in 1999.

Meet the desktops that defined a decade

‘Wintel’ (Windows + Intel) PC computers were large and in charge in the 90s. And most of them were beige-grey, for reasons that shall remain forever shrouded in a grey-ish fog.

Apple spent most of this time treading water but delivered a company-saving desktop computer late in the decade.  

Commodore Amiga 3000

The start of the 90s was a murky, transitional time for home computing. 80s machines like the Commodore 64, Apple Macintosh, and other technically obsolete devices clung on by their fingertips. 

Commodore’s Amiga series was introduced in 1985. The first model, the Amiga 1000, had a multitasking GUI, stereo sound, and graphics modes that could display up to 4,096 colors simultaneously. It was a big success in the home computing market, with special strengths in creative software (animators and video graphics artists loved it) and video games.

Fast forward to 1990 and the Amiga 3000, a big technical improvement on its predecessors in the series. The high-end graphics workstation offered a 16 MHz Motorola 68030 CPU, 2 MB of RAM, and a full 32-bit architecture that dramatically increased processing speed. It also ran a new version of AmigaOS, Workbench 2.04, which was a significant UX improvement over the earlier 1.x releases that had shipped with the Amiga 1000 and 2000 models. 

The Amiga 3000 also had many expansion ports and connectors for attaching peripherals, from MIDI devices to scanners to CD-ROM drives.

Priced at $4,100 for computer and monitor ($8,050 in 2019), it was not exactly cheap. 
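For the curious, inflation-adjusted figures like this one can be reproduced with a simple consumer-price-index ratio. A minimal Python sketch, assuming approximate annual-average CPI values (the `CPI` table and `in_2019_dollars` helper are illustrative, not taken from the article):

```python
# A minimal sketch of the CPI-based inflation adjustment behind figures like
# "$4,100 ($8,050 in 2019)". The CPI values below are approximate annual
# averages and are an assumption of this sketch, not from the article.
CPI = {1990: 130.7, 1991: 136.2, 1994: 148.2, 1998: 163.0, 2019: 255.7}

def in_2019_dollars(amount, year):
    """Convert a historical US dollar amount to approximate 2019 dollars."""
    return round(amount * CPI[2019] / CPI[year])

print(in_2019_dollars(4100, 1990))  # Amiga 3000: roughly $8,000 in 2019 terms
```

Results will differ slightly from the article's quoted conversions depending on which CPI series and rounding you use.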

The Amiga’s sales were good for a couple of years, but it didn’t last. Home video game consoles like the Sega Genesis and Super Nintendo arrived in the late 80s and early 90s and ate up more and more of the gamer market. IBM PC-compatible devices also shrank in price over the same period, negating the last advantage the Amiga held over the competition: its affordability. 

Lastly, and perhaps most crucially, Commodore was having a tough time finding third-party developers, as software companies piled on to the ever-growing PC market.

Revenues and sales started to go into a fatal nosedive in 1992. Commodore followed up on the Amiga 3000 with the 4000 series that year, in a last-ditch effort. 

It didn’t work. Commodore went bankrupt in 1994.

IBM Aptiva 

The IBM Aptiva, released in September 1994, was the successor to the company’s PS/1 series, which had debuted in 1990. While not the fastest, cheapest, or best-selling computer, the Aptiva is in many ways the embodiment of mid-90s home computing. True to the IBM ethos of making accessible, middle-of-the-road products, it was designed as an out-of-the-box bundle: mouse, keyboard, and speakers were all included with the computer tower. 

The launch model was based on the Intel 80486 CPU (later models used Intel Pentium and AMD CPUs), came preloaded with MS-DOS and Windows 3.1 (later models shipped with Windows 95), and had built-in modem functionality. 

According to a TV commercial, you could even use an Aptiva to access something called “the 3-D Internet.”

The launch price was $1,800 ($3,110 in 2019) for the computer and peripherals and $330 ($570) for the cheapest monitor. 

When the series was discontinued in 2001, it marked the end of IBM’s career in the home computing market. Competitor companies, most prominently Compaq, Dell, and HP, had outmaneuvered it every step of the way in the 90s. IBM decided to concentrate on its Internet technology and e-commerce products. 

iMac G3

No computer quite defines those heady, final years of the 90s like the Apple iMac G3. 

This was Steve Jobs’ ‘bet-the-farm’ effort to save Apple from financial ruin. It was designed to be simple from both an aesthetic and a user experience standpoint, with a special emphasis on ease of Internet setup and access.

The colorful little computer was released into the wilds in August 1998.

The iMac G3 print ad promised 10 minutes from unboxing to ‘Interneting’. And, in crazy contrast to Apple’s current, Henry Ford-esque insistence on utilitarian minimalism, the computer was available in a wide range of funky colors.

The machine had fairly humble technical specs (a 233 MHz CPU, 32 MB of SDRAM, and a 4 GB hard drive), but it ran Mac OS 8, which offered quite a clean, intuitive user experience for the times. 

Its launch price was $1,299 ($2,040 in 2019), making it competitive in the economy-level market. Given that Apple had failed to compete in this segment throughout the 90s, this was key. 

The iMac G3 was a big success, selling like hotcakes. At the same time, many derided it as underpowered and super-heavy (it weighed 38 pounds), and the admittedly crappy mouse it came with was nicknamed “the hockey puck”. 

Laptops appear 

The first laptop ever, technically speaking, was the GRiD Compass, built by former Xerox PARC engineers at Grid Systems Corporation in 1982. But it was so expensive that only NASA, the military, and fancy-schmancy government people had access to it. 

In the 90s, however, miniaturization and dropping hardware costs made mass-market consumer laptops a real thing. 

PowerBook 100 laptop

In 1991, the PowerBook 100 was introduced, a.k.a. the world’s first “true” consumer laptop. The boxy little machine weighed 5 pounds, not much heavier than a lot of laptops today.

It featured a 16 MHz CPU, a 20 or 40 MB high-speed SCSI hard drive, 2 MB of RAM (8 MB on the top-spec models), a 3.5-inch floppy disk drive, and a 9-inch display. In terms of design, the device pioneered the set-back, ergonomic keyboard layout that basically all laptops since have followed. 

In case you were griping about the price of a 15-inch MacBook Pro with Touch Bar, it’s worth noting the PowerBook’s launch price of $2,300 (around $4,230 today).

In the product’s TV commercial, Kareem Abdul-Jabbar, a large man indeed, showed how it could be used on a plane. Portability like that, at the time, was pretty, pretty good value.

The PowerBook 100 proved to be a mainstream commercial success, capturing 40% of the market and $1 billion in revenue and paving the way for further developments in the laptop market. 

ThinkPad

The first IBM ThinkPad was a note-taking tablet released in 1991, but it was a commercial flop. Nonetheless, the ThinkPad name was pressed into service again the following year, this time to brand the company’s first laptop computer. 

Designed at IBM Yamato Facility in Japan under the aegis of engineer Arimasa Naitoh, the first ThinkPad notebook was released in October 1992. It was a big hit. 

It also won hundreds of design awards. That’s not too surprising, given that designer Richard Sapper, who created the classic Tizio lamp for Artemide in 1972, came up with the computer’s distinctive body style. 

Priced at $2,375 ($4,340 in 2019), the original economy-level model featured a 25 MHz CPU, 4 MB of RAM, a 9.5-inch LCD display, a 3.5-inch floppy disk drive, and an 80 MB internal hard drive. 

Apple sinks, Microsoft dominates

In 2019, Macs are the computer of choice for the super cool, the globally mobile, the creative class, the most erudite of tenure-track academics, and… everyone else. Apple’s products are ubiquitous among urbanites, and iPhones occupy something like 40% of the smartphone market share in North America. Apple is also one of the world's top 3 highest-valued companies.

Meanwhile, arch-rival Microsoft is doing well too. The company frequently swaps first, second, and third place in the list of the world’s most valuable companies, along with Apple and Amazon. Windows may have slipped from a position of total global domination, but it still occupies just under 90% of total market share worldwide.

Microsoft’s ongoing success isn’t surprising in the least. But in the 90s, the notion that Apple would one day be pumping out status-symbol computers and dominating the phone industry with an icy, metallic fist was unthinkable. 

By 1997, the company was on the verge of annihilation and, in a soap opera twist, was only saved by the Janus-faced intervention of Microsoft.

Oh, and Steve Jobs, right. 

How Apple hit rock bottom and barely survived

In the 80s, Apple looked like it was going to be everywhere. The release of the 1984 Apple Macintosh and the first mass-market GUI was a ‘big bang’ moment in computing history. But in the late 80s and early to mid-90s, the company started dying. 

After Steve Jobs and Steve Wozniak left Apple in 1985, it was led by former Pepsi president John Sculley, who reorganized the company as a rational corporate venture that avoided Jobs’ strategy of expensive, risky big moves. 

The obvious problem with this was that Apple’s whole brand was based on being different. Sculley’s steering of the company ultimately put it in the same demographic lane as IBM, Compaq, Microsoft, and other mainstreamers. 

Apart from the PowerBook 100 laptop series, rolled out in 1991, Apple products in the early and mid-90s were outdated and overpriced. They also attracted few developers, meaning that almost all the really good new software was for PCs. 

Apple also made some colossal product development mistakes in the mid-90s. 

One was the Apple Newton MessagePad, a personal digital assistant (PDA) released in 1993. Innovative, yet flawed, the device was kind of like a primordial iPad, managing notes, contacts, and calendars, and offering a primitive form of handwriting recognition that got it a lot of negative press (and a Simpsons cameo). 

Expensive and not useful enough to justify its price point, the device sold poorly and was killed off in 1997. 

Then there was the launch of the equally unsuccessful Performa, Centris, and Quadra computer lines, targeted respectively at the low-, mid-, and high-end markets (yet sharing very similar hardware specs). This decidedly un-minimalist approach confused consumers, and the machines all sold poorly. 

After losing money for three years in a row, Apple was on the ropes. Drastic action was required and was taken. In December 1996, Apple bought Steve Jobs’ company NeXT (mainly for their NeXTSTEP OS) and brought him back into the fold. 

Then in August 1997, Bill Gates brokered a surprise deal with Jobs that saw Microsoft inject $150 million into Apple. Microsoft committed to developing Microsoft Office and Internet Explorer for the Mac for at least five years, in exchange for Apple agreeing to pre-install Internet Explorer on all new Macs shipped. 

Apple purists’ reactions were as follows: pissed, confused, manic.

The Microsoft bailout story is often simplified as a fluffy love-in; two visionary men who respected each other collaborated to “make the world a better place.” But the truth was that Gates saw the sickest of deals and sniped it. 

Firstly, Microsoft needed to keep Apple alive because it was already being sued for monopolistic practices, and it was clear that if its only real remaining competitor was vanquished, it would face an endless series of legal attacks (and potentially be forced to break up the company). 

Secondly, Microsoft made money from the deal. The $150 million it injected into Apple was peanuts for a company that had just booked $11.36 billion in revenue in its last fiscal year. Also, it turned out later that this money had been converted into Apple shares, which were sold in the early 2000s (after the company recovered) for a significant profit. 

Windows rules the world 

While Apple bobbed around in choppy waters, Microsoft was absolutely raking it in. 

Windows was the official OS of the 90s. 

Microsoft had rolled out the first version of Windows in 1985 but didn’t really perfect its GUI formula until 1990, when it released Windows 3.0, the first of the popular 3.x series. 

Windows 3.0 was a 16-bit operating environment that ran on top of DOS and was optimized to work just fine on computers with Intel 286 and 386 processors from the mid-80s. This meant you could buy a cheap PC, load Windows onto it, and have a Mac-like experience at a significantly lower price. 

While not pretty by today’s standards, its standardized GUI let you organize files and master different programs quickly. You could also share resources between Windows programs; you could, for example, draw a pretty picture in Paintbrush, turn spreadsheet data into a graph, and plop both into a Write document. It did crash quite a lot, but oh well.

Then there were the massive perks for developers. Visual Basic 1.0 (released in 1991) allowed you to design a Windows app’s user interface before coding it.

Windows 3.0 sold 4 million copies in its first year. That may not seem like much, given that Windows 8 sold 100 million licenses in its first six months (and nobody thought that was very special), but the figure was a big deal back then. 

Windows 3.1 followed two years later, in 1992, tweaking and improving upon 3.0 without messing with its core components. By bypassing the system BIOS for disk access and adding SmartDrive 4.0 disk caching, Windows 3.1 was quite a bit faster. 

Most people hung onto their copy of Windows 3.1 for the next couple of years. The much improved, yet system resource-hogging next iteration of Windows—Windows NT—was released in 1993, but didn’t attract a big audience.

What did attract a big audience was the company’s next big OS, codenamed ‘Chicago.’ But you probably know it (from the vaporwave scene) as Windows 95.

Windows 95 was released to enormous fanfare on August 24th, 1995, backed up by a huge $300 million marketing campaign. The Rolling Stones let Microsoft use their feel-good song “Start Me Up” in a TV ad because one of the big selling points of the OS was its fancy new “Start” button, you see. Meanwhile, Bill Gates’ launch party was assisted by cringey computer jokes from Jay Leno, and a bone-chillingly 90s video guide was produced, featuring cast members of Friends.

The OS launch became a pop culture phenomenon, the likes of which the computing world had never seen. Windows 95 sold 7 million copies in the first 5 weeks alone, and 40 million by year’s end. 

As for features, Windows 95 offered a totally new operating system that ran straight after boot-up, i.e. not out of MS-DOS as all previous versions of Windows had. The desktop was also brand-spanking new, with the ability to hold files, system icons, and shortcuts. 

The combination of a new taskbar, built-in network support to simplify Internet access, the intuitive “Windows Explorer” utility and 32-bit application support produced a truly next-level experience.

Windows 95 was succeeded by Windows 98, released in…you guessed it, 1998. That product, which upgraded every aspect of the earlier version, sold very well too (25 million units over its first summer, fall, and holiday seasons on the market). However, it failed to generate any kind of cultural mania; not one Matthew Perry and Jennifer Aniston-starring instructional video was made for it. 

Ghosted by the zeitgeist | popping the 90s bubble

The 90s was a crazy, wide-open time, chock-a-block with people looking to make a quick buck.

Software, hardware, and e-commerce companies were all high on the seemingly limitless possibilities of the booming computing industry and the Internet. But three months after the clock struck Y2K, there were a whole lot of burned-out wrecks on the side of the Information Superhighway.

The 2000s began with the collapse of the NASDAQ stock market, as the dot-com bubble exploded into a billion tiny pieces. When the smoke cleared, a new computing ecosystem would rise from this mass extinction event. 

More on that in Part VI, when we explore computing in the Aughts.
