Computing Evolves. Part III: From Personnel to Personal (1960-1977)
In the 60s, it was cool to wear a fringe leather jacket. It was not cool to be ‘into computers.’ In the 70s, well, things weren’t too much different.
But what was different was the role computers had to play in society. By the end of the 70s, cost and size had shrunk dramatically, and the stage was finally set for mass-market computing.
In the third part of our seven-part series on the history of computers, we check out 60s computer ideology, hulking Mad Men mainframes, and the first personal computers.
Getting hip to the future (1960-1969)
As we know, the 60s were a time of social upheaval across the Western world. The decade encompassed watershed events like the American Civil Rights movement and the 1963 March on Washington, the May ‘68 riots in Paris, the escalation of the Vietnam War, and the emergence of something called a ‘hippie.’
All this general cultural foment definitely had an impact on computing. This was the decade when the weirdos and savant visionaries started to come out of the woodwork. Computers, previously the purview of business and government, and circumscribed to mathematical and statistical applications, were now being thought about in more holistic, humanistic terms.
Case in point, computer pioneer Ted Nelson. In 1960, he founded Project Xanadu, dedicated to creating a “magical place of literary memory” on the computer. Nelson developed the concepts of hypertext and hypermedia, and wrote and spoke extensively on the myriad possibilities that computing could bring for human communication and individual empowerment.
Nelson’s friend, the brilliant engineer Douglas Engelbart, conceived of the mouse in 1963 (a working prototype followed the next year). For him, it was a tiny piece of the puzzle in a larger project to “augment human intellect.”
A few years later, on December 9, 1968, Engelbart gave what has been dubbed “the mother of all demos” at the Fall Joint Computer Conference in San Francisco's Civic Auditorium. Titled "A Research Center for Augmenting Human Intellect," his demonstration of an operating system he called the oN-Line System (NLS) included the first use of a computer mouse, video conferencing/teleconferencing, a word processor program, hypertext and hypermedia, a hierarchical file system, and a real-time editor that multiple users could collaborate on simultaneously.
But thinkers like Nelson and Engelbart were ahead of the curve. While computer hardware had advanced by leaps and bounds since the advent of integrated circuits, 60s machines were still, by modern standards, pretty limited, and/or pretty huge. The only notable exception was 1966’s Apollo Guidance Computer (AGC), used to help NASA land the first humans on the moon, which was about as powerful as the world’s first personal computers brought to market 11 years later (albeit designed only for super specific moon-related tasks).
The IBM System/360 family of mainframe computers, introduced in 1964, was the best-selling line of machines of the era. You may have seen one on Mad Men, where it was used as a plot device to stoke in-office tensions about dehumanization and the automation of work.
These were the first computers designed as a compatible family covering a vast range of needs: every model ran the same instruction set (implemented internally via microcode), so a program written for one machine could run on another. Memory ranged from 8 KB on entry-level configurations up to 8 MB on the largest models, which was an incredible amount at the time.
You could also plug in any number of peripheral devices to increase its functionality: graphical data plotters provided a way to quickly turn data into line, grid, and curve graphs (making it far easier to do things like generate quarterly reports and sales forecasts); magnetic tape drives facilitated audio and data storage; and plug-in disk drives beefed up storage capacity.
That same year, another key computer was released—the CDC 6600 (the first iteration of the CDC 6000 series). The first commercially successful supercomputer, it comprised a central processor with 10 peripheral processors, 12 data channels, and a central magnetic core memory, plus cold-water plumbing to cool down its flaming-hot circuits—and a truly staggering amount of wiring.
Crucially, the CDC 6600’s control console featured two screens and a keyboard, letting operators do all their input in one place. At the time, this was a revelation: no more switching and running around between components. The unit became a must-have for research laboratories.
While the 60s was definitely still a time of XXL machines, it was also the decade when the first use of the term ‘personal computer’ was recorded, in a 1968 issue of the American scientific journal Science. Specifically, the term was directed at the Hewlett-Packard 9100A, a programmable calculator.
The machine was the brainchild of Palo Alto-based physicist Malcolm Macmillan and electrical engineer Thomas E. Osborne. The two had independently designed prototypes of a similar machine, and HP merged the best ideas from both designs into one.
Introduced in March 1968, the HP 9100A was intended to let scientists, engineers, and mathematicians do complex equations on the fly. It had a cathode ray tube display, weighed about as much as a mechanical typewriter, and could do logarithmic, hyperbolic, and trig functions, as well as coordinate transformations.
Although it cost $4,000 (or $33,000 in today’s dollars), it was a step in the right direction as far as affordability and portability were concerned.
In the background, the Internet appears
As computers improved, and ‘big ideas’ of computing appeared in the cultural discourse, the US government was busy laying down the foundations of the Internet.
In 1969, the Advanced Research Projects Agency Network (ARPANET) was launched.
ARPANET was designed to ensure reliable government and military communications during the Cold War: the overall computer network could keep functioning even if one of its nodes was knocked out.
The initiative was spearheaded by J.C.R. Licklider (the first director of the Information Processing Techniques Office), with key support from Ivan Sutherland, Bob Taylor, and Paul Baran of the RAND Corporation.
It used a host-to-host protocol they called Network Control Protocol (NCP), the direct predecessor of the modern Internet’s TCP/IP (first described by Vint Cerf and Bob Kahn in 1974, and split into the separate TCP and IP protocols in 1978).
Computers get personal (1970-1977)
The 1970s were a watershed for postwar computing—the decade when the devices finally infiltrated the cultural mainstream. Heck, even the Watergate Hotel, site of the eponymous Nixon scandal that rocked US politics, was designed with the aid of computers.
Most important of all, this is the decade when computing finally moved out of government institutions, research facilities, and business offices and into the home.
Late in 1974, the Altair 8800—widely credited as the world’s first personal computer—was built by Ed Roberts, founder of MITS, a small calculator manufacturer in Albuquerque. Roberts built the machine in a desperate play to appeal to a new market, i.e. computer hobbyists, as price wars in the (then) so-hot-right-now calculator market were driving down profits and tipping his company in the direction of bankruptcy.
Hobby computing became a viable, albeit quite nerdy thing to do in the early 70s as a result of the development of the microprocessor, which dramatically cut the cost of building a computer.
In 1971, Intel introduced the first commercial microprocessor—the Intel 4004. It grew out of a commission from Busicom, a Japanese maker of business calculators: Busicom engineer Masatoshi Shima collaborated closely with Intel’s in-house development team, led by Marcian “Ted” Hoff and (subsequently) Federico Faggin.
Basically, what these engineers did was take Noyce and Kilby’s decade-old integrated circuit concept and put an entire central processing unit on a single chip—everything a computer needs to execute instructions, minus the memory and interfacing components.
In 1974, Intel further improved upon the 4004’s design, introducing the far more powerful Intel 8080—the chip that would end up powering the Altair 8800.
In yet another case of sketchy patent office hijinks, a similar device had actually been developed by inventor Gilbert Hyatt in 1970. He applied for a patent that year but before it could clear all the bureaucratic red tape, the Intel 4004 was already on the market. However, in 1990, the U.S. Patent Office did finally award him credit for inventing the first viable microprocessor.
Back to Roberts. He managed to complete his $439 build-it-yourself (or buy it pre-assembled) computer in late 1974. The Altair 8800 appeared on the cover of the January 1975 issue of Popular Electronics, and generated a big buzz among true nerds.
Paul Allen and Bill Gates, for instance, were walking through Harvard Square and spotted the issue on a newsstand. The duo was inspired to get their own Altair 8800 kit and develop their first program together: Altair BASIC, an interpreter for the BASIC programming language (created at Dartmouth back in 1964) that allowed anyone who owned an Altair 8800 to write their own programs.
Writing this program inspired them to launch a company by the name of Microsoft in 1975.
1977, the ‘Year of the Computer’
1977 saw the triple-whammy release of mass-market targeted personal computers from Apple (the Apple II), Commodore (the PET 2001), and Tandy, aka Radio Shack, (the TRS-80).
These “out-of-the-box” microcomputers were simple enough to use that they crossed over from the computer hobbyist community into society’s mainstream. For their formative role in making computers a thing people actually use, they’re often referred to as ‘the Trinity.’
In January 1977, the Commodore PET 2001 was unveiled, the first of the trinity to be shown publicly (units shipped later that year). In case you were wondering, P-E-T supposedly stands for Personal Electronic Transactor, whatever the hell that means, although it’s also possible the name was a tongue-in-cheek reference to the Pet Rock craze that had gripped the nation from 1975-1976.
Designed by engineer Chuck Peddle, the computer was sold as a complete unit, with a built-in cassette drive, a keyboard, a built-in display, main logic board, power supply, and a protective casing. Its MOS 6502 CPU ran at 1 MHz, its screen displayed 40x25 characters of text, and it had internal expansion slots for plugging in extra memory or a floppy disk drive.
A computer being sold ‘complete’ may seem like a no-brainer now, but at the time it was a big deal. You could turn on the computer and it would (more-or-less) just work.
Simply put, this meant you didn’t necessarily need to be a learned hobbyist or person with a computer science background to realize the device’s value. It also retailed for $795 (or about $3,430 today), which wasn’t peanuts but was well within the reach of middle-class consumers.
The Apple II, designed by computer legend Steve Wozniak, was launched in June 1977. It was a major improvement on the build-it-yourself Apple I, although it used the same processor running at the same speed.
The machine featured a color display, eight expansion slots for add-on improvements (like a floppy disk controller, video card, or more memory), the built-in BASIC programming language, and an integrated keyboard and hardshell case that tied all the components together.
It also came with game paddles and a cassette deck, meaning you could enjoy bleep-bloop gaming adventures from the comfort of your home, rather than schlepping it to the arcade.
Although its initial price of $1,298 a unit was a bit steep, the Apple II sold millions between 1977 and 1993. It was particularly popular as a learning computer at schools (if you were a kid in the 80s or early 90s, you probably used one).
Hot on the heels of Apple’s creation, the Tandy TRS-80 personal computer hit the shelves of Radio Shacks across America in early August 1977. Like the other two computers in the trinity, it featured an all-in-one setup, with a hardshell case enclosing its components, a built-in monitor and keyboard, and a cassette recorder.
The TRS-80 was principally developed by Radio Shack buyer and computer enthusiast Don French, who worked with National Semiconductor employee/hardcore nerd Steve Leininger. The device stored programs and data on cassette tape.
The TRS-80 was the cheapest personal computer on the market, with a basic model priced at $399 and a more kitted-out model with 12” monitor for $599. That made it the first genuinely affordable computer available.
The price point, and the fact that the computer had excellent distribution at Radio Shack stores across America at a time when computer stores were rare, helped make it a smash hit. The company had pessimistically forecast sales of 3,000 units annually, but in reality 10,000 were sold in the first month and a half. The next year, 1978, over 100,000 were sold, making it by far the fastest-selling computer of the ‘trinity.’
From garage-size, million-dollar machines to middle-class affordability
The mid-century story of computing could well be called something like “The Story of the Shrinking Machine.” As dimensions and costs continued to shrink, the number of users grew.
UNIVAC and subsequent general-purpose machines like the IBM System/360 captured ever-larger markets, capitalizing on advances in semiconductor and microprocessor technologies. Silicon Valley continued to grow as an innovation hub, anchored by its association with Stanford and the ever-deepening network of tech businesses.
Then the first wave of personal computers opened the door to what we now call the “information society.”
These personal computers inevitably became obsolete, but they also became the doorstops that held the door open for a whole new wave of mainstream adoption.
Things were about to get hectic.