Computing Evolves. Part II: Peace and Silicon (1946-1959)

Last Updated: Sunday, May 28, 2023

Computers get memory! Silicon Valley starts up! 

In the second part of our seven-part series on the history of computers, we look at the postwar rise of computing, from commercialization to the building of Bay Area tech.

Memory and new markets (1946-1959)

World War II saw huge advances in computer technology, but applications were largely confined to military necessities (codebreaking, artillery firing tables, atomic bomb research, etc.). Peace gave computers a chance to show what they could do outside of death, destruction, and subterfuge. 

The end of the war meant budget cuts and the end of lucrative military contracts. That set off a scramble among wartime project teams to secure peacetime financing. To avoid becoming shiftless wastrels, they would have to dream up new civilian applications for their engineering know-how. 

UNIVAC 

So it was that a team of engineers and scientists who had all worked as codebreakers during the war formed Engineering Research Associates (ERA) in 1946. In 1952, ERA was acquired by Remington Rand, a conglomerate that had mostly made typewriters, guns, and razors up to that point.

In 1951, ERA released the ERA 1101 (later re-branded as the UNIVAC 1101) mainframe computer, one of the world’s first commercially produced stored-program computers (in other words, among the first sold that could store programs in their own electronic memory). It used vacuum tubes, stored data on giant magnetic drums, and could handle all forms of basic math, making it useful for general business and government applications.

The UNIVAC 1101 represented one of the first practical examples of a computer using what is known as “von Neumann architecture,” named after mathematician and physicist John von Neumann. In a von Neumann machine, program instructions (i.e. commands) are stored in the same electronic memory as data, so both programs and the data they operate on can be read and written within one machine.

Colossus, ENIAC, and other World War II-era computers had been program-controlled, which meant that in order to run a specific program, you had to physically flick switches and plug in a series of patch cables, in the manner of a modular synth, to route the desired inputs and outputs.

This was a liability for obvious reasons: changing programs took time and human labor, manual re-patching increased the possibility of mistakes, and the physical hardware could only do a finite set of things, which meant the equipment was bulky and task-specific (thus extremely expensive hardware could quickly become obsolete).
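To make the stored-program idea concrete, here is a minimal sketch of a fetch-decode-execute loop in Python, using a hypothetical four-instruction machine (not the 1101’s actual instruction set). The thing to notice is that the program and its data sit side by side in one memory, so ‘reprogramming’ means loading new numbers into memory rather than rewiring hardware.

```python
# A toy stored-program (von Neumann) machine with a hypothetical
# four-instruction set: HALT, LOAD, ADD, STORE.

def run(memory):
    """Fetch, decode, and execute instructions held in memory itself."""
    pc = 0   # program counter: address of the next instruction
    acc = 0  # accumulator: the machine's single working register
    while True:
        opcode, operand = memory[pc], memory[pc + 1]  # fetch
        pc += 2
        if opcode == 0:    # HALT: stop and return the result
            return acc
        elif opcode == 1:  # LOAD addr: copy memory[addr] into the accumulator
            acc = memory[operand]
        elif opcode == 2:  # ADD addr: add memory[addr] to the accumulator
            acc += memory[operand]
        elif opcode == 3:  # STORE addr: write the accumulator to memory[addr]
            memory[operand] = acc

# Program and data share one memory: LOAD 10, ADD 11, STORE 12, HALT,
# followed by the data cells (6 and 7 live at addresses 10 and 11).
memory = [1, 10, 2, 11, 3, 12, 0, 0, 0, 0, 6, 7, 0]
print(run(memory))  # prints 13; a new program needs no rewiring, just new numbers
```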

Confusingly enough, the same year the UNIVAC 1101 came out, so too did the UNIVAC I, which was designed by a totally different team for similar purposes. That team was led by J. Presper Eckert and John Mauchly, who had headed the ENIAC computer project during World War II. Their company, Eckert–Mauchly Computer Corporation, had also been bought out by Remington Rand (in 1950). 

Postwar computing abroad

It’s worth noting that although the US was indeed the site of most financing and development in this era, the country did not have a monopoly on computing innovation. The same year the UNIVAC I and UNIVAC 1101 were rolled out, work started on a vacuum-tube computer research project in Japan. 

This would culminate in the production of the Todai Automatic Computer (TAC) in 1958. University of Tokyo professor Yamashita Hideo headed the project, which was a joint venture of the University of Tokyo and Tokyo Shibaura Electric Co., Ltd. (which officially became Toshiba in 1978). 

As one of Japan’s first major computing projects, it helped launch other projects based on cooperation between educational institutions and private corporations. Such cooperation would help lay the foundation for the country’s economic ascendancy over the next few decades.

A year earlier, in 1950, German engineer Konrad Zuse’s company Zuse KG had delivered its first commercial digital computer, the Z4, to ETH Zurich (the Swiss Federal Institute of Technology in Zurich). The computer was easy to program and reliably solved differential equations.

The 1950 Z4 was an improved version of a model Zuse had built in 1944-1945, during World War II. In the final weeks of the conflict, his product demo had been interrupted by the sound of Soviet artillery, and he was forced to disassemble the machine and evacuate it away from the front. 

Zuse’s Z4 was the only working computer in continental Europe for a two-year stretch.

Valley boys, semiconductor wars

In the 1950s, as computers began their inexorable spread across the world, a fundamental innovation was being made in semiconductors, one that would radically change computing from the ground up. 

The central figure in this story is an engineer by the name of William Shockley.

Born in 1910 in London to American parents, Shockley was not like other boys. He was highly irritable, yet also intellectually gifted. 

His parents moved back stateside, to Palo Alto, when he was still young. He attended Caltech in Pasadena from 1928 to 1932, earning a Bachelor of Science degree. Then he headed to the East Coast to pursue a doctorate at MIT, which he completed in 1936. 

Right after graduation, Shockley began working for Bell Telephone Laboratories in New York, a subsidiary of telecom monopoly American Telephone & Telegraph (AT&T).

At the time he joined the company, long-distance telephone communication ran on vacuum tubes, which were very energy-inefficient and generated a serious amount of heat, along with banks of unreliable electromechanical switches.

Shockley felt he could create a more efficient system. He began working on a semiconductor device (what would come to be called the ‘transistor’) that could perform both signal amplification and circuit switching.  

However, he couldn’t deliver a working design. The problem was that electrons on the surface of the semiconductor material, which was made of germanium, blocked electric fields from getting inside it.

Theoretical physicist John Bardeen and experimental physicist Walter Brattain resolved this bottleneck. The key solution was to place two gold contacts on top of the semiconductor surface, spaced just the right distance apart, and then add a third contact at the bottom to encourage the flow of current through the material. 

Christmas came early for Bell Labs in 1947. On December 16, Shockley, Bardeen, and Brattain’s combined research culminated in the world’s first working semiconductor amplifier. The point-contact transistor could successfully amplify the input signal by up to 100 times, which was a real game-changer.
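For a sense of what that figure means: gain is simply the ratio of output signal to input signal. In modern engineering notation (an illustration, not how the Bell Labs team expressed it), a hundredfold power gain corresponds to 20 decibels:

$$G = \frac{P_{\text{out}}}{P_{\text{in}}} = 100 \quad\Longrightarrow\quad G_{\text{dB}} = 10\log_{10}(100) = 20\ \text{dB}$$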

With the transistor came the beginning of miniaturization. Small, rugged, reliable, and low on power consumption, the devices quickly made appliances, TVs, computers, radios, and any number of other electronic devices cheaper and better. The invention of the transistor is the reason you can now stick a phone with a multi-gigahertz processor in your front pocket.  

For this work, Shockley shared the 1956 Nobel Prize in Physics with Bardeen and Brattain. 

Shockley left Bell Labs in 1953 to work on silicon transistors, convinced they were better than germanium ones. In 1956, he founded his own company, Shockley Semiconductor Laboratory, and set up shop in Mountain View, California.

The seed of Silicon Valley was planted. 

Yet this seed did not exactly sprout for Shockley himself. That’s because, rather than concentrating on the production of commercially viable, cheap silicon transistors, Shockley insisted on researching a complex four-layer diode for telephones that would take years to figure out. 

It’s worth noting here that—despite his technical brilliance—Shockley was an extremely problematic person. Most disappointing of all, he was a hardcore racist, a proponent of eugenics who desired that ‘genetically disadvantaged’ individuals (i.e. socio-economically disenfranchised people of color) submit to voluntary sterilization in exchange for financial incentives (which he wanted to replace welfare).

On an interpersonal level, Shockley was a paranoid control freak and a highly ineffective manager. He was very good at finding and hiring the smartest, most qualified people, but after flattering them for a while, he would switch to ‘negging’ them, taking detailed notes on everything they ever did ‘wrong’ and micromanaging their every move. 

Shockley gets shocked, ‘traitors’ build better companies

Shockley’s streak of social acrimony was so bad, and people hated working for him so much, that his employees banded together and conspired with investors to have him either removed outright or rotated out into an academic position where he could be a one-man show. When neither scenario played out, they quit en masse. 

Shockley referred to these quitters as “the Traitorous Eight.” Having quietly secured financing from deep-pocketed investor and inventor Sherman Fairchild, they left to jointly found Fairchild Semiconductor in 1957. 

Fairchild Semiconductor was a big success story, and in time gave birth to an endless array of spin-off companies known as ‘Fairchildren,’ like Intel (1968) and Advanced Micro Devices, aka AMD (1969). Today, something like 70% of publicly traded Bay Area tech companies can be traced back to Fairchild.

One of the Traitorous Eight, physicist Robert Noyce, quickly leapfrogged existing transistor designs, developing the silicon integrated circuit in 1959. The integrated circuit was essentially a silicon chip with an array of transistors etched into it, delivering improved efficiency in a much smaller amount of space.

In an odd coincidence, Texas Instruments engineer Jack Kilby was working on his own integrated circuit at roughly the same time, and successfully demonstrated it in September 1958. Technically, Kilby got there first, but his design was more finicky and used germanium, which was on the way out, ensuring that Noyce’s circuit would win out in the marketplace. 

Meanwhile, electrical engineer Fred Terman, who was Provost of Stanford University (1955-1965), went about transforming the blue-blooded campus into a high-tech innovation hub. 

Science and technology visionary Vannevar Bush had supervised Terman’s doctorate at MIT. Bush, who had effectively led American science during World War II, derided academic work that produced only theories and was instead firmly committed to research with real-world results. During the war, this meant connecting government and academic initiatives to the war effort. 

Terman would re-work this formula, connecting business and academia to produce bigger profits. He nurtured relationships between the first wave of Silicon Valley companies and nearby Stanford. 

In doing so, he stopped the ‘brain drain’ of professionals who would graduate and then book it to the big cities of the East Coast. This was a huge boon to Fairchild. 

Fairchild’s star rose as Shockley Semiconductor’s fell. Shockley’s company went bust and was sold in 1960. 

As for the old curmudgeon himself, he threw in the business towel and became a professor of electrical engineering and applied science at Stanford. His anti-humanist attitude only got worse, and he passed away in 1989, estranged from more or less everyone. 

Still, regardless of all his personal failures, Shockley undoubtedly helped lay the groundwork for the American tech industry as we know it today. As Jacques Beaudoin, a chemist who worked with Shockley, would reminisce decades later, he was “the man who brought silicon to Silicon Valley.” 

Conducting the future of computing

As we all know, Silicon Valley continued to grow as an innovation hub. Anchored by its association with Stanford and the ever-deepening network of global connections forged by Shockley Semiconductor alumni at Fairchild (and later, Fairchild’s own spin-off companies), the region became the epicenter of computing.  

UNIVAC and subsequent postwar general-purpose computers kept getting smaller, cheaper, and better, spreading rapidly across the globe. As more resources were concentrated in research clusters in the USA, Japan, and elsewhere, innovation accelerated. 

Meanwhile, even as hardware limitations kept machines big and pricey, the shifting cultural values of the 1960s brought forth new, iconoclastic visions of what computing might be. 

Stay tuned for part three, where we explore the 60s and 70s—the time when computers finally made the switch from ‘personnel’ to ‘personal.’
