Computing Evolves. Part I: Moving Parts (1804-1945)

Last Updated: Monday, December 25, 2023

What if I told you the key to modern computing is hiding in the graphic patterns of every meta-ironic Christmas sweater you’ve ever seen?

Mad but true: we owe our computer lives to pattern-making innovations in the textile industry, innovations that date back more than 200 years.

In the first part of our seven-part series on the history of computers, we look at the evolution of computing, from post-French Revolution looms to the one-hundred-ton monster machines of World War II.

Floral patterns become algebraic ones: the Jacquard Loom and the first computer

At the start of the nineteenth century, industrialization was slowly but surely upending life across Western Europe. Inventors, innovators, and business people everywhere were looking to speed up the pace of work and, in doing so, put more commodities into the world faster.

In the city of Lyon, France, weaver and merchant Joseph Marie Jacquard was trying to figure out how to take textile weaving out of the Ancien Regime and into modernity. 

Jacquard designs a programmable loom

The upheavals of the French Revolution (1789-1799) were fresh in Jacquard’s mind, and had greatly affected him personally—he had fled Lyon as it was besieged by government forces in 1793, then joined the French Revolutionary Army and fought alongside his son Jean-Marie, who died in battle in 1797. His life, like society itself, had been turned upside down.

Prior to that revolution, he had been working on the idea of a machine that automated weaving. Putting himself back together following his son’s death, he started working on the idea again. 

For inspiration, he looked back across the last century, to Basile Bouchon, a textile artisan in Lyon who had built a machine in 1725 that used perforated tape to partially automate the weaving process. Also on his mind were Jacques de Vaucanson’s 1745 loom, which allowed warp threads to be automatically selected, and Vaucanson’s automata (complex devices that imitated living creatures), like his famous mechanical Digesting Duck (1739), which ate things… then defecated them out.

Over the course of 1804-1805, Jacquard succeeded in developing the world’s first fully programmable loom. The device allowed the weaving of incredibly detailed patterns in a much shorter amount of time than was previously possible. And this it did without the need to employ a master weaver for every complex piece. 

The Jacquard loom used a series of replaceable punch cards in sequence to automate the movements of the machine. Because you could always add more cards, it was possible to set up the machine for any sizing requirement, from tiny precision pieces to gargantuan ones.

A punch card is the physical analog of binary computing. You either have a hole in a given position on the card or you don't, just as you either have a 0 or a 1 at a given position in the binary code used in digital computers. It's the same sort of 'base-two' system, where two possible symbols are used to represent all information. In a binary string, the specific placement of 0s and 1s renders everything from text to sound to images, with groups of 8 binary digits (bits) making up a single byte of data.

Multiply the number of positions per card by the number of cards in the chain and you can store and communicate very complicated information, just like a digital computer (except that with a Jacquard loom your 'information' will take up a lot of physical space... and will be made of paper).
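
To make the analogy concrete, here is a minimal sketch in Python (the card row is an illustrative value, not an actual Jacquard encoding) showing how eight punch positions map onto the bits of a single byte:

```python
# A punch-card row as bits: hole = 1, no hole = 0 (illustrative values).
card_row = [0, 1, 0, 0, 0, 0, 0, 1]  # 8 positions, i.e. one byte

# Pack the positions into a single byte, most significant bit first.
value = 0
for hole in card_row:
    value = (value << 1) | hole

print(value)                 # 65
print(format(value, "08b"))  # "01000001"
print(chr(value))            # "A" -- the same byte read as text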

The result of the Jacquard loom was radically cheaper manufacturing of patterned textiles, meaning that a newly emerging middle class was able to buy products that had previously been affordable only for elites. A new market for fancy clothes, blankets, and tapestries was created more-or-less overnight.

Master weavers were not pleased with this disruption. Many skilled workers found themselves laid off. Jacquard's machines were frequently sabotaged, but as they multiplied in number (by 1812 there were 11,000 of them), there was little the workers could do; change did indeed 'loom.'

By the 1820s, Jacquard’s invention was ubiquitous across Europe. By 1833, 100,000 looms based on Jacquard’s model were loom-ing away in England. 

As Jacquard’s machines made their way across the Channel, they caught the attention of an English mathematician and inventor by the name of Charles Babbage. 

Charles Babbage’s Difference and Analytical Engines

Babbage saw Jacquard's punch card system as a means of developing computational devices for other purposes. If data and mathematical formulae could be rendered on punch cards in the same way as fabric patterns, they could conceivably be fed into a machine that performed arithmetic operations and spat out the processed information.

In 1823, Babbage put forth his first concept for such a device, which he dubbed the Difference Engine. Conceived as the world's first automatic calculating machine, it would tabulate polynomial functions, offering a way of dramatically improving the accuracy of mathematical tables, which were at the time compiled by teams of 'human computers.' This centuries-old arrangement was time-consuming and sometimes resulted in costly errors. 
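
The trick the Difference Engine was built around is the method of finite differences, which lets a machine tabulate a polynomial using nothing but repeated addition. Here's a rough sketch in Python, using an illustrative polynomial rather than anything from Babbage's actual tables:

```python
# Tabulate f(x) = x^2 + x + 41 using only additions of differences.
# Starting values: f(0) = 41, first difference = 2, second difference = 2 (constant).
value, d1, d2 = 41, 2, 2

table = []
for _ in range(10):
    table.append(value)
    value += d1   # next table entry: add the current first difference
    d1 += d2      # next first difference: add the constant second difference

print(table)  # [41, 43, 47, 53, 61, 71, 83, 97, 113, 131]
```

Babbage's design would have done exactly this with columns of geared wheels, each column holding one order of difference.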

The Difference Engine was never built, due to a myriad of factors (the main one being the British government pulled the plug on funding). But while developing the concept for the number-crunching machine, Babbage became cognizant of the possibility that yes, you could build a computer that performed a more diverse set of functions, beyond math tables. 

In 1834, Babbage began developing the concept for what was, in practice, a steam-powered general-purpose computer; mathematician and thinker Ada Lovelace would become his great collaborator and interpreter. Detailed sketches and blueprints outlined a device that would incorporate input, output, memory, and a central processing unit (CPU) that executed the user's program instructions using a series of pegs inside rotating drums.

The machine was christened ‘the Analytical Engine.’ Only a trial version was ever built—just like the Difference Engine, it was never properly developed or implemented. Yet the computer genie was out of the bottle. 

As Lovelace surmised, “the Analytical Engine weaves algebraic patterns, just as the Jacquard loom weaves flowers and leaves.” And as the production of flowers and leaves on textiles became cheaper, faster, and ubiquitous, so too would the production of algebraic patterns. 

Difference and repetition: math takes on human thought

Babbage and Lovelace's work in computing was groundbreaking, but it needed refinement. If the computer was to be 'an engine of thought,' it would have to implement more sophisticated rules of operation, ones in step with the fickle, complex nature of human behavior and human societies. 

George Boole, a self-taught English mathematician and logician, would jump-start this process. 

George Boole lays down The Laws

In 1854, Boole published The Laws of Thought, which put forward a mathematical language for dealing with human thought. His system of mental algebra, now referred to as 'Boolean logic,' operates on true/false binary values, together with the AND/OR/NOT operators for combining them and directing action. 

In Boolean logic, variables are described in a binary format, according to their truth values: "True" is rendered as 1, while "False" is rendered as 0. This schema offers a formal way of describing logical relationships between things, building from simple propositions up to complex conceptual relationships.
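
In code, the scheme looks something like this: a small Python sketch (the compound proposition is made up for illustration) rendering truth values as 1 and 0 and combining them with AND, OR, and NOT:

```python
# Truth table for a compound proposition: (A AND B) OR (NOT A).
# True is rendered as 1, False as 0, exactly as in Boole's scheme.
for a in (False, True):
    for b in (False, True):
        result = (a and b) or (not a)
        print(int(a), int(b), "->", int(result))
```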

What he offered was essentially a repudiation of ancient Aristotelian categorical logic, which worked around the basic model of two premises and one conclusion. 

While developing this brand-spanking-new mode of algebra, Boole's Laws of Thought also proposed that human society was governed by fundamental laws, which could be exposed through the analysis of troves of social data. This analysis, conducted by human 'computers' using standard algebra (polynomials), would be able to generate insights into all aspects of society, from social class to relationships. 

Boole's ideas provided the foundation for computer switching theory. More than 80 years later, they were practically applied by American mathematician/electrical engineer/cryptographer Claude Shannon in the design of switching circuits, the basic blueprint underpinning electronic computing to this day. Shannon's 1938 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," relied on the notion that the "fundamental properties of general systems for the transmission of intelligence" could all be rendered in binary digits.
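
Shannon's insight was that networks of relays obey this same algebra, which means arithmetic itself can be built out of logic. A minimal illustration (a 'half adder' sketched in Python, not anything taken from Shannon's thesis):

```python
# A half adder built purely from Boolean operations:
# the sum bit is XOR of the inputs, the carry bit is AND.
def half_adder(a, b):
    sum_bit = a ^ b    # XOR: 1 when exactly one input is 1
    carry_bit = a & b  # AND: 1 only when both inputs are 1
    return sum_bit, carry_bit

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```

Chain enough of these stages together and you have the adding circuitry at the heart of a digital computer.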

In his own time, Boole’s ideas influenced his close friend Sir William Thomson (aka. Lord Kelvin), a major mathematical physicist and engineer of his era. 

Sir William Thomson gives computers a proper purpose

In 1872, Thomson designed the world's first tide-predicting machine to chart the ebb and flow of the seas. Taking Babbage's ideas further, it was among the first practical applications of mechanical analog computing.

Using gears, pulleys, and chains, the machine worked through tidal equations that accounted for the positions of the sun and moon, the shape of the coastline, and bathymetry (the underwater equivalent of topography). In one day of calculating, it could produce the equivalent of 125 days of work done by a human being.
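
Under the hood, machines like Thomson's summed a set of sinusoidal 'constituents,' each gear-and-pulley stage contributing one term. Here's a rough sketch of that kind of harmonic sum in Python, with made-up amplitudes and phases rather than real port data:

```python
import math

# Illustrative tidal constituents: (amplitude in metres, period in hours, phase).
# A real machine used constants fitted to observations at a specific port.
constituents = [
    (1.20, 12.42, 0.0),   # principal lunar semidiurnal term
    (0.55, 12.00, 1.1),   # principal solar semidiurnal term
    (0.30, 25.82, 2.3),   # lunar diurnal term
]

def tide_height(t_hours):
    """Sum the harmonic terms: the job each pulley-and-wire stage performed."""
    return sum(a * math.cos(2 * math.pi * t_hours / period + phase)
               for a, period, phase in constituents)

for t in range(0, 25, 6):
    print(f"hour {t:2d}: {tide_height(t):+.2f} m")
```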

Thomson built bigger machines based on the same principles in 1876 and 1879. In the United States, meteorologist William Ferrel built his own tide-predicting machine over 1881-1882.

Descendants of these devices were still in use all the way into the 1960s and 1970s, when digital computers finally made them obsolete.  

Computers find more uses, and modern times arrive

Perhaps the idea of a tide-predicting machine charting the rise and fall of the seas doesn't excite you. But it should, because it opened the floodgates to other applications for computers. Indeed, with the advent of this device, one might even be tempted to say 'the tide had turned.' 

Thomson's machine provided the basic concept from which American engineer Vannevar Bush would build the first widely used general-purpose analog computer.

The Differential Analyzer

Working at MIT with H.W. Nieman from 1927 to 1931, Bush developed a computer powered by electricity yet still highly mechanical, one that used drive shafts and gears (in the form of six mechanical integrators) to represent complex equations. It could solve differential equations with up to 18 variables.

The machine would be set up to solve a specific differential equation. Each 'integrator' was attached to the next one with a long rotating axle, allowing the output of the first integrator to be fed into the next one as an input, and so on until the last integrator in the sequence. 

Dubbed the ‘Differential Analyzer’, it formed the basis for a generation of electro-mechanical computers. 

Thomson's machines had been held back from processing more complex equations by the technology of the late nineteenth century. It wasn't possible to generate enough force to make one 'integrator' drive the next one, meaning an equation couldn't have sequential integration steps. Bush solved this problem with a torque amplifier, which generated enough force to drive the machine in sequence, and also allowed for the delicate, high-precision movements necessary to calculate equations accurately. 
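
The 'each integrator feeds the next' arrangement can be mimicked numerically. A small sketch (plain step-by-step integration in Python, not Bush's actual configuration) chaining two integrators to solve the equation y'' = -y:

```python
import math

# Two chained 'integrators' solving y'' = -y (simple harmonic motion).
# Integrator 1 turns y'' into y'; integrator 2 turns y' into y.
dt = 0.001
y, y_prime = 1.0, 0.0                # initial conditions: y(0) = 1, y'(0) = 0

for _ in range(int(math.pi / dt)):   # step forward for roughly pi seconds
    y_double_prime = -y              # the differential equation being solved
    y_prime += y_double_prime * dt   # first integrator: y'' -> y'
    y += y_prime * dt                # second integrator: y' -> y

print(round(y, 2))  # approximately -1.0, since y(t) = cos(t) and cos(pi) = -1
```

Each '+=' line is the digital stand-in for what a rotating disk-and-wheel integrator did physically.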

The Differential Analyzer was enormous and clunky (it needed a whole room to itself), but it was relatively simple to use, and performed equations in real time. Processing results was also a breeze: there was no need to translate any kind of high-level programming language into data; it all spewed straight out of the machine.

Bush’s invention found a myriad of practical uses. In the 1930s, for example, General Electric’s Edith Clarke (the first female electrical engineer at the company) used it to analyze power lines and electrical infrastructure.

Differential Analyzers got bigger and beefier over the years and were built all over the world. 

English mathematician and physicist Douglas Hartree built one in England in 1935. Engineer Sasaki Tatsujiro and others built one in Japan in 1942, and in 1948 Beatrice Worsley (the first person ever awarded a Ph.D. in ‘Computer Science’) built one in Canada. Astrophysicist Svein Rosseland led the creation of the ‘Oslo Analyzer’ in 1938, which was twice as big as the one Bush had built at MIT. 

Not to be outdone, Bush presided over the construction of the massive Rockefeller Differential Analyzer at MIT in 1942. It weighed 100 tons and had 150 motors and 2,000 vacuum tubes connected to 12 integrators (twice as many as Bush's original machine). But this dinosaur of a device was already on the edge of obsolescence; an extinction event was just around the bend.

Cryptanalysis and the end of mechanical computing 

The existential pressure put on governments before and during World War II resulted in a flurry of computer research. One major aspect of said research was, perhaps unsurprisingly, devoted to cryptanalysis, or the deciphering of coded messages. 

In 1938, one year prior to the outbreak of war in Europe, Polish mathematicians Marian Rejewski, Henryk Zygalski, and Jerzy Rozycki developed the Bomba Kryptologiczna (i.e. the ‘cryptologic bomb’), also known simply as the “Bomba,” an electro-mechanical code-breaking machine for reading messages from Germany’s Enigma machine (which was used to encrypt important communications within the government, military, and important national industries).

In 1939, British mathematician Alan Turing designed a much-improved version of the Bomba, known as the Bombe. In 1943, American engineers built their own versions using British technology. 

Elsewhere, Turing's cryptanalytic methods fed into the development of COLOSSUS, a programmable electronic digital computer that wasn't quite the world's first, but certainly was the most consequential of its time. Designed and built by Tommy Flowers, an engineer employed by the British Post Office, it was up and running in December 1943. 

Delivered to the famous code-breaking facility at Bletchley Park on 18 January 1944, COLOSSUS broke its first message on 5 February of that year. By the end of the war, ten of the type had been built. 

The machine's major application was breaking the traffic of the German Lorenz SZ-40/42, a highly advanced electromechanical cipher machine used by the German High Command. The ability to read top-level German intentions and react accordingly may well have shortened the war by several years.

On the other side of the conflagration, German computer pioneer Konrad Zuse was working on the Z3, which beat the COLOSSUS to the punch in terms of being the “first working programmable, fully automatic digital computer” in the world. 

This electromechanical, relay-based computer was completed by Zuse in 1941, a full two years ahead of COLOSSUS. However, he was working in more-or-less total isolation, far from the scientific mainstream and from government organs (the first computer he designed, the Z1, had been built in his parents' apartment).  

The Nazi government did not deem the Z3 ‘strategically essential’ and denied any funding, so it wasn’t put to any practical use, and further development ceased. In 1943, the computer went up in flames, becoming collateral damage of a large-scale Allied bombing raid on Berlin.  

The combination of Nazi government apathy towards the Z3, and its unintended but very convenient destruction, cleared the road for the Western powers to build the first practically implemented electronic circuit-based computers. 

ENIAC arrives

Work on the ENIAC computer, often put forth as the world's first general-purpose electronic digital computer, began at the University of Pennsylvania in 1943, overseen by physicist John Mauchly and engineer J. Presper Eckert Jr. 

Described as a ‘giant brain’ in the press, it incorporated 17,468 vacuum tubes and about 7,200 diodes in its design, reaching 1,800 square feet in size. In other words, it took up the same space as a detached starter family home. 

The beast didn't become fully operational until after the end of WW2, in December 1945, when it was used as part of a hydrogen bomb research project. It was otherwise used for calculating artillery trajectories and other military applications.

ENIAC is sometimes thought of as the first proper digital computer writ large, but it is not so. This ongoing perception is down to a couple of factors. 

The first is the lack of information about foreign projects during World War II, which influenced the way people wrote (and continue to write) about this era of computing history. COLOSSUS was classified by the British government as top secret until the 1970s, and Konrad Zuse’s Z3 computer was totally unknown outside of small circles in Germany for years (and let’s face it, is still pretty obscure). 

The second is a somewhat tragic lack of public knowledge about the Atanasoff-Berry Computer (ABC), designed in the late 1930s by physicist John Vincent Atanasoff and engineer Clifford Berry at Iowa State University. Completed in 1942, the ABC used many of the same core ideas as ENIAC, but was ahead of the curve in that it used much more efficient binary computing (ENIAC calculated in decimal). 

The story goes that ENIAC project leader Mauchly pinched many key ideas from Atanasoff, but never gave him credit. And since salt-of-the-earth Atanasoff didn't bother to patent any of his ideas, the high-powered stars of ENIAC simply took them and ran. It all came out in the wash during a patent dispute over foundational computing concepts in the late 1960s and early 1970s.

In 2010, the Pulitzer Prize-winning novelist Jane Smiley published a great biography of Atanasoff and this convoluted tale of computer-pioneer backstabbing.

1804-1945: from complicated to even more complicated

Little did Jacquard know, as he was fiddling around with punch cards, that less than a century and a half later, there’d be massive electronic computers like ENIAC based on the elemental principles of his device. 

And not too long after that, his own machine's technology would be eaten up by bits and bytes. Today, fully automatic Jacquard machines use binary code-based digital computers that scan the image of a pattern and create a pixelated map. You can buy one on Alibaba for a few thousand bucks. 

ENIAC could do 20 hours of human math-work in about 30 seconds. Possibly, in its first few years of operation, the computer performed more calculations than had been done in all of human history up to that point. 

In the years to come, such power would unleash profound changes in human civilization. Fancy sweaters, rugs, and socks had their time; now it was complex algebra’s turn to change society.
