If you’ve ever been an eight-year-old deprived of a calculator during a long division maths test, you’ve probably wondered how on earth people got on without computers. Naturally, tricky maths problems must predate the invention of sophisticated machines. So what did you do when faced with a task that required a myriad of component calculations and painstaking attention to detail, not to mention showing your work? What any reasonable person would do – get someone else to do it. Or better still, get a team of people to do it and, if you’re smart, provide chocolate biscuits at afternoon tea.
This question is the focus of David Alan Grier’s book When Computers Were Human, which tells the pre-information-age story of “human computers” doing calculations by hand – not just the straightforward office-worker’s calculation of how many days are left until Friday, but complicated, multifaceted problems such as those required for astronomy, cartography, navigation, and (unsurprisingly) warfare. This required teams of people manually crunching numbers to produce tables useful to those engaged in such activities. Generally, the “computers” were something like workers on a production line: each had a specific task, the result of which would be worked on by another person, and so on. In many cases, the workers involved may not have known precisely how their work would be used down the line.
In the modern world it seems odd that a human could be a computer – the sentence “allow me to introduce my wife; she’s a computer” would surely raise some eyebrows. A moment’s consideration of the word itself removes the oddness – to compute is to calculate, so it follows that a “computer” is “one who calculates”. So before the machines established their binary monopoly on working stuff out, there was only one way to go – good old-fashioned, sweaty humans.
For Grier, the story of human computers begins with Halley’s Comet in the 1700s. Halley had cleverly worked out that his eponymous comet kept coming back at semi-regular intervals. But he couldn’t for the life of him figure out with any kind of precision when the next appearance would be. Three French astronomers thought they could do better, and set about it by breaking the problem down into tiny parts and dividing the labour among themselves.
While Halley’s Comet did not reappear precisely on cue, the astronomers were not exactly light-years off the mark, and had done immensely better than Halley himself had managed on his lonesome. It was this division of labour between the component parts of a problem that was a huge step in the right direction for solving complex problems. To illustrate: with two astronomers working on the planets’ gravitational fields and the third on the comet itself, a good deal of time and sanity could be spared.
From these first tentative steps, teams of humans with furrowed brows and worn-down pencils went on to perform calculations to create tables which could be used to help solve a myriad of problems. However, those of us who struggle with fractions at the best of times can be thankful that technology has since stepped in to relieve human computers of their pencils, although the transition to reliance on machines was not especially swift or smooth.
While computing machines were in development, humans still had the upper hand. A tired human can still come in to work, albeit a tad less productively. But a broken computing machine’s net output is zero point nothing, minus the man-hours spent tapping it ponderously with a wrench. Furthermore, it will sit there, silent and unresponsive, mocking your attempts to fix it. Eventually, however, one supposes it was inevitable that improved machines would become so efficient, and breakdowns so infrequent, as to render their human forebears obsolete.
The tipping point came during WWII, when a young Richard Feynman created a competition of sorts while working on the Manhattan Project. Human computers would square off against a punched-card tabulating machine to do calculations required for the plutonium bomb. Bravely, the humans held out for two days, keeping pace with their mechanical counterpart. But on the third day the machine rose to take the lead; the humans had tired and couldn’t keep up their original enthusiastic pace.
Game over. The dominance of the human computer had come to an end, to be replaced by mechanical computers with human operators. For those of us brought up in the information age, where answers are a click away, it can be difficult to imagine that this time existed at all. And for those whose interest has been piqued, Grier’s When Computers Were Human provides an interesting insight into the era when actual humans performed the work that our ubiquitous computers silently and thanklessly perform for us today.
Thankfully, human computers haven’t disappeared from the scene altogether – their role has simply changed. With the machines doing the legwork and handling the maths, human labour is freed up to focus on other, more qualitative matters. Collaborative endeavours such as Wikipedia or IMDB now use crowdsourcing to organise human efforts in extraordinarily innovative ways. Modern-day human computers are no longer furiously scribbling away at maths problems, but are instead among thousands of contributors editing a website’s database from anywhere in the world. Some of them may not even know the first thing about long division.