When Computers Were Human

stern writes "In the not-so-distant past, engineers, scientists, and mathematicians routinely consulted tables of numbers for the answers to questions that they could not solve analytically. sin(0.4)? No problem: look it up in the sine table. These tables were prepared by teams of people called computers (no, really -- that's where the term comes from) who typically had only rudimentary math skills. The computers were overseen by more knowledgeable mathematicians, who designed the algorithms and supervised their work." Read below for Stern's review of David Alan Grier's book When Computers Were Human.
When Computers Were Human
author: David Alan Grier
pages: 424 (with index and table of names)
publisher: Princeton University Press
rating: worth reading
reviewer: Stern
ISBN: 0691091579
summary: A history of the first "computers," semi-literates who did math by hand
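
The workflow stern describes is easy to sketch in modern terms: a published table gave sin(x) at fixed intervals, and the reader interpolated between adjacent entries. A toy Python illustration (the step size and table range are invented for the example):

    import math

    STEP = 0.001                                       # tabulation interval (illustrative)
    TABLE = [math.sin(i * STEP) for i in range(2001)]  # sin(0.000) through sin(2.000)

    def table_sin(x: float) -> float:
        """Approximate sin(x) by linear interpolation between adjacent
        table entries, as a reader of a printed table would."""
        i = int(x / STEP)          # index of the entry at or just below x
        frac = x / STEP - i        # fractional distance toward the next entry
        return TABLE[i] + frac * (TABLE[i + 1] - TABLE[i])

    print(table_sin(0.4))          # ~0.389418, matching math.sin(0.4) to ~6 places

Denser tables meant smaller interpolation errors, which is part of why teams spent years computing them.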

The most important of these teams was the Mathematical Tables Project, organized by the Work Projects Administration in the United States during the Great Depression. WPA rules required the hiring of people with virtually no skills, so much of the definitive work of the Mathematical Tables Project was computed by people who had mastered only addition. They were not authorized to subtract, let alone delve into the mysteries of multiplication or division. The algorithmic steps assigned to them sometimes produced negative numbers, and it goes almost without saying that these computers had no idea what these were or how to handle them. Gertrude Blanch, the mathematician who oversaw their work, had devised a scheme whereby positive numbers would be written in black, negative numbers in red. On the wall in front of her human computers hung a poster that encapsulates much of the era of human computing. It read:

Black plus black is black
Red plus red is red
Black plus red or red plus black, hand the sheets to team 2
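
The poster's routing rule is, in effect, a protocol for signed arithmetic built from workers who know only unsigned addition. A playful Python sketch of how the division of labor might be modeled (the function names and team structure are illustrative, not Grier's description):

    def team1_add(a: float, b: float) -> float:
        """Team 1 may only add. Same-colored numbers (same sign) are safe;
        mixed colors mean the sheet goes to team 2."""
        if (a >= 0) == (b >= 0):       # black plus black, or red plus red
            return a + b
        return hand_to_team2(a, b)     # black plus red: pass the sheet on

    def hand_to_team2(a: float, b: float) -> float:
        """Team 2 knows subtraction: take the difference of the magnitudes
        and keep the color (sign) of the larger operand."""
        big, small = (a, b) if abs(a) >= abs(b) else (b, a)
        diff = abs(big) - abs(small)
        return diff if big >= 0 else -diff

    print(team1_add(3, 5))     # black + black -> 8
    print(team1_add(-3, -5))   # red + red -> -8
    print(team1_add(7, -2))    # mixed -> handed to team 2 -> 5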

Grier has written a history of human computing. It begins in the 1760s and continues through two hundred years, until digital computers ended the industry.

From the start, computers were dedicated to projects in astronomy, cartography, and navigation. Grier describes the nature of these problems and why they required numerical solutions. He touches on the alternating competition and cooperation between teams of computers in different countries, and the different organizational models they employed. Perhaps the most memorable fact from the early years of human computing is that the very first team of French computers, assembled by Gaspard Clair Francois Marie Riche de Prony in the early 1790s, was composed entirely of wig-makers left unemployed by the French Revolution. They created trigonometric tables required by France's experiments with the decimalization of trigonometry (an abandoned effort to do for angle measure what the metric system was doing for the measurement of mass, length, and so forth).

Their work, though of little ultimate relevance to the modern world, illustrates aspects of human computing that would not change. Major computing efforts were always sponsored by governments. A small number of planners oversaw work by people who themselves knew little math. And the bulk of the work was done by people who were marginalized, perhaps otherwise unemployable, and who would do the repetitive calculations. This work conferred no prestige, and many were skeptical even of the conclusions drawn from it. If an equation could not be properly solved, how could one take confidence from any numerical approximation? Even Henry David Thoreau worked a dig at human computers into the manuscript for Walden, dismissing the mathematics that might allow an astronomer "to discover new satellites of Neptune but not detect the motes in his eyes, or to what vagabond he is a satellite himself."

Women emerged as the most important computers. Demand for computing spiked in wartime, when young men were off fighting and therefore unavailable, and the economics of hiring women was compelling even in peacetime. They would work for half of what similarly skilled men would. By World War II, in the United States, computing power was measured not in megahertz or teraflops, but in kilogirls.

By the 20th century, the work of human computers was augmented by mechanical and even electrical calculators that automated certain steps, but these machines were expensive and prone to breakdown, and they did not significantly change the nature of the work.

Grier devotes special attention to the Mathematical Tables Project run by the WPA, later taken over by the National Bureau of Standards, and to the mathematician Gertrude Blanch, who ran that team. She is fascinating: a woman who arrived in the United States at the age of 11, who worked to support her family, and who was not able to get her Ph.D. until she was 39 years old. It was then 1936, the middle of the Great Depression, and the job prospects for female, Jewish mathematicians were bleak. Through luck and hard work she found her way to the Mathematical Tables Project, where she assumed a role that combined mathematician, schoolteacher, and coach. Her fanatical attention to error-checking resulted in tables good enough to win the support of those who were skeptical of work by a government relief organization. She also led by example, and solved certain problems personally when she thought that would be easier than breaking down the algorithms for her computers. Grier says that Blanch in this way personally did work that backed Hans Bethe's Nobel prize-winning model of solar evolution, though it is unclear whether Bethe ever knew that the math had been done by one mathematician rather than by her computers. After the war, Blanch was hampered by FBI suspicions that she was secretly a communist. The FBI's evidence was nearly nonexistent, and in what must have been a remarkable showdown, this diminutive fifty-year-old mathematician demanded, and won, a hearing to clear her name. She worked productively in numerical mathematics and algorithms for the rest of her life, but she remained forever suspicious of digital computers and never adopted them herself.

Grier does excellent research, tracking down surviving computers and sorting through family letters to tell the stories of an entire industry that is being forgotten. He even finds evidence of the working environment of the women computers at the Harvard Observatory in the late 1870s in the lyrics of a satire of Gilbert & Sullivan's HMS Pinafore, written by a junior astronomer there at the time.

The book is beautifully printed and has a comprehensive index. Kudos to the Princeton University Press for taking such pride in their work.

When Computers Were Human is weak in several areas. First, Grier glosses over technical aspects of human computing. What were the algorithms that these people used? How was error-checking implemented? He never tells us. Clearly, Grier's goal was to write a work of history, not math, but the people likely to read it are people who care about the math, or about computers, and he omits material that such readers would expect. Second, this is a bureaucratic story. The best human computing was done by large teams sponsored by government in wartime, and the story of these teams revolves around the politicians or bureaucrats who arranged for their funding, and the various acronym-labeled groups that gave them work or provided their employees. At times, it reads as much like a history of agricultural policies as a text about the prehistory of computers.
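
He never tells us, but one standard technique of the table-making era, differencing, gives the flavor: a smooth function tabulated at equal steps has higher-order finite differences that are tiny and slowly varying, so a single copying error stands out as a sharp spike. A minimal Python sketch of the idea (not any particular team's procedure):

    import math

    def differences(values):
        """One round of finite differencing: v[i+1] - v[i]."""
        return [b - a for a, b in zip(values, values[1:])]

    def find_suspects(values, order=4, tolerance=1e-4):
        """Flag indices where the order-th difference spikes -- a likely typo."""
        d = values
        for _ in range(order):
            d = differences(d)
        return [i for i, v in enumerate(d) if abs(v) > tolerance]

    table = [math.sin(i * 0.01) for i in range(100)]   # a clean sine table
    table[40] += 0.001                                 # simulate a copying error
    print(find_suspects(table))                        # [36, 37, 38, 39, 40] brackets the bad entry

Checks of this kind catch errors from the numbers themselves, without recomputing anything, which is one reason equal-step tables were prized.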

Third, Grier's story follows his sources: he devotes space to the groups where he has the most material, even if others may have been larger or done more important work. Finally, his discussion of digital computers, where they play a role in the story, is cursory, and may not give credit to those who deserve it.

Is it worth reading? Yes. Consider the Amazon.com reviews of the final tables published by the Bureau of Standards: in comments as recent as 2004, people who are still using these 50-year-old volumes comment in several languages on which chapters of the books are most useful, where to beware of errors or outdated methods, and on the special emotional role that these volumes play for those who use them, or who needed them in the past. "I probably would never have gotten my Ph.D without this book, and it is a stupendous classic." "Nearly every time you need a mathematical relation or information you will find it on this book." "If you work with mathematical research or numerical computing, you must have this book," and so forth. This praise, and Grier's book, are fine testaments to the world's first computers.


You can purchase When Computers Were Human from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments Filter:
  • by imsabbel ( 611519 ) on Tuesday July 05, 2005 @03:28PM (#12987962)
    That's not parallel computing, that's pipelining.
    But still fascinating, as it is used in modern CPUs for the very same reasons it was used back then, only at totally different scales...
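
To make the distinction concrete: in a pipeline, each worker repeats one fixed step while partial results flow from stage to stage, much as sheets passed from team to team; parallelism would instead give each worker a whole problem. A tiny Python sketch with made-up stage functions:

    # Each generator is one "team" that performs a single fixed step;
    # partial results flow from stage to stage like sheets of figures.
    def stage_square(xs):
        for x in xs:
            yield x * x

    def stage_add_one(xs):
        for x in xs:
            yield x + 1

    inputs = range(5)
    pipeline = stage_add_one(stage_square(inputs))   # x -> x*x -> x*x + 1
    print(list(pipeline))                            # [1, 2, 5, 10, 17]
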
  • Re:Sci-Fi Novel (Score:3, Informative)

    by Rosco P. Coltrane ( 209368 ) on Tuesday July 05, 2005 @03:30PM (#12987978)
    Actually, there's a good sci-fi novel called "Dune" in which a class of humans, called "mentats," receives intensive training to be able to perform complex computations.

    From what I remember, there are hardly any machine computers in Dune. The empire has great technology and all, but it's all manned (space travel by the members of the Spacing Guild, calculations by mentats, telepathy by the Bene Gesserit)...
  • Re:Slide rules... (Score:5, Informative)

    by dasunt ( 249686 ) on Tuesday July 05, 2005 @03:30PM (#12987981)
    You can keep your slide rule, and I'll keep my TI, which can calculate sin, cos, and tan, as well as e and pi, to 10 digits.

    Let's let Wikipedia rebut this:

    Advantages: A slide rule tends to moderate the fallacy of "false precision" and significance. The typical precision available to a user of a slide rule is about three places of accuracy. This is in good correspondence with most data available for input to engineering formulas (such as the strength of materials, accurate to two or three places of precision, with a great amount--typically 1.5 or greater--of safety factor as an additional multiplier for error, variations in construction skill, and variability of materials). When a modern pocket calculator is used, the precision may be displayed to seven to ten places of accuracy while in reality, the results can never be of greater precision than the input data available.
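
The false-precision point is easy to demonstrate: if the inputs are good to only three significant figures, the extra digits a calculator displays are decoration. A small Python illustration (the input values are made up):

    from math import floor, log10

    def sigfig(x: float, n: int = 3) -> float:
        """Round x to n significant figures, slide-rule style."""
        if x == 0:
            return 0.0
        return round(x, n - 1 - floor(log10(abs(x))))

    load = 1.23e4       # newtons, known to ~3 significant figures
    area = 4.56e-3      # square metres, same precision

    stress = load / area
    print(stress)           # ~2697368.42... -- mostly false precision
    print(sigfig(stress))   # 2700000.0 -- all the inputs actually support
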
  • Ballistics (Score:2, Informative)

    by layer3switch ( 783864 ) on Tuesday July 05, 2005 @03:51PM (#12988148)
    During World War I, naval ships, mainly battleships relying on long-range artillery such as the Dreadnought, used human computation for artillery trajectories. A dreadnought with eight 15-inch guns capable of firing a 1,920-pound projectile 35,000 yards (about 20 miles), and with steam turbines reaching a speed of twenty-one knots, gained the edge to win battles through the precision of ballistic fire from great distances.

    Having said that, I believe some of the points the article brought up downplay the importance of those "human computers" in some ways.

    I believe those who filled the occupation of "human computer" led the way to greater precision and to more reliable, faster computation, if not to lives saved.
  • by Anonymous Coward on Tuesday July 05, 2005 @03:57PM (#12988198)
    I was fortunate to find, in a used book store, an 1868 pocket Webster's Dictionary. On a lark, after looking for all the dirty words (there were none), I looked up "computer". Sure enough, there it was: "one who computes".
  • by Mac Scientist ( 153390 ) on Tuesday July 05, 2005 @04:01PM (#12988235)
    Reminds me of the Asimov story "The Feeling of Power," written in 1958. People of the future, who are totally reliant on personal computers, experience wonder at being able to do arithmetic by hand.

    Are we there yet?
  • Re:Sci-Fi Novel (Score:3, Informative)

    by hazem ( 472289 ) on Tuesday July 05, 2005 @04:11PM (#12988304) Journal
    Yes, it was a result of the Butlerian Jihad. There was a big war about "thinking machines" and they were banned. Thus, mentats. I'm not sure, however, if the machines were combatants in the war, or just the subject of it.

    Even Star Trek treats the idea in Insurrection. One of the guys living on the paradise planet says, "When you create a machine to do the work of a man, it diminishes the man".

    The people on this project were not mentat-like. They were more like op-amps in a funky human computer.
  • Re:Grier? (Score:4, Informative)

    by poot_rootbeer ( 188613 ) on Tuesday July 05, 2005 @04:25PM (#12988426)
    People could add but not subtract? They could know what a positive number is, but not a negative?

    There was a time when this was true of YOU, y'know.

    Granted, these days most of us in industrialized nations move on and grok subtraction and negative numbers by second grade, but it doesn't seem unreasonable that three-quarters of a century ago, some unskilled workers might have made it to adulthood without getting that far.
  • Re:Slide rules... (Score:5, Informative)

    by bcrowell ( 177657 ) on Tuesday July 05, 2005 @04:52PM (#12988666) Homepage
    I use a slide rule rather than a calculator or computer in situations where it's appropriate. I have a cute little one I carry in my pants pocket; it comes in very handy. Here [wikipedia.org] is some discussion of the advantages of slide rules. Actually, there's quite a big community of people who like slide rules, and nice ones tend to go for quite a bit of money on eBay. When's the last time you actually needed to calculate something to eight decimal places?
  • by MattJ ( 14813 ) on Tuesday July 05, 2005 @04:56PM (#12988705) Homepage
    The Smithsonian has a great interview with Ida Rhodes, who assisted Blanch.
    Here [smithsonian.org].

  • Re:Sci-Fi Novel (Score:2, Informative)

    by bigsmoke ( 701591 ) <bigsmoke@gmail.com> on Tuesday July 05, 2005 @05:39PM (#12989048) Homepage Journal

    Wikipedia contains more information on why there are no computing machines in the Dune universe [wikipedia.org]: the Butlerian Jihad [wikipedia.org] was a crusade for the destruction of computers, robots, and anything that tries to replace the human mind with a machine (artificial intelligence).

    This battle for supremacy of humans and sentient machines is described in Dune: The Butlerian Jihad [wikipedia.org], one of the prequels by Brian Herbert and Kevin J. Anderson.

  • by Pentagram ( 40862 ) on Tuesday July 05, 2005 @05:49PM (#12989121) Homepage
    On the subject of fiction, it reminded me of an Arthur C. Clarke short story called "Into the Comet". A spacecraft's dodgy computer gets replaced by a beowulf cluster of people with abacuses.
  • by peter303 ( 12292 ) on Tuesday July 05, 2005 @05:52PM (#12989145)
    I remember the early years of computer science as being a secretarial/trade-school kind of thing. I remember MIT and Stanford faculty debates as to whether they should even offer an undergraduate major in computer science, because it was considered too "vocational". If you were a Stanford student in comp sci, you got a "stealth degree" as a minor in the math department. At MIT they hid it in electrical engineering and STILL HAVEN'T granted it independent department status, even though at the height of the computer science boom one third of undergraduates majored in this option. Even now MIT refuses to teach a practical introductory computer science course. Their first course has been based on LISP since the late 1960s and still uses the dialect called Scheme.
  • wrong (Score:3, Informative)

    by goombah99 ( 560566 ) on Wednesday July 06, 2005 @12:01AM (#12991243)
    The Great Depression was caused by the inability of industry to raise money for expansion, and by the lack of consumer liquidity. Which is a complicated way of saying debt became expensive. As the government borrows more, the expense of debt grows. Taxes go up and infrastructure goes unmaintained. The price of goods rises and industries collapse for lack of viable markets. Voila: the depression cycle that starts with a loss of liquidity.

    Ironically, the only reason we have low interest rates right now is the influx of Chinese trade dollars into our debt markets. That will dry up ten seconds after the Chinese currency floats. The debt, however, will remain, and will have to be serviced on the backs of the next generation of income earners.

"Experience has proved that some people indeed know everything." -- Russell Baker

Working...