
When Computers Were Human

stern writes "In the not-so-distant past, engineers, scientists and mathematicians routinely consulted tables of numbers for the answers to questions that they could not solve analytically. Need sin(0.4)? No problem: look it up in the sine table. These tables were prepared by teams of people called computers (no, really -- that's where the term comes from) who typically had only rudimentary math skills. The computers were overseen by more knowledgeable mathematicians, who designed the algorithms and supervised their work." Read below for Stern's review of David Alan Grier's book When Computers Were Human.
When Computers Were Human
Author: David Alan Grier
Pages: 424 (with index and table of names)
Publisher: Princeton University Press
Rating: worth reading
Reviewer: Stern
ISBN: 0691091579
Summary: A history of the first "computers", semi-literates who did math by hand
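
A concrete aside on the sin(0.4) example above: a printed table carried values only at fixed steps, so the user found the two entries bracketing the argument and interpolated between them by hand. Here is a minimal Python sketch of that lookup routine (my illustration, not anything reproduced from the book):

    import math

    STEP = 0.1
    # a coarse "printed" table: sin(x) at 0.0, 0.1, ..., 1.5 radians
    table = [math.sin(i * STEP) for i in range(16)]

    def sin_from_table(x):
        """Linearly interpolate between adjacent table entries."""
        i = int(x / STEP)                  # entry at or just below x
        x0, x1 = i * STEP, (i + 1) * STEP
        y0, y1 = table[i], table[i + 1]
        return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

    print(sin_from_table(0.43))   # interpolated: ~0.41642
    print(math.sin(0.43))         # true value:   ~0.41687

Finer step sizes and higher-order interpolation bought more correct digits, which is exactly what made the big projects' tables so valuable.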

The most important of these teams was the Mathematical Tables Project, organized by the Work Projects Administration in the United States during the Great Depression. WPA rules required the hiring of people with virtually no skills, so much of the definitive work of the Mathematical Tables Project was computed by people who had mastered only addition. They were not authorized to subtract, let alone delve into the mysteries of multiplication or division. The algorithmic steps assigned to them sometimes produced negative numbers, and it goes almost without saying that these computers had no idea what these were or how to handle them. Gertrude Blanch, the mathematician who oversaw their work, had devised a scheme whereby positive numbers would be written in black, negative numbers in red. On the wall in front of her human computers hung a poster that encapsulates much of the era of human computing. It read:

Black plus black is black
Red plus red is red
Black plus red or red plus black, hand the sheets to team 2
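
How anything useful could be computed with addition alone deserves a word. Table makers leaned on the method of differences: a planner worked out the first value of a polynomial and its leading differences, and from there every subsequent entry falls out of cascaded additions. A minimal sketch of the idea, mine rather than anything quoted from Grier:

    def tabulate(first_row, n):
        """first_row = [f(x0), d1, d2, ..., dk]: the starting value and
        leading forward differences of a degree-k polynomial.
        Returns the next n table values using nothing but addition."""
        row = list(first_row)
        values = []
        for _ in range(n):
            values.append(row[0])
            for i in range(len(row) - 1):   # fold each difference upward
                row[i] += row[i + 1]
        return values

    # f(x) = x^2 at x = 0, 1, 2, ...: start 0, first difference 1, second 2
    print(tabulate([0, 1, 2], 6))   # -> [0, 1, 4, 9, 16, 25]

The planners' whole craft was reducing a function to steps like these; the black-and-red poster handled the one wrinkle, negative intermediate values, that pure addition could not hide.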

Grier has written a history of human computing. It begins in the 1760s and continues for two hundred years, until digital computers ended the industry.

From the start, computers were dedicated to projects in astronomy, cartography, and navigation. Grier describes the nature of these problems and why they required numerical solutions. He touches on the alternating competition and cooperation between teams of computers in different countries, and the different organizational models they employed. Perhaps the most memorable fact from the early years of human computing is that the very first team of French computers, assembled by Gaspard Clair François Marie Riche de Prony in the early 1790s, was composed entirely of wig-makers left unemployed by the French Revolution. They created trigonometric tables required by France's experiments with the decimalization of trigonometry (an abandoned effort to do for angle measure what the metric system was doing for the measurement of mass, length, and so forth).

Their work, though of little ultimate relevance to the modern world, illustrates aspects of human computing that would not change. Major computing efforts were always sponsored by governments. A small number of planners oversaw work by people who themselves knew little math. And the bulk of the work was done by people who were marginalized, perhaps otherwise unemployable, and who would do the repetitive calculations. This work conferred no prestige, and many were skeptical even of the conclusions drawn from it. If an equation could not be properly solved, how could one take confidence from any numerical approximation? Even Henry David Thoreau worked a dig at human computers into the manuscript for Walden, dismissing the mathematics that might allow an astronomer "to discover new satellites of Neptune but not detect the motes in his eyes, or to what vagabond he is a satellite himself."

Women emerged as the most important computers. Demand for computing spiked in wartime, when young men were off fighting and therefore unavailable, and the economics of hiring women was compelling even in peacetime. They would work for half of what similarly skilled men would. By World War II, in the United States, computing power was measured not in megahertz or teraflops, but in kilogirls.

By the 20th century, the work of human computers was augmented by mechanical or even electrical calculators that automated certain steps of their work, but these were expensive and prone to breakdown, and did not significantly change the nature of the work.

Grier devotes special attention to the Mathematical Tables Project run by the WPA, later taken over by the National Bureau of Standards, and to the mathematician Gertrude Blanch who ran that team. She is fascinating: a woman who arrived in the United States at the age of 11, worked to support her family, and was not able to get her Ph.D. until she was 39 years old. It was then 1936, the middle of the Great Depression, and the job prospects for female, Jewish mathematicians were bleak. Through luck and hard work she found her way to the Mathematical Tables Project, where she assumed a role that combined mathematician, schoolteacher, and coach. Her fanatical attention to error-checking resulted in tables good enough to win the support of those who were skeptical of work by a government relief organization. She also led by example, and solved certain problems personally when she thought that would be easier than breaking down the algorithms for her computers. Grier says that Blanch in this way personally did work that backed Hans Bethe's Nobel prize-winning model of solar evolution, though it is unclear if Bethe ever knew that the math had been done by one mathematician, rather than her computers. After the war, Blanch was hampered by FBI suspicions that she was secretly a communist. The Bureau's evidence for this was nearly nonexistent, and in what must have been a remarkable showdown, this diminutive fifty-year-old mathematician demanded, and won, a hearing to clear her name. She worked productively in numerical mathematics and algorithms for the rest of her life, but remained forever suspicious of digital computers and never adopted them herself.

Grier does excellent research, tracking down surviving computers and sorting through family letters to tell the stories of an entire industry that is being forgotten. He even finds evidence of the working environment of the women computers at Harvard Observatory in the late 1870s in the lyrics to a satire of Gilbert & Sullivan's HMS Pinafore, written by a junior astronomer there at the time.

The book is beautifully printed and has a comprehensive index. Kudos to Princeton University Press for taking such pride in their work.

When Computers Were Human is weak in several areas. First, Grier glosses over technical aspects of human computing. What were the algorithms that these people used? How was error-checking implemented? He never tells us. Clearly, Grier's goal was to write a work of history, not math, but the people likely to read it are people who care about the math, or about computers, and he omits material that such readers would expect. Second, this is a bureaucratic story. The best human computing was done by large teams sponsored by government in wartime, and the story of these teams revolves around the politicians or bureaucrats who arranged for their funding, and the various acronym-labeled groups that gave them work or provided their employees. At times, it reads as much like a history of agricultural policies as a text about the prehistory of computers.

Grier's story follows his sources: he devotes space to the groups where he has the most material, even if others may have been larger or done more important work. Finally, his discussion of digital computers, where they play a role in the story, is cursory, and may not give credit to those who deserve it.
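
One footnote on the error-checking question, since Grier leaves it open: a standard device of the table-making trade was to difference the finished column. High-order differences of a smooth function are nearly zero, so a single copied-wrong digit erupts into a large, sign-alternating spike. A rough sketch of that check (my reconstruction of the general technique, not the Project's documented procedure):

    import math

    values = [math.sin(i / 100) for i in range(20)]
    values[9] += 0.001                # plant a one-digit copying error

    diffs = values
    for _ in range(4):                # take fourth differences
        diffs = [b - a for a, b in zip(diffs, diffs[1:])]

    for i, d in enumerate(diffs):
        print(f"{i:2d} {d:+.6f}" + ("   <-- suspect" if abs(d) > 1e-4 else ""))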

Is it worth reading? Yes. Consider the Amazon.com reviews of the final tables published by the Bureau of Standards: in comments as recent as 2004, people who are still using these 50-year-old volumes comment in several languages on which chapters of the books are most useful, where to beware of errors or outdated methods, and on the special emotional role that these volumes play for those who use them, or who needed them in the past. "I probably would never have gotten my Ph.D without this book, and it is a stupendous classic." "Nearly every time you need a mathematical relation or information you will find it on this book." "If you work with mathematical research or numerical computing, you must have this book," and so forth. This praise, and Grier's book, are fine testaments to the world's first computers.


You can purchase When Computers Were Human from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
  • by goombah99 ( 560566 ) on Tuesday July 05, 2005 @03:06PM (#12987749)
    Many of the great tables were compiled during the Depression era as public works projects. Like our bridges and trail systems, we live on that legacy and don't appreciate that it was a one-off event.

    Well, I take that back: George Bush has scheduled the next Depression in about 8 years. See you there in the computer room or the breadline. Your current skills will be worthless during the depression.

    Don't believe me? The national debt has doubled under George. For the current generation that's a debt of about $150,000 per head.

  • by Anonymous Coward on Tuesday July 05, 2005 @03:07PM (#12987756)
    It is very very humbling to think about all those teams sitting around calculating the sine and log for the damned tables.

    I find it has the opposite effect. All I can think about is, what a waste, here's an entire team of people I could replace with a script.
  • by alw53 ( 702722 ) on Tuesday July 05, 2005 @03:59PM (#12988222)
    I think you have your figures wrong. However, this table shows every single Demo president since Carter _decreasing_ the national debt as a percentage of GNP, and every Repo president _increasing_ it. So the common wisdom that Demos overspend vis-a-vis Repos is just wrong, and your point is basically correct even though the number is wrong.

    http://www.skymachines.com/US_National_Debt_Per_Capita_Percent_of_GDP_and_by_President_1976-2004.htm [skymachines.com]
  • by paploo ( 238300 ) on Tuesday July 05, 2005 @04:16PM (#12988337)
    $7.8e12 / 2.96e8 = $26,351.35 per person.

    -2 points for violation of significant figures. (Yeah, I've been a physics TA before). :)

    Seriously, though, this is a (petty and pedantic) pet peeve of mine. You have two sig figs on one number and three on the other. How the hell did you get precision to the nearest penny? You should have $26,000 per person, but if I were grading I'd also accept $26,400 per person since basic sig-fig rules aren't precise anyway. (You need error analysis techniques to do better!) :)
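
    For anyone grading along at home, here is a small helper (my own, nothing standard) that applies the rounding the parent describes:

        from math import floor, log10

        def round_sig(x, sig):
            """Round x to `sig` significant figures."""
            return 0.0 if x == 0 else round(x, sig - 1 - floor(log10(abs(x))))

        debt, population = 7.8e12, 2.96e8        # two and three sig figs
        print(round_sig(debt / population, 2))   # -> 26000.0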
  • Re:progress? (Score:3, Insightful)

    by Gzip Christ ( 683175 ) on Tuesday July 05, 2005 @04:45PM (#12988604) Homepage
    So instead of asking a hunk of plastic and metal for answers to math problems, I would have been asking a room full of educated unmarried women? This is progress!?!?!
    It is for the women.
  • by arete ( 170676 ) <xigarete+slashdot@nosPam.gmail.com> on Tuesday July 05, 2005 @05:38PM (#12989039) Homepage
    Your example IS the example of why sigfigs are inherently tricky. Your answer is right - and in addition, decimal places will always get you a reasonable answer. But you've actually increased the number of significant figures (from 3 to 4) - AND that's actually the right answer.

    You get into really, really big problems when you mix floating-point and integer math, and the calculator can't know which one you're doing. The basic problem comes from the fact that "integer" precision is commonly notated the same way as "no precision at all".

    Here's some interesting examples:
    If I divide 1 by 2, the answer should be .5 - so your "dp" thing doesn't work at all - we added a dp of precision.

    If I multiply 5 x .5 as decimals, 3 is probably the right answer, IF you can guarantee there are no additional sig figs. But if I entered .5 as shorthand for the exact fraction 1/2, then 2.5 is absolutely the right answer.

    If I entered 5 x .5 in a calculator and got 3, I would return it immediately.

    Furthermore, 2 1/2 is _probably_ the right answer, because 1/2 is only a small fraction of a significant figure. But your calculator only knows how to display "1/2" as .5; it has no way of displaying or expressing fractional significant figures.

    Going the other way is even worse. Unless you're going to make everyone enter everything in scientific notation (which will never happen), there's DEFINITELY no way to differentiate between estimated and exact values. What's the right answer to 80 x 90? 7200, or 7000? It depends on whether those zeros were significant zeros or placeholders...

    Finally, some really, really crazy things start to go on when you have exponents and the like. Very commonly you get cases where you probably meant the base of the exponent to be an integer even if some part of the exponent itself is a decimal, because 2.0 ^ 32.0 has NO significant digits unless the 2 is actually an integer, even if the 32 DOES have infinite precision (for instance: 2^32 ~ 4 bil, 1.96^32 ~ 2 bil, 2.04^32 ~ 8 bil).

    But nonetheless not EVERY exponent is supposed to be an integer, especially when you're simply squaring something (Pythagorean theorem on an arbitrary length, anyone?).

    You really need a calculator that is very advanced - not to do the math, but to have input and display that can reasonably interact with how poorly the PEOPLE using them know sigfigs - and how poor an idea the PEOPLE usually have about their input method.

    I've never seen a calculator with an _interface_ that could handle it. I actually think it might be easiest to do in a software calculator (even if my hardware calculator were better at some of the actual math).
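
    A quick numerical check of the exponent point above (my arithmetic, offered as illustration): a two-percent wobble in the base swings x^32 by roughly a factor of two in each direction.

        for base in (1.96, 2.00, 2.04):
            print(base, f"{base ** 32:.2e}")
        # 1.96 -> ~2.2e9,  2.00 -> ~4.3e9,  2.04 -> ~8.1e9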

  • by pete6677 ( 681676 ) on Tuesday July 05, 2005 @05:50PM (#12989127)
    I fully agree that calculators are over-used in most high school math classes, but I think this is going a bit too far. There's nothing wrong with allowing trig students to use basic scientific (non-graphing) calculators. I can't imagine how making them slog through old log tables would make students more productive or give them a deeper understanding. Yes, a student should be able to approximate the sin, cos, etc. in fraction form without a calculator, and by all means should be able to do simple math in their heads, but I think it is counter-productive to make high school seniors do long division or mess with log tables.
