
When Computers Were Human

stern writes "In the not-so-distant past, engineers, scientists and mathematicians routinely consulted tables of numbers for the answers to questions that they could not solve analytically. Sin(.4)? No problem: look it up in the Sine table. These tables were prepared by teams of people called computers (no, really -- that's where the term comes from) who typically had only rudimentary math skills. The computers were overseen by more knowledgeable mathematicians, who designed the algorithms and supervised their work." Read below for Stern's review of David Alan Grier's book When Computers Were Human.
When Computers Were Human
Author: David Alan Grier
Pages: 424 (with index and table of names)
Publisher: Princeton University Press
Rating: worth reading
Reviewer: Stern
ISBN: 0691091579
Summary: A history of the first "computers", semi-literates who did math by hand

The most important of these teams was the Mathematical Tables Project, organized by the Work Projects Administration in the United States during the Great Depression. WPA rules required the hiring of people with virtually no skills, so much of the definitive work of the Mathematical Tables Project was computed by people who had mastered only addition. They were not authorized to subtract, let alone delve into the mysteries of multiplication or division. The algorithmic steps assigned to them sometimes produced negative numbers, and it goes almost without saying that these computers had no idea what negative numbers were or how to handle them. Gertrude Blanch, the mathematician who oversaw their work, devised a scheme whereby positive numbers would be written in black, negative numbers in red. On the wall in front of her human computers hung a poster that encapsulates much of the era of human computing. It read:

Black plus black is black
Red plus red is red
Black plus red or red plus black, hand the sheets to team 2
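
In modern terms, Blanch's rule is a tiny dispatch function. Here is a minimal Python sketch of the idea; the function and team labels are my own invention for illustration, not anything taken from the book or from the Project's actual procedures:

    def route_addition(a, b):
        # Positive ("black") and negative ("red") operands, per Blanch's color scheme.
        if (a >= 0) == (b >= 0):
            # Black plus black is black; red plus red is red:
            # same-sign sums stay with the addition-only computers (team 1).
            return ("team 1", a + b)
        # Black plus red, or red plus black: hand the sheets to team 2.
        return ("team 2", None)

    print(route_addition(3, 4))    # ('team 1', 7)
    print(route_addition(-3, -4))  # ('team 1', -7)
    print(route_addition(3, -4))   # ('team 2', None)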

Grier has written a history of human computing. It begins in the 1760s and continues through the two hundred years until digital computers ended the industry.

From the start, computers were dedicated to projects in astronomy, cartography, and navigation. Grier describes the nature of these problems and why they required numerical solutions. He touches on the alternating competition and cooperation between teams of computers in different countries, and the different organizational models they employed. Perhaps the most memorable fact from the early years of human computing is that the very first team of French computers, assembled by Gaspard Clair Francois Marie Riche de Prony in the early 1790s, was composed entirely of wig-makers left unemployed by the French Revolution. They created trigonometric tables required by France's experiments with the decimalization of trigonometry (an abandoned effort to do for angle measure what the metric system was doing for the measurement of mass, length, and so forth).

Their work, though of little ultimate relevance to the modern world, illustrates aspects of human computing that would not change. Major computing efforts were always sponsored by governments. A small number of planners oversaw work by people who themselves knew little math. And the bulk of the work was done by people who were marginalized, perhaps otherwise unemployable, and who would do the repetitive calculations. This work conferred no prestige, and many were skeptical even of the conclusions drawn from it. If an equation could not be properly solved, how could one take confidence from any numerical approximation? Even Henry David Thoreau worked a dig at human computers into the manuscript for Walden, dismissing the mathematics that might allow an astronomer "to discover new satellites of Neptune but not detect the motes in his eyes, or to what vagabond he is a satellite himself."

Women emerged as the most important computers. Demand for computing spiked in wartime, when young men were off fighting and therefore unavailable, and the economics of hiring women was compelling even in peacetime. They would work for half of what similarly skilled men would. By World War II, in the United States, computing power was measured not in megahertz or teraflops, but in kilogirls.

By the 20th century, the work of human computers was augmented by mechanical or even electrical calculators that automated certain steps of their work, but these were expensive and prone to breakdown, and did not significantly change the nature of the work.

Grier devotes special attention to the Mathematical Tables Project run by the WPA, later taken over by the National Bureau of Standards, and to the mathematician Gertrude Blanch who ran that team. She is fascinating: a woman who arrived in the United States at the age of 11, worked to support her family, and was not able to get her Ph.D. until she was 39 years old. It was then 1936, the middle of the Great Depression, and the job prospects for female, Jewish mathematicians were bleak. Through luck and hard work she found her way to the Mathematical Tables Project, where she assumed a role that combined mathematician, schoolteacher, and coach. Her fanatical attention to error-checking resulted in tables good enough to win the support of those who were skeptical of work by a government relief organization. She also led by example, and solved certain problems personally when she thought that would be easier than breaking down the algorithms for her computers. Grier says that Blanch in this way personally did work that backed Hans Bethe's Nobel Prize-winning model of solar evolution, though it is unclear if Bethe ever knew that the math had been done by one mathematician, rather than her computers. After the war, Blanch was hampered by FBI suspicions that she was secretly a communist. Their evidence for this was nearly nonexistent, and in what must have been a remarkable showdown, this diminutive fifty-year-old mathematician demanded, and won, a hearing to clear her name. She worked productively in numerical mathematics and algorithms for the rest of her life, but remained forever suspicious of digital computers and never adopted them herself.

Grier does excellent research, tracking down surviving computers and sorting through family letters to tell the stories of an entire industry that is being forgotten. He even finds evidence of the working environment of the women computers at the Harvard Observatory in the late 1870s in the lyrics of a satire of Gilbert & Sullivan's HMS Pinafore, written by a junior astronomer there at the time.

The book is beautifully printed and has a comprehensive index. Kudos to the Princeton University Press for taking such pride in their work.

When Computers Were Human is weak in several areas. First, Grier glosses over technical aspects of human computing. What were the algorithms that these people used? How was error-checking implemented? He never tells us. Clearly, Grier's goal was to write a work of history, not math, but the people likely to read it are people who care about the math, or about computers, and he omits material that such readers would expect. Second, this is a bureaucratic story. The best human computing was done by large teams sponsored by government in wartime, and the story of these teams revolves around the politicians or bureaucrats who arranged for their funding, and the various acronym-labeled groups that gave them work or provided their employees. At times, it reads as much like a history of agricultural policies as a text about the prehistory of computers.

Third, Grier's story follows his sources: he devotes space to the groups where he has the most material, even if others may have been larger or done more important work. Finally, his discussion of digital computers, where they play a role in the story, is cursory, and may not give credit to those who deserve it.

Is it worth reading? Yes. Consider the Amazon.com reviews of the final tables published by the Bureau of Standards: in reviews as recent as 2004, people who are still using these 50-year-old volumes comment in several languages on which chapters of the books are most useful, where to beware of errors or outdated methods, and on the special emotional role that these volumes play for those who use them, or who needed them in the past. "I probably would never have gotten my Ph.D without this book, and it is a stupendous classic." "Nearly every time you need a mathematical relation or information you will find it on this book." "If you work with mathematical research or numerical computing, you must have this book," and so forth. This praise, and Grier's book, are fine testaments to the world's first computers.


You can purchase When Computers Were Human from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments:

  • by Fjornir ( 516960 ) on Tuesday July 05, 2005 @02:55PM (#12987623)
    You can have my circular slide-rule when you pry it from my cold dead fingers.
    • by Anonymous Coward on Tuesday July 05, 2005 @03:03PM (#12987723)
      According to my actuarial table, if you are still using a circular slide rule, I may not have very long to wait.

    • Oh, don't worry, the robots have a mighty strong grip and you're a much less efficient fuel source clutching that slide rule.
    • Probably a joke killer, but I'm curious: can you legitimately call it a slide-rule if it's not straight? I always thought the "rule" part of "slide-rule" was meant to mean ruler, or straight-edge measuring device.
      • I don't know what he is talking about, but in aviation, many people use a tool called "circular slide rule" to flight plan (it helps determine air speed, ground speed, fuel, weight and balance, and many other numbers that are useful for aviating).
    • by slapout ( 93640 ) on Tuesday July 05, 2005 @04:08PM (#12988278)
      If anyone's interested, there are several sites with instructions on creating your own slide rule.

      http://www.sphere.bc.ca/test/build.html [sphere.bc.ca]

      http://solar.physics.montana.edu/kankel/math/csr.html [montana.edu]

      etc.
    • clicky [teacherlink.org]
  • by cloudofstrife ( 887438 ) on Tuesday July 05, 2005 @02:58PM (#12987663)
    Now what's the percentage of the businesses/governments that used open source software/algorithms on their human computers?
    • Now what's the percentage of the businesses/governments that used open source software/algorithms on their human computers?

      That's a good question. I would guess that if you exclude processes related to each company's unique business logic, the proportion of OSS software used by their human computers was probably very high. The mathematical needs of most businesses are probably quite similar, so all of the computers would likely be using public domain algorithms. No one has patented basic mathematics yet..
  • Imagine... (Score:2, Funny)

    by Anonymous Coward
    ...a beowulf cluster of those!

    (p.s. I'm not wasting perfectly good karma on this)
    • You can; it is called a bureaucracy. It doesn't work well.
    • On an old episode of Dr Who, the Doctor went to a planet of human calculators (to try to get the TARDIS chameleon circuits working again, I think). Everybody on the whole planet was sitting around with an abacus in their hand.
      • I also seem to remember a sci-fi short story that involved a war with Mars, missiles, and computations for guidance.

        Nobody knew how to do math. It seems the scandal of the story was a man who could do the math without a calculator - so the government wanted him to teach others how to do math so they could put people (expendable) into the rockets, instead of expensive computers.

        It's all kind of a vague memory, so the details might be all wrong.
  • by Foolomon ( 855512 ) on Tuesday July 05, 2005 @02:59PM (#12987665) Homepage
    But can they boot up with Linux? And when the supervisory mathematicians added a new table for them to use, did you have to recompile them? :D
  • ...am glad to see that David Allan has moved onto other things after his long career with M.A.S.H. in the 70s. ;P
  • by aftk2 ( 556992 ) on Tuesday July 05, 2005 @03:00PM (#12987681) Homepage Journal
    Did he write this book before or after his seminal work on "In Living Color": When Television Was Funny.
    • by Qzukk ( 229616 ) on Tuesday July 05, 2005 @03:16PM (#12987835) Journal
      One would think that, with a naming convention that allows two or more alphabetic names plus the possibility of a trailing number, parents would manage to name the people they create in a non-colliding fashion. Obviously we need to create namespaces to further subdivide the population of names to help disambiguate such conflicts.

      I propose that we begin using a word to identify said namespaces. Let's call it a "title". When we refer to a specific person, we then refer to them by title. For example, and I'm just making this up here, we may want to have several committee meetings before we settle on these namespace titles, we could refer to this person as "Comedian David Alan Grier". This would disambiguate references to that person from another person... let's call him "Professor David Alan Grier".

      Of course this is just an idea in formation stages. We'll need to hold off on any action until we have an RFC with approvals from the appropriate naming organizations and an ISO standard to help ensure worldwide compatibility.
  • Truly amazing... (Score:5, Interesting)

    by Gopal.V ( 532678 ) on Tuesday July 05, 2005 @03:01PM (#12987689) Homepage Journal
    It is very very humbling to think about all those teams sitting around calculating the sine and log for the damned tables. I hated to even use a slide-rule or the log tables - the only thing I could do in my head was approximate square roots. These are the real pioneers who made most of modern engineering math possible.

    The more interesting part is the title rather than the blurb, though. It sounded almost like when men were men, women were women and small furry creatures from Alpha Centauri were small furry creatures. Sadly, this seems to be a story about the people who bothered the so-called computers rather than a story of grit and glory - a story of bureaucracy and communist witch hunts?

    • Re:Truly amazing... (Score:3, Interesting)

      by quarkscat ( 697644 )
      The parent and /. review reminds me about my time working as a sub-subcontractor on the Hubble Space Telescope. The development teams for the science instrument packages that were to upgrade the HST (prior to the SST accident) would check the output of Oracle database stored procedures by comparing trig functions with those from a 20-year-old trig tables book.

      If you thought proofreading the book in the grandparent /. post book review was tedious, imagine having to proofread the data tables in that 20-year-old trig tables book.
    • It will be very humbling to think about people working in car factories, restaurants, Walmart, etc. in an age where robots do all the manual labor and people just play sports, put on theatre plays, or pretty much just keep busy with what the English snob lord community did when they went to their clubs and never did a day's work. Man, if I tell them about this at my club!
  • should be:
    Black plus black is black
    Red plus red is red
    Black plus red or red plus black, hand the sheets to team 2
    In case anyone was confused by the lack of the line break.
    • No, I was wondering more about THIS:
      Women emerged as the most important computers.
      <br>
      Demand for computing spiked in wartime
      ... so what he's saying is the demand for computers a.k.a. women went DOWN after the war. So all that "make love, not war" stuff back in the '60s was really gay-on-gay propaganda.

      Wow. Whodathunkit?

  • My God! (Score:5, Funny)

    by ShaniaTwain ( 197446 ) on Tuesday July 05, 2005 @03:01PM (#12987698) Homepage
    The San Diego Supercomputer is made of people! You've got to tell them! San Diego Supercomputer is people!
  • I wonder if he'll get 'Twan to write another review for his book.

    The reviewer clearly should have rated this book "Two snaps in a Z formation"
  • by Iriel ( 810009 ) on Tuesday July 05, 2005 @03:02PM (#12987707) Homepage
    So if computers were originally human, does that put the brain under the GPL license or are we strictly proprietary hardware?
  • Dear Old Mum (Score:5, Interesting)

    by Stanistani ( 808333 ) on Tuesday July 05, 2005 @03:02PM (#12987710) Homepage Journal
    My mother was one of those computers - she worked in England during WWII, using a 'comptometer' and had no idea what she was computing, despite hearing random roaring noises from elsewhere in the facility, until one fine day she was introduced to a Mr. Whittle, who had designed one of the first jet engines for Great Britain.
    • Re:Dear Old Mum (Score:2, Interesting)

      by tritesnikov ( 808734 )
      My grandmother actually has a comptometer that I played with when I was younger. I haven't used it in years, but it was funky how it actually worked mechanically given that I only knew electronic calculators. You had to do some funny stuff for subtracting, I think you had to hold a lever down and use a number one less than what you were subtracting, but it worked.
      • You had to do some funny stuff for subtracting, I think you had to hold a lever down and use a number one less than what you were subtracting, but it worked.

        Yeah, it's called a carry...
    • Re:Dear Old Mum (Score:3, Interesting)

      by renehollan ( 138013 )
      I once owned a comptometer.

      It weighed about 40 lbs. (about 18 kg.) and had lots of mechanical buttons, circular mechanical readouts (think car's odometer), and gears, all housed in a neat, if heavy, desktop box. It was about the size of a manual typewriter (though it had an AC power cord).

      It could add, but arguably, some fast humans could probably add faster in their heads.

  • by DanielMarkham ( 765899 ) on Tuesday July 05, 2005 @03:03PM (#12987713) Homepage
    This sounds like a demeaning, brutal job. Almost like a factory for addition. Can you imagine what these folks talked about when they went home at night?
    "Had a bunch of sevens at the plant today. Thought we'd never add them all up."
    There's a slide-rule connection here. Oddly enough, numbers that couldn't be computed on a slide rule were deemed irrational. For those interested in slide rules, here's a short history of the slide rule [hpmuseum.org] and here's a guy's collection of slide rules [eyrie.org].

    Microsoft Taken To Task On Hiring Practices [whattofix.com]
  • Did they feed them pi?
  • The beowulf cluster!

    ;)

  • Reminds me of "Souls In The Great Machine", a book I read a little while ago. In it, a giant computer is made in a similar way that this describes, sort of, although not all the components are there voluntarily.
  • progress? (Score:5, Funny)

    by colmore ( 56499 ) on Tuesday July 05, 2005 @03:05PM (#12987744) Journal
    So instead of asking a hunk of plastic and metal for answers to math problems, I would have been asking a room full of educated unmarried women?

    This is progress!?!?!
    • Re:progress? (Score:3, Insightful)

      by Gzip Christ ( 683175 )
      So instead of asking a hunk of plastic and metal for answers to math problems, I would have been asking a room full of educated unmarried women? This is progress!?!?!
      It is for the women.
    • Well, at least the hunk of plastic and metal doesn't make you buy it dinner and a movie first...
  • My high school math teacher had worked on Concorde. He mentioned how they also had a roomful of women "computers" to do various calculations for them.
  • The next title in the series will be: When Computers Were Machines...
  • Babbage (Score:3, Interesting)

    by ch-chuck ( 9622 ) on Tuesday July 05, 2005 @03:19PM (#12987863) Homepage
    Tables calculated by humans also contained a lot of human errors - I understand Charles Babbage was so frustrated by errors in human calculated tables that he wished for some way they could be calculated "by steam" (engine/machine).

  • by LouisvilleDebugger ( 414168 ) on Tuesday July 05, 2005 @03:19PM (#12987867) Journal
    Feynman is credited with an early application of parallel processing in the way he divided up his "girls" to do the yield calculations for the first atomic bomb, while they were waiting for IBM machines to be set up at Los Alamos during the Manhattan Project. Instead of each girl doing one whole equation herself, he divided the work so that one girl would only do a single kind of operation (such as cube roots.) In his memoir, "Surely You're Joking, Mr. Feynman," he writes that with this scheme he was able to get the predicted speed of the IBM machines out of his human computers. "The difference was that the machine didn't get tired and could work three shifts. But the girls got tired after awhile."
    • That's not parallel computing, that is pipelining.
      But still fascinating, as it is used in modern CPUs for the very same reasons it was used back then, only on totally different scales...
    • Feynman isn't credited with that or indeed a lot of things in "Surely you're joking Mr Feynman" and the other one the title of which escapes me. Feynman credits himself with many of those things. I'm not disputing his credentials as a great scientist, for sure he is universally recognised for those things, and as an influential thinker (especially in self-professed "geek" circles) but even the man's best friends would and indeed on many occasions have pointed out his proclivity for self-promotion and tenden
      • Feynman isn't credited with that or indeed a lot of things in "Surely you're joking Mr Feynman" and the other one the title of which escapes me.

        I think it was called "Mr. Feynman, How Come You're So Awesome?"
  • Los Alamos (Score:3, Interesting)

    by Muhammar ( 659468 ) on Tuesday July 05, 2005 @03:20PM (#12987873)
    Most of the tedious calculations in wartime Los Alamos were done by "clever boys with engineering skills and a high school diploma" who were drafted into the army and then assigned to Los Alamos duty.

    Everybody there was doing the calculations on simple electromechanical "Marchant" calculators, which had the unpleasant tendency to break down a lot. (They also used slide rules to get quick first approximations.) Eventually they purchased a great number of card-punching machines from IBM (designed for bank account tabulations) and adapted them for iterative numerical calculations by putting them into a *cycle* - a revolutionary idea at the time.

    This still required lots of people to feed the cards into the machines at each step, and the stacks of cards went round very, very slowly. The biggest problem with these calculations was that at this point the boys were pretty bored with the job. When they were told what they were actually working on, their productivity increased ninefold!

    A very entertaining recollection of this computing history is in "Los Alamos from Below" in "Surely You're Joking, Mr. Feynman"
  • Asimov Short Story (Score:5, Interesting)

    by CrazyWingman ( 683127 ) on Tuesday July 05, 2005 @03:21PM (#12987890) Journal
    There is a great short story by Asimov, in which many years in the future, man has forgotten how to do math without an electronic computer. It then happens one day that a young man figures out a process for doing addition and multiplication on paper, and shows off his new methods to a bunch of government big wigs. The military planners are overjoyed, and they begin to redesign their rockets so they can fit a man, who will then be able to calculate his trajectory and pilot the missile to its target by using pencil and paper. This is a huge win for all involved, because humans are much cheaper than computers, of course. :)
  • Sean McMullen [bdsonline.net] has written a captivating sf trilogy in which the world is run with the aid of "calculors" -- human powered computers. The slaves which power it are called components and given names such as ADDER14 and MULT3.

    I liked it.

    • The first book (Souls in the Great Machine) is almost entirely focused on the creation and operation of the calculor and should be very well received by programmer types. By the timeline of the other books, calculors are just another tool, and McMullen is off steampunking other inventions -- which is still entertaining, but has nothing to do with computers.
  • the gender imbalance in places like CERN [www.cern.ch]

    I've heard from older physicists that in those early years the scientist-computer match was quite popular.
    (It still is, but, well...)

  • The book is a bit expensive at $35 USD on Amazon. It's bad enough that you have to pay $50 USD or more for a good technical book. But $35 USD for a history book?! Sheesh... I'll wait for the paperback.
  • by jeffmeden ( 135043 ) on Tuesday July 05, 2005 @03:35PM (#12988018) Homepage Journal
    By World War II, in the United States, computing power was measured not in megahertz or teraflops, but in kilogirls.
    For what it's worth, I still measure a computer's ability in 'kilogirls', but it's not necessarily related to the processor power...
  • I find it unlikely, but is this the same David Alan Grier -- the comedian -- who was on "In Living Color" and such? I haven't been able to find anything definitive yet, but I'm assuming it's just a coincidence.

    That's got to be a pretty rare name, though...
  • I can't believe nobody wrote this yet... nobody reads articles and submissions anyways :=)

    By World War II, in the United States, computing power was measured not in megahertz or teraflops, but in kilogirls.

    WOW!! On a side note, repetitive jokes involving 'kilogirls' are going to haunt /. for years to come, just like beowulf jokes!!!!

  • It's the "CRC Standard Mathematical Tables", 23rd ed., (c)1975.

    This 23rd edition features upgraded interest rate information in the financial section, with compound interest and associated material from one quarter of a percent through twenty percent in intervals of one quarter of a percent.

    Brilliant!

    The neat thing with this one is that not only do you get the tables, you also get all the formulas and breakdowns of dozens of proofs!

    All in one handy volume!

  • CERN (Score:3, Interesting)

    by Adelbert ( 873575 ) on Tuesday July 05, 2005 @03:38PM (#12988044) Journal
    Possibly off topic, but a similar thing went on with the old bubble chambers [wikipedia.org] at CERN.

    People without much of a background in physics would trawl through the images, looking for patterns that they'd been told to look out for.

    I think it's important that someone is documenting the work of these heroes of maths and physics. Without them, advancements would have had to wait for the computer revolution. If we don't remember how important their contributions were, I'm sure it will only be a generation before they're forgotten.

  • Once, being an engineer had dignity, skills, and nifty curled-up bendy ties. Now it's been downgraded to menial tasks like Ceramics Engineer [Dishwasher], or even worse, getting your MCSE [Minesweeper Consultant And Solitaire Engineer].
  • Ballistics (Score:2, Informative)

    During World War I, naval ships - mainly battleships relying on long-range artillery, such as the Dreadnought - used human computation for artillery projectiles. The Dreadnought, having eight 15-inch guns capable of firing a 1,920-pound projectile 35,000 yards (or 16 miles) and steam turbines reaching a speed of twenty-one knots, got its extra edge from the precision of ballistic projectiles fired from a great distance.

    Having said that, I believe, some of the points which the article brought up downplaye
  • Since "thinking machines" were naturally forbidden, the hrethgir used human slaves to compute their equations. The upside was that humans sometimes made mistakes, and mistakes sometimes had beneficial consequences (but usually the slaves fared no better).

    Not as good as his dad's stuff, but OK. Get it in paperback. [amazon.com]
  • Monday:
    - How was work today?
    - 3.1

    Tuesday:
    - How was work today?
    - 3.141

    Wednesday:
    - How was work today?
    - 3.141592

    ...
  • by Mac Scientist ( 153390 ) on Tuesday July 05, 2005 @04:01PM (#12988235)
    Reminds me of the Asimov story "The Feeling of Power", written in 1958. People of the future, who are totally reliant on personal computers, experience the wonder of being able to do arithmetic by hand.

    Are we there yet?
    • On the subject of fiction, it reminded me of an Arthur C. Clarke short story called "Into the Comet". A spacecraft's dodgy computer gets replaced by a beowulf cluster of people with abacuses.
  • ... developed to speed up the deployment and computation of algorithms when you needed results quicker than a single Computer could handle, were lost due to the introduction of the microprocessor.

    also:

    there are sci-fi books and short stories about this sort of thing.

    one of them is "the end of eternity", by isaac asimov - Computer Harkan (Computer as in a title like Doctor) is the main character.

    also there's a short story - by greg bear, i believe - about a space expedition that got lost in deep space: t
  • by DrKayBee ( 769192 ) on Tuesday July 05, 2005 @04:24PM (#12988422) Homepage
    Is that when laptops were pretty secretaries?
  • by MattJ ( 14813 ) on Tuesday July 05, 2005 @04:56PM (#12988705) Homepage
    The Smithsonian has a great interview with Ida Rhodes, who assisted Blanch.
    Here [smithsonian.org].

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Tuesday July 05, 2005 @05:03PM (#12988765)
    For the last few years of school I went to a Waldorf school. We actually learned to use log tables (still got my table book here) and calculators were forbidden.
    We'd extract roots using them and all.
    The reasoning was that anyone can keypunch, but understanding what logs actually mean is a different thing and requires getting your hands dirty. It was at that time that I started programming on my first computer - a PC 1402 Sharp Pocket Computer. Amongst my friends I was the only one who actually understood what these symbols really meant.

    I'm grateful to our teachers for taking us that way. I'd actually do the same. Once you've really understood what logs are all about (and when you do your A levels with log tables, you have understood what they're about), tackling larger math problems is a piece of cake.

    Take this advice: If you have kids, don't let them near/use an electronic calculator too early. Give them log tables or a slide rule. It's the best way to learn higher math.
    • I fully agree that calculators are over-used in most high school math classes, but I think this is going a bit too far. There's nothing wrong with allowing trig students to use basic scientific (non-graphing) calculators. I can't imagine how it would make students more productive or give them a deeper understanding to make them slog through old log tables. Yes, a student should be able to approximate in fraction form the sin, cos, etc. without a calculator, and by all means should be able to do simple
  • by peter303 ( 12292 ) on Tuesday July 05, 2005 @05:52PM (#12989145)
    I remember the early years of computer science as being a secretarial/trade school kind of thing. I remember MIT and Stanford faculty debates as to whether they should even offer an undergraduate major in computer science because it was considered too "vocational". If you were a Stanford student in comp sci you got a "stealth degree" as a minor in the math department. At MIT they hid it in electrical engineering and STILL HAVEN'T granted it independent department status, even though at the height of the computer science boom one third of undergraduates majored in this option. Even now MIT refuses to teach a practical introductory computer science course. Their first course has been based on LISP since the late 1960s and still uses the version called SCHEME.

