
The Limits of Software

Thanks to Jason Bennett, who wrote this review of The Limits of Software. In it, Robert N. Britcher explores what software is, where software is going -- and what it really means.

The Limits of Software
author: Robert N. Britcher
pages: 214
publisher: Addison Wesley
rating: 7
reviewer: Jason Bennett
ISBN: 0-201-43323-0
summary: Where we've been, where we're going, and the implications therein

Background

Before I launch into my latest review, I'd just like to say thanks to Hemos and Slashdot on the occasion of my twentieth review posted here. It's been 25 months since the first one (August, '98), and I've really appreciated the opportunity they've given me. Nice excuse to do something I should do anyway! :-)

The Scenario

"But it is not the practitioners alone who are so moved. A thousand years in the making, the religion of technology has become the common enchantment, not only of the designers of technology but also those caught up in, and undone by, their godly designs. The expectation of ultimate salvation through technology, whatever the immediate human and social costs, has become the unspoken orthodoxy, reinforced by a market-induced enthusiasm for novelty and sanctioned by a millenarian yearning for new beginnings. This popular faith, subliminally indulged and intensified by corporate, government, and media pitchmen, inspires an awed deference to the practitioners and their promises of deliverance while diverting attention from more urgent concerns. Thus, unrestrained technological development is allowed to proceed apace, without serious scrutiny or oversight -- without reason. Pleas for some rationality, for reflection about pace and purpose, for sober assessment of costs and benefits -- for evidence even of economic value, much less larger social gains -- are dismissed as irrational. From within the faith, any and all criticism appears irrelevant, and irreverent." (TLOS, xxiii)

-- David F. Noble, The Religion of Technology, as quoted in The Limits of Software

I had the privilege of spending a few weeks with a good friend of mine in Eastern Europe back in July. Of course, to go anywhere on a budget in Europe requires a lot of train travel. Alas, there are no bullet trains in Slovenia, which gave me plenty of time to take in some reading when I wasn't chatting with my fellow passengers ...

The Limits of Software is a unique book in many ways, not the least of which is that it reads more like a collection of life stories than a lecturing textbook. Most computer books simply give you data, or even information, in a straightforward manner, hopefully punctuated by some interesting anecdotes. Britcher, instead, has packaged in words slices of time that illustrate various points about where computer programming has been, and where software development is going (note the terminology change). I certainly won't try to describe them all, but the theme that runs through the book is illustrated in the opening quotation: software is not our savior. There is no "one great system" that will be able to handle things. The FAA's botched air traffic control system is used as one illustration in the book, but the point is made about all software: we cannot and must not worship it.

There's one point that I find simultaneously funny and sad. It comes in the chapter on testing, and the inherent futility of that activity on complex programs. Britcher discusses the Y2K bug, and mentions the survivalist movement.

"Just as regular folks built bomb shelters in the 1950s and 1960s to add life time to a planet white with nuclear snow, regular folks are now storing large caches of food, water, toilet paper, clothing, and, of course, the American twinship: sacred literature and ammo. One man who agreed to be interviewed for the piece was quoted: 'When you first hear about it, most people are in total denial. They can't believe that Bill Gates won't come up with a magic bullet.' (That the general population believes that Bill Gates has the answers to our programming problems is more frightening than the rollover of the millennium.)" (TLOS, 59)
I quote this not as a shot at Bill (although, this being Slashdot, I'm sure some will take it that way), but to point out the inherent risks in the statement, which illustrate Britcher's point. Software is dangerous, because it does so much yet is so fragile. We (even we programmers at times) view it as a holy grail. We cannot understand how our mechanical saviors could possibly fail us. Yet, software failures are rampant, in every facet of our society (see the Risks Digest if you need examples). Software cannot solve our problems. Our problems are inherent within ourselves. As we continue to rely more and more on machines to live for us, we must remember that they, like their creators, are fallible.

What's Bad? / What's Good?

When I finished TLOS, my first reaction was to think of the old saw about the life of a fighter pilot: "hours and hours of sheer boredom, punctuated by moments of sheer terror." Britcher's stories seemed to drone on at points. The FAA story was left to the end. Why did he have to go on and on about all this random stuff?

In retrospect, though, I think I have a better grasp of what Britcher was trying to convey. This is not a disaster movie told in the guise of software engineering; this is a story about one man's journey through software, and the conclusions he's come to. Read this as a technological autobiography, and I think you'll appreciate the points being made. As I said earlier, it's different, but rewarding in the end.

So What's In It For Me?

A reminder that the Tower of Babel still lives in the hearts and minds of men.


You can purchase this book at Fatbrain.

Table of Contents

  • Foreword by Robert L. Glass
  • Prologue
  • Part I
    1. Early Systems
    2. Theories of Programming
    3. The Human Element
    4. Designing
    5. Code: The Stuff of Programs
    6. Testing Computer Systems
    7. The Impossible Profession
    8. Life on the Project
  • Part II
    1. Supervision Through Language
    2. How Technology Changes Methods
    3. Size and Intellectual Gravity
    4. The Marketing of Science
    5. Errors
    6. The One Great System
    7. The Government of Programming
    8. The System to End All
    9. The End-All of Programming
  • Afterword
  • Reading List
Comments
  • by Lonesmurf ( 88531 ) on Thursday September 14, 2000 @05:15AM (#780196) Homepage
    "hours and hours of sheer boredom, punctuated by moments of sheer terror."

    funny, that's pretty much my life. :)

    how about: "hours and hours of coding, punctuated by moments of delirium."

    or: "hours and hours of sheer boredom, puctuated by lunch."

    Rami
    --
  • Fatbrain is being bought by B&N. No more cheap books for us :-(

    That aside, everyone should check out Bookpool [bookpool.com]. Their prices are even cheaper than Fatbrain's (most of the time) and I've never had a problem with them.

    Just trying to help the little guys,
    psxndc

  • by FascDot Killed My Pr ( 24021 ) on Thursday September 14, 2000 @05:21AM (#780198)
    I bet there was some ancient Sumerian who gave long lectures on "The Limits of Literacy" warning that we shouldn't worship reading and writing--those skills can't make our lives better. Imagine his face if we were to show him a modern, industrialized nation.

    It's a poor workman who blames his tools. If there is a way for air traffic to be controlled by a system, and your air-traffic control system doesn't do it, the reason is not inherent in the "limits of software"--there's some problem with your design/implementation. Software is a universal machine simulator--pure algorithms implemented as 1's and 0's. If that idea isn't worthy of awe I don't know what is.
    --
    Linux MAPI Server!
    http://www.openone.com/software/MailOne/
  • Anyone having used voice dictation software could tell you that.
  • What a pity the first batch of posts is so useless.

    On a related note, the trains in Europe are great places to read. The book seems to be a case of "common" sense. Of course software isn't a magic bullet. Expecting it to be is rather like expecting the royal road to geometry.

  • I think computers and software should be developed and used to assist us and make our lives easier; however, at no point should we become so reliant on software that we would not be able to function without it, or would have great difficulty doing so. I think we are at that point now. It's a very serious matter. Most of us consider redundant backups for electronic equipment in case it goes down, but what backs up the electronics? Sure, there are those who are very careful, but we as a society are in bad shape.

  • my local small town book store has a web site where I can enter the ISBN for any (in print, or at least available) book and they will put it in their next order. I pick it up there, but since it is in town it isn't a big deal (and I don't worry about it arriving in the rain). I pay sales tax, but avoid shipping. I support a small town book store that gives me a place to look over books for those that are worth reading.

    Now if I just had time to read...

  • Unfortunately it's a valid point - no piece of software ships with all of its bugs fixed, or even discovered (note that, when you work for a software company, these two things are not synonymous). A decent testing department given a reasonable amount of time can spot many of the more obvious bugs, but for every obvious bug there will be a dozen sneaky ones which ship.

    You can guarantee that it'll be the end users that'll find these bugs, and that they'll complain to whoever sold/provided them with the software. Remember, end users can be relied upon to use the software in the most convoluted, even stupid, ways, and when they're doing something in a manner the programmer didn't anticipate chances are they'll find bugs.

    This is why I think that for any large software project an open database allowing customers to report bugs is essential. As long as the programmers work on clearing these lists then problems get ironed out over time, and making sure patches are freely available reduces the number of flaws in a program and makes everyone happy.

    Another problem that introduces a lot of errors today is the increasing use of third-party components and applications within software projects. These introduce another point of failure, and one in which the programmers have less control over what goes on, and less knowledge about possible solutions. Even if you have the source for the component, it's still much more likely that using it will introduce problems, either in the interface between your app and the component, or in the component itself.

    The solution? I don't really think there is one that can guarantee perfect reliability. The best you can do is ensure that the number of problems is kept to a minimum, and that problems that are found are quickly dealt with.

  • "The expectation of ultimate salvation through technology, whatever the immediate human and social costs, has become the unspoken orthodoxy, reinforced by a market-induced enthusiasm for novelty and sanctioned by a millenarian yearning for new beginnings. This popular faith, subliminally indulged and intensified by corporate, government, and media pitchmen, inspires an awed deference to the practitioners and their promises of deliverance while diverting attention from more urgent concerns. Thus, unrestrained technological development is allowed to proceed apace, without serious scrutiny or oversight -- without reason."

    I know people who go to the opposite extreme -- they see the problems with technological developments (auto pollution, atomic weapons, and the like) and take the attitude that technology is evil, or that it often is used that way.

    Both the worship of technology and the hatred of technology seem to me to miss the point. Technology is simply a tool. It can be used to create good things, and bad. It can result in good side-effects, and bad.

    Speaking specifically of software development, there are times when people outside the field look with awe on the process of creating code. That ignorance of the process may contribute to those extreme ways of thinking about technology. Once you know how something works, it gets de-mystified in your mind and becomes just another tool or skill. Programming, I tell people, is like playing the piano -- it's a skill that can be learned, not magic.

    Similarly, technology is a tool, not magic, not a panacea, not the solution to human problems by itself.
    ________________

  • only by the ability of the human mind to manage complexity. I'm tempted to fit Gödel's incompleteness of formal systems in there somewhere, but that seems to imply an UNlimited condition rather than a limited one. If 'software' is something in between mathematics and hardware, we see that the ability to count toward infinity is limited only by the physical realization of the computer (register size, etc).

    As for fragile software, just try to 'diskcopy' a Linux boot disk in NT4-BugFix5, hehehe.
  • ...Are defective users. ;>
  • by guinsu ( 198732 ) on Thursday September 14, 2000 @05:33AM (#780207)
    I find it ironic that the opinion that technology can't solve our problems comes right after a story about Dean Kamen, who thinks technology can solve our problems.

    My personal feeling is that it can solve a lot of our problems permanently (hunger, disease, etc.), but it won't happen spontaneously; it requires people and their priorities to change too. If a large enough portion of this planet decided tomorrow that clean-burning engines, safe drinking water and better agriculture were the areas they wanted to focus research on instead of the latest electronic gadget, then eventually we would solve those problems. In a certain way, technology is the answer, but only if society puts effort/money into moving technology in a certain direction.
  • Wrong. 90% of software problems are caused by using Windows.
  • Read carefully. He is talking about software, not technology in general.
  • Even so, the difficulty of finding bugs doesn't excuse much of the schlocky v1.0 stuff that gets sold.
    There exist software engineering models that, if used, can improve the quality of end products. For example, look here [eiffel.com].
    Also, the FAA has a particularly onerous protocol for use in developing real time life critical software.
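
    To make the design-by-contract idea behind Eiffel concrete, here is a minimal sketch transliterated into C assert()s. The withdraw() example is hypothetical, not taken from the page linked above; Eiffel expresses the same require/ensure clauses natively.

        /* Design-by-contract sketch: preconditions and postconditions
           as runtime assertions. Hypothetical example. */
        #include <assert.h>
        #include <stdio.h>

        typedef struct { long balance; } Account;

        static void withdraw(Account *acct, long amount) {
            assert(amount > 0);               /* require: amount positive */
            assert(acct->balance >= amount);  /* require: funds cover it */
            long old_balance = acct->balance;

            acct->balance -= amount;

            assert(acct->balance == old_balance - amount);  /* ensure */
        }

        int main(void) {
            Account a = { 100 };
            withdraw(&a, 40);
            printf("balance: %ld\n", a.balance);  /* prints: balance: 60 */
            return 0;
        }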
  • Exactly. But just like playing a piano, there are master pianists and there are casual players. And unlike the piano, we desperately need the talented programmers to work on genuinely useful software.
  • I am definitely going to read this book. It seems to be hitting on one of the biggest problems facing humanity today, which is the elevation to a religion of the scientific method of observation. I agree with some of you that technology itself is only a tool, and that attacking technology is pointless. However, the faith that human beings put in their technology, and in the skills and logics that produce it, is profoundly disturbing. I agree with the reviewer - it's always good to have a reminder that the Tower of Babel will always be somewhere lurking in our collective consciousness.
  • If more companies used Formal Methods such as 'Z' when designing software, we would not have to endure the current deluge of patches and 'service packs'.

    The downside is that I know it costs a lot more money to develop using Formal Methods, but isn't it worth it to have better quality software?
  • I entirely agree. That's a comparison I like to make. When people say to me things like "we'll always need graphical user interfaces and user-friendly software like Windows: people will never learn to master the complexities of the Command Line", I reply that in the middle ages, the few literate people might have thought "we'll always need to have little icons in front of the shops so people can tell the inn from the blacksmith; these ignorant peasants will never learn to master the complexities of writing".

    Guess what? In the year 2000, over half of the population of the world knows how to read and write. Let's make it our objective for 2100 that over half of the population of the world have a computer and know how to program.

  • I am a programmer and have had problems reported in my apps that were "User Problems" -- that is, the user wasn't using the app as designed. The real problem isn't the user, though; it is the design of the app. If the app were designed so the user could understand it, there wouldn't be a user problem. Now I realize that not every app can be usable by every user, but if 90% of problems are the users', then it sounds like the app wasn't designed for 90% of the intended users.
  • Sadly, the "small town" technical bookstore _I_ love going to is Softpro [softpro.com]. Their Burlington, MA store is about 20 minutes away from me and oh what a glorious sight. Wall to wall tech books (they even sell OpenBSD stuff :-o). The problem is that they only sell stuff 5% off list price. I'm willing to pay a little more to help the little guys out, but when I get 37% off at Bookpool, minus shipping comes to 25% off, its still really hard for me to justify buying at Softpro. But man, do I love their store.

    -psxndc

  • I think you missed my point. ;>

    Most of the time the user doesn't spend any amount of time "playing" with the application and getting to know it, or even apply a modicum of common sense. The cries to the help desk asking how to "bold" something pretty much prove that.

  • Not forgetting the classic help desk query

    "where's the any key"
  • I have long thought that software was just a fad and will fade soon. The old days of designing complex systems with just a few vacuum tubes will eventually prevail, and you'll be throwing away those worthless laptops in favor of an ENIAC simply because it looks cooler!
  • I beg to differ; it will more likely be much hotter with all those lightbulbs.
  • Software is fragile -- but that is a problem only if you don't recognize this fact and are unprepared to deal with it.

    Yes, the world depends more and more on software. Yes, there is no such thing as a bug-free piece of code. But failures need not be catastrophic. It is perfectly possible to build robust systems, ones which assume that something will crash and are able to deal with it.

    It's all really risk management. Software can fail -- but so can mechanical parts. So far, at least, the biggest problems (like Chernobyl) were caused by human error, not a software glitch.

    Robust systems can be built out of fragile components.
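
    One classic embodiment of that principle is a supervisor that restarts a crashing worker. A minimal sketch, assuming a POSIX environment (the "fragile component" here is a made-up stand-in):

        /* Supervisor pattern sketch: fork a fragile worker and
           restart it when it dies, up to a retry limit. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>
        #include <sys/wait.h>

        static void do_work(void) {
            /* Hypothetical fragile component: crashes at random. */
            if (rand() % 3 == 0)
                abort();
            puts("work completed");
            exit(EXIT_SUCCESS);
        }

        int main(void) {
            for (int attempt = 1; attempt <= 5; attempt++) {
                pid_t pid = fork();
                if (pid == 0) {            /* child runs the fragile part */
                    srand((unsigned)getpid());
                    do_work();
                }
                int status;
                waitpid(pid, &status, 0);
                if (WIFEXITED(status) && WEXITSTATUS(status) == 0)
                    return 0;              /* worker succeeded */
                fprintf(stderr, "worker died (attempt %d), restarting\n",
                        attempt);
            }
            fprintf(stderr, "giving up\n");
            return 1;
        }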

    Kaa
  • Can I use your icons-in-front-of-shops example in the future?
    --
    Linux MAPI Server!
    http://www.openone.com/software/MailOne/
  • test.c:1: `#include' expects "FILENAME" or <FILENAME>
    :-P
  • All successful software projects have one guy who understands how everything fits together.
    When projects get so complex as to go beyond the ability of any one person to understand them, you get a big mess of bug-ridden crap that is years late.

    Windows 2000, for instance.

    -Pete

  • The fact that software is, as you say, a universal machine simulator (note also simulator, not emulator) puts some pretty strict limits on software: only computable problems can be solved (Gödel's Incompleteness Theorem). Other problems will remain intractable because of the time complexity of the best algorithm which can solve them (factoring a number, say) until the advent of practical quantum computing (I would imagine this is still some time away).

    I think more importantly, though, that the complexity of large-scale systems is such that proving their reliability and correctness is an impossible task. You can fairly easily prove a bubblesort algorithm logically, but doing the same for a program of a million lines is nigh on impossible -- unless you get a computer to do it quickly, and then you have the problem of proving that computer's, and all its software's, correctness.

    I think the point about worship and the comparison with religion is often supported by posts on /. I imagine the Sumerian's face would be pretty much a mixture of apoplexy and disgust.
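
    For a sense of what "proving a bubblesort" looks like at the executable end, here is a minimal sketch with the loop invariant written as runtime assertions. The assertions only check the invariant on the inputs you actually run -- exactly the gap between testing and proof that stops this from scaling to a million lines:

        /* Bubble sort with its loop invariant as assertions: after
           pass i, the last i+1 elements are sorted and dominate the
           unsorted prefix. */
        #include <assert.h>
        #include <stdio.h>

        static void bubble_sort(int *a, int n) {
            for (int i = 0; i < n - 1; i++) {
                for (int j = 0; j < n - 1 - i; j++)
                    if (a[j] > a[j + 1]) {
                        int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                    }
                /* Invariant check 1: the suffix is sorted. */
                for (int k = n - 1 - i; k < n - 1; k++)
                    assert(a[k] <= a[k + 1]);
                /* Invariant check 2: prefix bounded by suffix head. */
                for (int k = 0; k < n - 1 - i; k++)
                    assert(a[k] <= a[n - 1 - i]);
            }
        }

        int main(void) {
            int a[] = {5, 1, 4, 2, 8};
            bubble_sort(a, 5);
            for (int i = 0; i < 5; i++)
                printf("%d ", a[i]);   /* prints: 1 2 4 5 8 */
            putchar('\n');
            return 0;
        }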
  • Hell no.

    Programmers are artists, not engineers. Engineers have to wear shoes and act like adults. Programmers don't even need to wear socks and are perfectly within their rights to have swordfights in the hall whenever the mood strikes.

    Formal methods are for book-following engineers, not free-spirited artists.

    or something...

  • by Philbert Desenex ( 219355 ) on Thursday September 14, 2000 @06:16AM (#780227) Homepage
    "Formal methods" sounds really great at first, as does "better quality". But each phrase contains so much hidden meaning that using them identifies you as someone who hasn't really thought about the issues at all. "Formal methods" falls down because using them assumes that we can specify accurately and precisely what the system should do. And you have to do that specification up front. Such exacting up front specification poses more of a problem than iterative development by service packs and patches. "Better quality" has no specific meaning. The American Society for Quality says: "Quality is a jubjective term for which each person has his or her own definition." This allows each of us, programmers, managers, salesmen or customers to "know" what quality is, and use the phrase informally. Then, when the rubber has to meet the road, you can't actually write code that represents "quality" because it doesn't exist except as a figment of some executive director's imagination.
  • Misquoting the previous poster: I think wires and electricity should be developed and used to assist us and make our lives easier, however at no point should we become so reliant on electricity that we would not be able to function without it...

    The fact is that we are already highly reliant on technology in general. Over here in the UK the last few days have just taught us an object lesson in that, when a few people decided to blockade fuel supplies.

    Anyone remember "Connections" by James Burke? The first programme opened with a description of a nationwide power cut in the US in the 60s, and the way in which people just assumed that power and civilisation would be restored RSN. Then he went on to discuss survival options if it had not been restored. 1: get out of the city. 2: find a farm, take and hold it by force. 3: learn to plough. Face it: if the electricity grid goes down and stays down then 90+% of the population of an industrial nation is toast.

    The same thing is now happening to software: it's becoming ubiquitous to the extent that things don't work without it. Fortunately we have the opportunity to ensure that systemic failures that bring down civilisation are not possible. The question is, will we have the wisdom to take advantage of this?

    Paul.

  • Ummm.... I thought those were vacuum tubes.
  • Yep, completely spaced the ;> Sorry.
  • yeah, but they do look like big light bulbs and they glow and get bloody hot.
  • At my college, my Engineering Graphics class is right after a Java class. The workstations (each one a Pentium 90, 64MB RAM, NT4SP6, ughhh...) are always bogged down with the Java libraries; I always have to run taskmgr.exe and clear out the stupid compiler to free up 14MB of RAM so I can use AutoCAD to its fullest.

    Right now I'm in the computer lab, on a Dell OptiPlex GX110 with 256MB of RAM (funny, Windows 2000 regcode stickers on each computer's right side, and they have NT4 installed. One step forward, two steps back). And, for some lame reason, the Real Player "Start Center" is in the taskbar. I closed that damn icon on about 3 stations; it's just 3MB of memory being taken up. I kinda hate this; I'm in a college where people are learning the very programming language I've come to hate: Java. [ravenna.com] I sincerely think that Java is a lazy excuse for the programmers to clip off minutes from their coding schedule just so they don't have to wait through compiling code. I say, program in C++, the compile process is my most favorite! It is the defining moment which determines whether you have made progress or not.

    As for the rest of my opinion on software engineers, see my quote, and think of this quote from TRON: "Oh that'll be great. Programs will start to think and the people will stop!"

  • First, thanks for the review.

    That said, humans are fallible. What can we expect but that the artifacts they create will also be fallible? There are assuredly limits on software - limits on computability - but the limits I see here are more limits on human ability to describe solutions to problems that become larger and larger as the simpler ones are mastered.

    As for technology as religion, it's not religion to me, it's a source of comfortable living.

    Now, if only I could stop my VCR from blinking "12:00....12:00...."
  • Actually, they'll prolly tell you "C can warship knowledge..."
  • Vacuum tubes .. light bulbs ... who cares?
  • by Mindwarp ( 15738 ) on Thursday September 14, 2000 @06:31AM (#780237) Homepage Journal
    Reading this review brought to mind an anecdote shared by an old professor of mine, in a class discussing the need for completeness testing and mathematical proof of algorithms.

    The incident related was to do with the testing of a new auto-pilot/terrain-following system being installed on the Tornado fighter-bomber (the British air-to-ground attack aircraft). All was going well with the test flight until the aircraft dipped into a valley. Shortly after entering the valley the aircraft inverted and performed the classic 'lawn dart' maneuver.

    There was, of course, an ensuing investigation and analysis of the auto-pilot software, during which it was found that this exact behaviour would occur if the altimeter returned a negative value.

    How could this happen? The valley was below sea level.
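
    The actual Tornado code isn't public, so here is only a hedged sketch of one way an unguarded negative altimeter reading can turn a gentle correction into a violent one: a signed value reinterpreted as unsigned reappears as an enormous altitude, and the controller commands a dive.

        /* Hypothetical terrain-following sketch, NOT the real system:
           pitch command proportional to altitude error. */
        #include <stdio.h>

        static double pitch_command(double target_agl, double radar_alt) {
            double gain = 0.1;
            return gain * (target_agl - radar_alt);  /* positive = climb */
        }

        int main(void) {
            /* Normal flight, 100 m above terrain, holding 150 m: */
            printf("%+.1f\n", pitch_command(150.0, 100.0));  /* gentle climb */

            /* Below sea level the altimeter returns -50. If that raw
               reading were treated as unsigned, it would wrap around
               into a huge positive height -- and command a dive: */
            unsigned int raw = (unsigned int)-50;
            printf("%+.1f\n", pitch_command(150.0, (double)raw));
            return 0;
        }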

    --
  • Let's make it our objective for 2100 that over half of the population of the world have a computer and know how to program.

    That's like that middle ages guy saying, "Let's make it our goal that by 2000 half the world knows how to read and knows how to operate printing presses, binding machines and book distribution."

    I agree that it's important that people become computer "literate", but it's NOT important that everyone learn to program. I make this point because it comes up far too much on Slashdot, particularly in the context of education ("Let's teach every student how to program!"). It's an important point that is lost on many Slashdotters... the vast majority of the population are not interested in engineering, and never will be. And that's OK. Everyone doesn't have to know or care how the hammer is forged to drive a nail.


    --

  • ...and most of them can be found on slashdot.
  • Fortunately we have the opportunity to ensure that systemic failures that bring down civilisation are not possible. The question is, will we have the wisdom to take advantage of this?

    Of course not. People are notorious for refusing to fix what isn't broken yet. It's almost a given that nothing is done about anything, until the shit hits the fan.

    Pessimistic? Sure. Blame the last 1/4 of Metamagical Themas.

    -c

  • You make a good point. A certain level of reliance is inevitable. We just have to be careful.

  • People who believe this are part of the reason most modern software is as bad as it is.

    Though my post was 100% sarcasm, I agree, 100%.

    -c

  • by wnissen ( 59924 ) on Thursday September 14, 2000 @06:44AM (#780243)
    It's a poor workman who blames his tools. If there is a way for air traffic to be controlled by a system, and your air-traffic control system doesn't do it, the reason is not inherent in the "limits of software"--there's some problem with your design/implementation.

    I believe this misses the point of the book, at least judging from the quote by Prof. Noble. The point is, it's a poor workman who assumes his tools are all-powerful and not subject to normal criteria like cost/benefit analyses. The engineers who built cathedrals (no ESR metaphors intended here) used secret design principles painstakingly determined by trial and error through the centuries. If the building stayed up for hundreds of years, it was a Good Thing.

    Software is a young discipline, largely without solid rules, and yet is widely acknowledged to be changing the world. Software will drastically change the next generation of children, whose parents will probably all have to use a computer at some point. There are what, a hundred million computers in the US? And yet the primary operating system on them is actively bad. Even Linux is often clunky, and despite better stability isn't even as easy to use as that mass of inconsistency, Windows. Why is this? Because the culture of software believes "good enough" really is good enough. Most people who become programmers are lucky if they get a few semesters of training learning different languages. They get no training at all in controlling the number of defects in their code. This results in lots of software of average quality, that is to say, shit.

    A lot of this is driven by the fact that many people believe it is okay to admire software as math. It's not. Software is meant to accomplish something, and to do it better or faster or cheaper than however it was done before. It has absolutely nothing to do with pretty math, or purity, or any of those things you say are worthy of awe. If I designed a bridge with a novel way of laying out the supports that was really neat, but the bridge fell over because the supports interfered with the arches, no one would compliment me on my support placement algorithm. They'd think I was an idiot who couldn't design a bridge to save his life. And yet I see all the time, "Wow, this code is really cool, it doesn't all work yet but isn't it neat?" because as long as it's software no one cares.

    Walt
  • While moving toward the paperless office (storing all of our records on magnetic digital media) we are potentially setting ourselves up for catastrophe. We do not know what all exists in space, but let's say something happens, and it sends a _REALLY STRONG_ magnetic wave to earth. BAM! All of our data goes bye-bye. We might not even know it until we've lost all of our data. And then, if the magnetic wave is permanent, we may never get our precious computers back online. I know this sounds paranoid, but the smaller we get, the more fragile we get as well. I see this as a potentially dangerous trend.
  • I like this better (and think that it is much closer to the original statement):

    Hours and hours of coding, punctuated by screaming from my wife for attention. (That should be enough to produce terror in any man.)

    OK, now in reality, my wife never has to (nor has she) scream for attention. In fact, most of the time she has to say, "Leave me alone for a few minutes you dumbass!"

    //disclaimer (Sorry honey, it's all in good fun.) //end disclaimer.
  • one of the biggest problems facing humanity today ... is the elevation to a religion of the scientific method of observation.

    Well, it is, you know.

    There's absolutely nothing about the scientific method itself that a priori demands that you use it to find out things about the world. When you employ the scientific method to find out, say, how fast your car will go, you're taking a leap of faith that that's the right way to go about finding out such things.

    As it turns out, the "religion of science" does pretty darn well. While I might go into a mystical trance or search through Biblical revelation or read Tarot cards or try to probe the minds of Car and Driver's editors through ESP to find out what my car will do in the quarter mile, chances are I'm going to be a lot happier with the answers I get if I do what I can to limit confounding variables and run some quarter miles.

    But how happy I am with the results I get is wholly outside the scientific method. I might value reliability, consistency with prevailing theories that turn out to be useful in other areas, and so on -- and those things would lead me to choose the scientific method over others -- or I might be in a frame of mind where I value more the communion with the divine I achieve when I examine pig entrails.

    So, choosing the scientific method is not only a leap of faith that it's the best way to find out about things, it's a reflection of the values you bring to the inquiry.

    The scientific method is terrifically limited. It can tell me time in the quarter mile and gas mileage, but it can only give me vague hints about whether black is a better color than red or whether I ought to buy a car now or later. And as to whether, with millions of people starving, I ought to buy a car at all, science is completely at a loss.

    But, like Churchill's famous quote about democracy, when you're interested in practical, mundane things like whether you have time to cross the street before the car that's coming hits you, and you think not getting hit is far more important than the feeling you get when you search the bumps on your head, then the scientific method is the worst possible way to find out, except for all the others.

    Which is to say, you can't have your cake and eat it too. If you're totally against science on moral or even epistemological grounds, simple intellectual honesty demands that you close your eyes while you drive. Just let me know when you're planning to do it, though. My poor worldview, deprived as it is of transcendent riches, cautions me to keep well clear when you're proving your faith.

  • Yes, the world depends more and more on software. Yes, there is no such thing as a bug-free piece of code.
    There is, however, software that approaches bug-free as a limit, as opposed to merely replacing one set of bugs with another. I offer TeX as an example.
  • I agree with the sentiment, but not the words. I hope computing will become ubiquitous to the extent that you will not be able to point at something and say 'that is a computer', because everything is.

    Anyway, in the year 2000, there are still far too many people killed in petty conflicts and too many people with inadequate food+water (usually the result of the conflicts, not nature).

    Miss Worlds should still ask for world peace, not world computer literacy.
  • Gödel's Theorem (and Turing's halting problem) are not serious limits on the powers of computers. I know I can't make a machine that determines if a computer program halts, but I can't say I've ever wanted to.

    Also, I think that proving systems correct is not something we (mostly) want to waste our time with. We don't prove formally that 747s won't fall apart, and yet they don't. While I do agree that good software engineering practices are important, I don't think that we know everything there is to know about software engineering yet.

    I don't think that people really worship computers, but rather some idea of what "computers" will be. Like nanotech, which seems to have the potential to eliminate hunger and revolutionize everything. Of course, it also has the power to turn the world into a mass of "gray goo" - it's technology: dangerous and useful. We should fight to see it used well.


    -Dave Turner.
  • I think wires and electricity should be developed and used to assist us and make our lives easier, however at no point should we become so reliant on electricity that we would not be able to function without it...
    Sorry, too late. Our present level of population cannot be maintained without our present level of technology. More generally, no level of population could be maintained without the corresponding level of technology: you can't support Renaissance Europe on Babylonian-era agriculture.

    The corollary of this fact is that any person who advocates "rolling technology back" or "going back to the land" is advocating genocide -- and should be regarded on that level of (im)morality.

  • You should probably try to program in Java first before you make statements like that! Java is compiled, just like C++, and always has been. Anyway, the web page you pointed to only mentions Java _on web pages_. By far the largest use of Java these days is server-side, and nothing to do with graphics at all. I'd agree that gaudy web pages are pants, but that has more to do with the designers than the language they write in!
  • You can guarantee that it'll be the end users that'll find these bugs, and that they'll complain to whoever sold/provided them with the software. Remember, end users can be relied upon to use the software in the most convoluted, even stupid, ways, and when they're doing something in a manner the programmer didn't anticipate chances are they'll find bugs.

    What? Since when do people complain when they buy shoddy software? Most of the people I know just sigh and reboot.

    Another problem that introduces a lot of errors today is the increasing use of third-party components and applications within software projects. These introduce another point of failure, and one in which the programmers have less control over what goes on, and less knowledge about possible solutions. Even if you have the source for the component, it's still much more likely that using it will introduce problems, either in the interface between your app and the component, or in the component itself.

    Code reuse decreases the number of bugs and the cost of developing/maintaining software. Why? Somebody has already debugged the component - and with every application that uses the component, there is a greater chance of finding and fixing an obscure bug. Compare that to writing entirely new code, and I'm sure you'll find that there are more bugs in the new code.

  • Why single out software? What is being discussed is a problem with virtually *any* complex system, and I personally think it has more to do with the (human) processes of discovery, invention, innovation, and implementation. These are fragile processes, because they require, by nature, the identification and eradication of the various aspects that cause failure. Add to this any additional problems associated with organizational dynamics (management, motivation, morale, compensation, interpersonal dynamics, etc.), and it's easy to see how something complex can fail.

    I don't think it's possible to work out all of the bugs in a piece of software, simply because there are too many unseen variables. But I also don't think it's fair to cast software development in a different light merely because it is subject to the same "rules" that govern any complex system. As a means of comparison, consider how long it takes to implement a new commercial aircraft design. Consider how many defense-related projects are, or have been, over budget, poorly designed, or scrapped altogether. It was, after all, a hardware "bug" that caused the explosion of the Challenger shuttle. Hell, I'd argue that it's even a hardware "bug" responsible for the Firestone debacle.

    Is it fair to hold software development to a different standard? No matter which complex system you're dealing with, it always involves getting from point A to point B, and it's not always going to be an easy ride.

  • I think you mistake me. I am not "totally against science" in any sense. I am against the construal of science as a panacea, I am against bowing down at the altar of science and technology as though they can explain all and ultimately lead to my own divinization. Take a look at game theory and psychology - these scientific analyses would like to use science to get to the bottom of which color is better and whether to buy a car now or later. Science has even tried to explain why prayer for the sick helps them. The problem I see here is that too many people think that everything is reducible to electrons, quarks, photons, and their interactions. Nothing is more depressing to me than the philosophy that all human choices are dictated by random quantum events. Can I disprove that explanation? Maybe not, in the sense that "prove" is taken by those raised in our system of modern rationalism, but can I propose alternate, self-consistent explanations that lack only an interface with conventional "scientific understanding?" Yes, I can, and that is what modern man must realize. We must take care not to lose our humanity by becoming slaves to the god of reason.
  • The corollary of this fact is that any person who advocates "rolling technology back" or "going back to the land" is advocating genocide -- and should be regarded on that level of (im)morality.

    Now, wait a minute--that would imply that both the Unabomber and the Cultural Revolution were misguided. You might want to rethink that.
    --

  • I agree. Sitting in our $2000 polycarbonate ergonomic chairs with our lava lamps, and geek chic wardrobe, isolated from "RL", it is all too easy to get wrapped up in the novelty of invention. Now I'm not a luddite; I think this stuff is great. But, like Bill Joy, I have this creeping shadow on my conscience... that perhaps this mad rush forward needs to be analyzed a bit more soberly. What we need to realize is that technology is a tool. It should serve a purpose. A lot of people talk about "progress". What are we "progressing" to, I wish people would ask themselves. We should stop once in a while and ask ourselves what *is* this "manifest destiny" for which we are rushing into the golden west of technology. Civilizations existed quite happily for millennia without the "progress" we think new technologies will bring us. That is being evinced in things like the Long Now Foundation.
  • I figured someone would be stupid enough to state the obvious pun in that sentence. Most of us merely noted it and moved on.
    ________________
  • If you choose to use them, there do exist objective quality measurement methods for software. Defects per line found in system test springs to mind first. In a sufficiently large system, defects found in system test can predict the actual number of defects in each module.
  • I would say that software is far less limited than hardware. Hardware is limited by technological, physiological and cost constraints. Software is affected by these on a much lesser scale.

    With one piece of hardware (the x86 instruction set, for instance) you can create millions upon millions of software applications. One piece of software, however, can usually only be made to run on a small number of pieces of hardware.

  • It's easy to say that we should avoid any risk of failure. But what does that really mean? The real question we should ask ourselves comes down to risk versus reward. Does the potential benefit outweigh the potential risk? Even where the costs of a total failure are high, that doesn't mean we must avoid it.

    Take our telephone infrastructure, for instance. It's provided society with tremendous and undeniable benefits. Yet a total failure is not an impossibility, and certainly wasn't in its early days. Would we really have been better off avoiding it because we could never absolutely insure against catastrophe? I'd argue that the costs of avoiding it, of being overly risk-averse, would far outweigh the benefits that each and every generation has had. Worst case scenario: the telephone networks in the US fail for a couple of days. That'd cause some chaos, certainly, and it might cost the US billions. But I'd argue that having not used it, or having forced excessively costly redundancy [back before we had digital technology and the like], would have cost the US most of the benefits of the 20th century (i.e., trillions of dollars).

    This is not to say that where affordable precautions become available they should be shunned; that factors into the risk/reward equation. Now certainly there is plenty of room for debate about what is an acceptable level of "risk", but the point is that the existence of an apparently extremely costly worst-case scenario doesn't necessarily mean an avenue of progress should be avoided. Nor do I mean to say that everyone has taken a rational approach to risk/reward; certainly there are some instances where it's been totally out of whack. In the vast majority of cases, though, it's been my experience that the people who run these systems are nothing if not risk-averse. Because they fear failure more than they probably should, the systems generally are acceptably safe.
  • > There are assuredly limits on software - limits on computability - but the limits I see here are more limits on human ability to describe solutions to problems that become larger and larger as the simpler ones are mastered.

    I don't really think complexity is the ceiling we're up against. Software benefits much more than most disciplines in its ability to break a complex problem down into simpler pieces.

    I think the main problem we're having is that software is, by and large, a marketing discipline rather than an engineering discipline. Obviously so for shrinkwrapware, but to a big extent also for DenverAirportware.

    I hold out hope that as our society becomes more and more aware of its dependence on software, it will eventually decide that software that is stable and behaves as expected is every bit as important as bridges that are stable and petrochemical plants that behave as expected. At that point (my dream goes) we'll see marketing, hype, and unrealistic expectations concerning cost and delivery fade into the background, and good design principles move to the foreground.

    Once we learned to build good bridges, we quickly learned to build bridges that are aesthetically pleasing as well as good. Maybe when the world learns to build good software, it will soon learn to build featureful software that is still good, too.

    Not to say that there aren't limits on computability, and probably also on manageable complexity; it's just that, with few exceptions such as the on-board shuttle group, we're running up against social barriers before we get to the computational ones.

    --
  • >It's a poor workman who blames his tools.

    And it's a fool who fails to realize the limitations of his tools, his own foibles, and their repercussions.
  • That sounds right to me: both the Unabomber and the Cultural Revolution were misguided.

    Or take the Khmer Rouge...


  • But you have to pay in time and money.

    Read They Write the Right Stuff [fastcompany.com]

    A very readable article from a few years ago describing, at a high level, some of the efforts needed to become SEI 5. Use this as a benchmark to measure your software endeavours - and when looking at books describing the fragility of software.

    I didn't mean to imply that Java doesn't compile at all. What I really wanted to emphasize is how Java always compiles on the USER side. Ever viewed the source of a webpage that uses Java? It's flat-out source code you're looking at, not binaries. Sure, it's nice since it's platform independent, but because of the user-side compile, runtime performance suffers (I'm refraining from saying "client-side compile" because, like you said, some servers do use Java daemons and whatnot).

    Want to see some proof of this slower performance claim? Go to a Windows machine with Quake3, and go to The ShugaShack. [shugashack.com] Download the 1.17 DLL files for Quake3 and place the DLL files in the \quake3\baseq3 directory (you'll see uix86.dll and qagamex86.dll, quake3.exe will use these instead of ui.qvm and qagame.qvm). Voila! A 15% performance gain! And all because the DLL files are already compiled, unlike the QVM files which are compiled at runtime (evidence: the console message saying "VM file qagame.qvm compiled to X bytes of code").

    Before you arrogantly defend the putrid mug of steaming, flaming code, be sure to study its nature. Ever wonder why the DeCSS code was never written in Java? Want to know why XMMS uses .so files? That's right! RUNTIME PERFORMANCE! I believe that this should be the be-all end-all factor of programming.

  • Software is unlimited, or at least a lot less limited than this apparently shortsighted book points out. Over the next 10 years, genetic and evolutionary programming methods will take over where human 'formal' logic programs cannot. According to the creator of Mathematica, Stephen Wolfram, nearly everything is a form of computation. The Earth's complex biosphere is a perfect example that extremely complex computations are possible. There is no fundamental mathematical reason why software cannot emulate the same degree of complexity.

  • There's no guarantee that automatic checking would fix this bug. If the model didn't account for negative altitudes properly, it wouldn't catch the bug.

    Still, it's probably easier to see the error in a clearly-written formula than in a few thousand lines of dispersed code.

  • You don't have to be an expert at programming, but having an idea of how a computer does its job certainly helps in using it. When you're really good at using a computer, you've necessarily understood the basic principles of programming. Future computer users will probably use them almost everywhere and at any time. There will be no "just need to know Word" users, and even those had better know about scripting to get simple repetitive jobs done quickly. The difference between a user and a programmer is that a user memorizes all the stuff he uses, while a programmer understands it and generalizes. With ever-growing computer use, tell me which way is more efficient...
  • "Formal methods" falls down because using them assumes that we can specify accurately and precisely what the system should do. And you have to do that specification up front. Such exacting up front specification poses more of a problem than iterative development by service packs and patches.

    Very good point! This article [bleading-edge.com] has a great in-depth discussion of this kind of issue.

    In general, if a specification is detailed enough and unambiguous enough to write all the code from, then that specification basically is the code expressed in a different form, and writing it is essentially the same task (perhaps in disguise) as writing the code. On the other hand, any specification that can be written more easily than writing the code must leave out a lot of information, and that very vagueness leads to "defects", "bugs" and "misunderstandings" when the code is eventually written.

    This is where software is really, really different from most other things. A recipe is not a meal: if you eat the recipe, it doesn't taste good. A blueprint is not a house: you can't live in a blueprint. But a complete and full description of what a program ought to do is the program (though perhaps expressed in a different set of symbols). Let's all meditate on this :-)

  • I admire software as math, but only when it's correct math.

    Incorrect math is not math. 'Nuff said.

  • There is, even, demonstrably correct software. Not likely if you code in C though.
  • You don't have to be an expert at programming, but having an idea about how a computer does its job certainly helps using it. [...] With ever growing computer use, tell me which way is more efficient...

    If that's true for a particular system or program, then it's a failure of the user interface. You absolutely should not have to know how the computer works in order to use it. Now, I don't know the general solution to the problem. Interfaces are still evolving. CrApple has probably come the closest, but interfaces are still too complicated.

    I do know this... the solution is NOT to demand that everyone learn programming.


    --

  • by Anonymous Coward
    > The Earth's complex biosphere is a perfect example that extremely computations are possible. What, now HHGG references?
  • OK. I thought at first you were trolling, but I've come to the conclusion you are just monumentally stupid. There is a huge difference between Java and Javascript. That 'source code' you are looking at is probably Javascript. Java code is deployed precompiled in .class files; you can't read the source unless someone gives it to you. "Want to know why XMMS uses .so files?" - No. I don't care. "RUNTIME PERFORMANCE! I believe that this should be the be-all end-all factor of programming." - enjoy yourself with your assembly language programming, moron.
  • Software design is in its infancy, more magical incantation than science.

    We start with incomplete design documents, unknown inputs, and impossible-to-predict interactions. Still, we get complex monstrosities to run with an extraordinary level of reliability. All this using a mechanism that is totally unrelated to the biological processes that have built reliable systems for billions of years.

    Software will grow up and start solving general problems like biological systems do. Throw in software evolution, and software development will have reached the level of maturity where we can truly judge its impact.
  • Yes, that's called cursing.
  • The reason people don't see programming as part of required general knowledge is that they think it is an awfully complex skill. What I am talking about is iteration, variables and structure. The "user" has to wait for someone to design a special user interface for the task. Someone with basic programming skills can just "script" his way through it. You have almost written the program when you have understood the problem. From that point on, formalization is all that is left to do. Maybe people shouldn't learn C++ or Java or whatever classic programming language you prefer, but graphical programming is still programming.
    Civilizations existed quite happily for millennia without the "progress" we think new technologies will bring us.

    It's true that "progress" is a hallmark of only western civilization - and, that, only for the last 500 years or so.

    But, hey, my life depends on progress. If progress stopped, nearly every single mother-loving one of us /. types would soon be out of a job - because our profession depends on convincing people to pay us for software and then pay us again, and again, to improve that software.

    Face it - programmers, not just Bill Gates, have a vested interest in writing code that's good enough for people to use, but crappy enough to ensure our continued employment.

    To be sure, this is a very dubious definition of progress - but as long as the users don't catch on, we're safe!


    --


  • Amen.


    --


  • What, you mean that technology is a tool, not a (god/mommy/daddy/etc)?

    Woah!

    Seriously, this is the truth -- tech is a tool, like (yes, way overused analogy here, but it's true) a hammer -- we can spend our time perfecting hammers, making bigger hammers, having religious wars over claw backs vs flat backs, etc etc, but ultimately, if I don't use the hammer to build the house or hang the picture, it's nothing but a waste of energy and matter.

    Also, along the same lines, my gf has a serious thing for *big* hammers (in her defense, she's a former construction worker) -- but last night, when we had to hang a picture, it was to my collection of little (1' or less) hammers we went, since hers (the smallest is about 2' long and probably weighs 10 pounds or so) would probably go right through the wall...

    there's a lesson about (high/electronic/digital) technology here too

    Or one could point out that hammers are also technology.

    But, unfortunately, we haven't found a technological solution for the fact that people just don't think.

  • Let's make it our objective for 2100 that over half of the population of the world can...

    perform brain surgery on their neighbors

    repair their refrigerator if the compressor breaks

    rewire their own house without causing an electrical hazard

    design and build all the bridges they need to cross when driving

    (Note to the irony impaired - ever hear of *specialists?*)

  • All through the 1800s steam boiler explosions were quite common--steam technology was pushed until it was understood and matured. The raw material used to build boilers improved with the making of steel. Early Comet jets broke apart until de Havilland understood how a previously unknown phenomenon (metal fatigue from repeated pressurizations) caused the failures. But today steam boilers and jet airplanes very, very rarely explode.

    One difference is that with complex physical systems it is easy enough to build in redundancy. The Brooklyn Bridge has had dozens of diagonal stays (and maybe a few suspender cables) fail over the past century, yet there is enough redundancy to prevent its collapse. But if your computer system doesn't have some sort of watchdog timer feature added by the manufacturer, you have no protection from a wayward
    linelabel BranchUnconditional linelabel
    hanging your system.
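
    (A minimal sketch of the watchdog idea -- mine, not the book's: a monitor thread that halts the process when the worker stops checking in, say because it wandered into the loop above.)

    public class Watchdog {
        private volatile long lastPet = System.currentTimeMillis();

        // The worker calls pet() periodically to prove it is still alive.
        public void pet() { lastPet = System.currentTimeMillis(); }

        // Halt the process if the worker goes quiet for longer than the timeout.
        public void start(final long timeoutMillis) {
            Thread monitor = new Thread(new Runnable() {
                public void run() {
                    while (true) {
                        try { Thread.sleep(timeoutMillis / 2); }
                        catch (InterruptedException e) { return; }
                        if (System.currentTimeMillis() - lastPet > timeoutMillis) {
                            System.err.println("watchdog: worker wedged, halting");
                            System.exit(1); // a real system would reset or fail over
                        }
                    }
                }
            });
            monitor.setDaemon(true);
            monitor.start();
        }
    }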

    Today, we license boiler engineers, civil engineers, plumbers, CPAs, and even hairdressers. Not only does licensing provide revenue for the state, but it also ensures that the person holding the license has passed a series of tests demonstrating minimal competence (think driver's license). Once a licensed xyzzy has [designed a boiler] [designed a bridge] [installed a toilet] [cooked your books] [cut your hair] you can go back and hold that person responsible (i.e. sue them and their company) for a bad boiler/bridge/pipes/audit/shave.

    Perhaps this is the next phase of the maturity of the software development profession. (A few years back New Jersey was thinking of this, but mostly as a scheme to raise revenue.) We'll have licensed Software Engineers. They won't be every member of a firm (not all Civil Engineers are necessarily licensed, but all designs are reviewed by them). We'd have to start with an operating environment that someone is willing to sign their name to "it does this-subset-of-functions." From that base, we can have a set of utilities that do another subset of functions, then applications that do the final set of functions.

    We already have software that people have signed their names to--the examples that immediately come to mind are the ascent, descent, and life support software of the Shuttle. The downside is that such software is expensive, but if sold in quantity the unit cost will go down. The APIs of such a system are nice and orthogonal (unlike Windows). Processor technology is at a level where even cheap processors can put processes in their own virtual space, and hardware will detect something amiss.

    Certifications are a start, but Certificates are not the same as Licenses. A certificate can go stale, but true licensed Professionals (doctors, lawyers, CPAs) have to keep up to keep their license. If they don't, they can be sued for malpractice.

    So if I were to predict what's going to happen in the next few years (and I'll be the first to admit you can't predict anything about this business) I'll say that we may see Software Development emerging as a true Profession. A body of software developed by Licensed Professionals will start to be developed, and this licensed software will be used to perform critical tasks (run a Space Station or a nuclear plant, run a microwave, route IP packets). There will still be a lot of "unlicensed" software that no one cares if it crashes (games, DVDs, toasters, office applications). Most of the programmers working at companies will be licensed, but almost everything going out the door will be reviewed, picked over, prodded, and stretched by someone who will sign their name on it. And if the program crashes badly, that someone will have to fix it, and/or be sued, and/or lose their license.

    I'm kind of looking forward to this. I'm not willing to put my name on anything I've written recently, but there have been programs that did a few things and did them well--one program I wrote ran unmodified for about 6 years without stopping. (A work distribution system which during its tenure grew to handle about four times the maximum load it was originally designed for--the UNIX system it ran on was rebooted about twice a year and the program would pick right up from checkpoints. The good old days.)

  • OK, where should I start. Hmmmm....
    • Java does not "always compile on the user (sic) side." It is much more common for Java applets and servlets to be running as middleware on servers these days, for portability and standardization. (A minimal servlet is sketched at the end of this comment.)

      The days of Java applets being downloaded and executed in a JVM in the client's browser are long gone. Microsoft killed them by "extending" the Java Virtual Machine that comes with Internet Explorer (which is bundled with Windows blah blah blah), breaking most "cross-platform" Java code in the process.

    • If you're talking about the scripts that are embedded in HTML code, that is JavaScript, a Netscape invention, and it has nothing to do with Java(tm), the Sun programming language. Embedded HTML scripts, of course, are always executed client-side, just like VBScript, ActiveX controls, and DHTML.

    • You are so woefully misinformed on this topic that people are mistaking you for a troll. Why don't you go look up the difference between Java and Javascript sometime, instead of just making stuff up that you overheard somebody talking about once?

    • If run-time performance is the only thing you care about, perhaps you should take a class in Assembly language. Assembly programs can run 20-100x faster than compiled C code.

    I don't usually even bother to respond to people who flame whole languages without even knowing the basics of said languages.

    However, your posts are written as if you think you're some sort of programming expert, which you obviously are not, if you can't even tell Java from Javascript.
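
    (As promised above, a minimal servlet -- my sketch, assuming the standard javax.servlet API, not anything from this thread: compiled once by the developer, executed in the server's JVM, with the client seeing nothing but HTML.)

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class HelloServlet extends HttpServlet {
        // Runs as middleware: no bytecode is ever downloaded to the browser.
        public void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            out.println("<html><body>Served from the middle tier.</body></html>");
        }
    }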

  • Change is the basis of survival. Adapting to the changing situation is essential, and any form of life which denies this is a dead end on the tree of evolution.
    Progress means exactly that type of change. No one is actively deciding where we as a species are progressing to and this is probably a good thing. We are just adapting to changing needs. Bored (and thus prone to doing stupid things and/or becoming unproductive)? We'll do something about that. Can't solve a problem, because lack of communication stalls the project? Invent a new way of communicating. And always: Create (read: modify), test, repeat.
  • Unfortunately, there will always be a disconnect between specs and code. Programming languages have cruft and artifacts that will always force code != specs.

    Here's an interesting paper called "Haskell vs. Ada vs. C++ vs. Awk vs. ..., An Experiment in Software Prototyping Productivity" [haskell.org] about a programming language experiment. The Naval Surface Warfare Center had programmers implement a "geometric region server" (a component of the Navy's AEGIS system) in their language of choice. Program length and development time for each project are compared for the different languages. The surprise "winner" was an extremely concise program written (by the paper's author, of course) in the functional programming language Haskell. The other languages carry so much cruft that the code ends up looking nothing like the "human readable" spec. (A Java pastiche of the paper's central idea appears at the end of this comment.)

    Compare! ;-)


    LANGUAGE        LINES OF CODE   LINES OF DOC   DEV TIME (HOURS)

    Haskell #1            85             465              10
    Haskell #2           156             112               8

    My favorite quote from the paper:

    "In conducting the independent design review at Intermetrics, there was a significant sense of disbelief. We quote from (CHJ93): "It is significant that Mr. Domanski, Mr. Banowetz, and Dr. Brosgol were all surprised and suspicious when we told them that the Haskell prototype P1 is a complete tested executable program. We provided them with a copy of P1 without explaining that it was a program, and based on preconceptions from their past experience, they had studied P1 under the assumption that it was a mixture of requirements specifications and top level design. They were convinced it was incomplete because it did not address issues such as data structure design and execution order."

  • You don't have a clue what you're talking about. For example:

    - Most (and best) use of Java is on the server side, not the client side.

    - Java applets on web pages are useless 90% of the time, but very useful the other 10% of the time. Try developing a complex web-based application (not a simple vanity page) without using client-side ActiveX, Java, Shockwave or DHTML, and you'll find that these things can make a big difference in either the usability or the basic capabilities of the app.

    - Java applets are not compiled on the user side; they're compiled by the developer and then executed by a virtual machine on the user side. If you're looking at actual code you're looking at JavaScript (as a previous reader pointed out). Java has absolutely nothing to do with JavaScript.

    - Java offers many features that drastically improve the reliability and robustness of Java code: garbage collection, enforced error handling, the use of bytecode instead of machine code (which prevents buffer overflows, one of the biggest holes for hackers), great compile-time checking, clean frameworks for collections, multi-threading, distributed code, etc. etc. etc. (A small illustration of two of these follows at the end of this comment.)

    - In the implementations where Java is most used, its features allow designs which are more scalable and flexible than what you could get with a more limited language, despite its slower runtime speed. Check out any of the Enterprise JavaBean servers (WebLogic for example).

    With regards to the original post (since we seem to have gotten a bit off-topic), I don't think people ought to blindly have faith in technology's ability to save the world. Our technologies have flaws because we have an imperfect understanding of both 1) our own goals and 2) how best to achieve them. But so what? That doesn't have any bearing on whether we ought to try to understand these things. Java is one example of a technology designed with a "make-it-better" goal, not a "make-it-cool" goal (though I think it's pretty cool).
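
    (The promised illustration -- mine, not part of the original post; the file name is made up. The compiler refuses to let the IOException below go unhandled, and the buffer is reclaimed by the garbage collector, so there is no free() to forget or to call twice.)

    import java.io.FileReader;
    import java.io.IOException;

    public class Enforced {
        public static void main(String[] args) {
            byte[] buffer = new byte[64 * 1024]; // garbage-collected; no manual free
            try {
                FileReader in = new FileReader("settings.txt"); // hypothetical file
                in.close();
            } catch (IOException e) {
                // Deleting this catch (without declaring the exception) is a
                // compile-time error, not a latent run-time crash.
                System.err.println("could not read settings: " + e.getMessage());
            }
        }
    }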


  • LANGUAGE        LINES OF CODE   LINES OF DOC   DEV TIME (HOURS)

    Haskell #1            85             465              10
    Haskell #2           156             112               8
    Ada                  787             714              23
    Ada9X                800             200              28
    C++                 1105             130
    Awk                  250             150
    Lisp                 274              12               3


  • "Sorry, too late. Our present level of population cannot be maintained without our present level of technology."

    Sorry, but you have no idea what you're talking about.

    Our present urban population densities cannot be maintained without our present technologies. That's not the same thing as our present level of population. Many of our rural areas have become depopulated over the last few decades. Wes Jackson estimates that, according to the best archeological evidence, his area of Kansas supported more population in pre-Columbian times than lives there today.

    "The corollary of this fact is that any person who advocates 'rolling technology back' or 'going back to the land' is advocating genocide -- and should be regarded on that level of (im)morality."

    You are presenting your assertion as fact, and laying a classic ad hominem on anyone who might disagree with your "fact."

    If today's technologies are unsustainable (and there's good evidence that they are), isn't it "genocidal" to advocate marching forward blindly until the crash happens? Better, I think, to look for "soft landing" scenarios -- which do involve "back to the land" and "rolling back technology."

    There are several disproofs by example for your thesis, but I will simply name one -- the Amish. The Amish way supports a higher population density than the "English" techno-agri-business way, with less environmental impact at the same time. Keep in mind that the Amish do innovate technologically -- the key is that they most emphatically do not worship technology, but subordinate their technological choices to their religious and cultural priorities. If only the rest of us were so wise as to keep technology our servant rather than our master.

  • "Civilizations existed quite happily for millennia without the 'progress' we think new technologies will bring us."

    ...and were then crushed when progressive civilizations decided to move into their territories.

    The purpose of progress is not a more pleasurable life (as many have discovered, and enshrined in religions and philosophies that ought to be considered forms of insanity: perfect eternal happiness is simply a matter of learning to ignore your sensory input), but survival. The societies which progress and become stronger crush those which grow more slowly.

    Putting the brakes on and deciding that you've progressed far enough is societal suicide.

  • No such thing as a bug-free piece of code? Nonsense. Here:

    10 GOTO 10

    or if you prefer,

    X = 200

    This is ridiculous, a complete cop-out. What you are really trying to say is, "Above a certain (nowadays common) complexity, software is such a complicated mechanism that the human brain can't keep track of it well enough to avoid mistakes".

    This is a DIFFERENT! claim than 'There's no such thing as bug-free software'. It points to obvious possible solutions (don't expect the human brain to keep track of it, but assemble big software out of smaller known pieces made to interact in known ways, or alternately stick to writing software that _will_ fit within the human brain) rather than claiming no solution is possible.

    I would say 'There's no such thing as bug-free software' is very akin to saying 'There is a God'. You can look around and see many things that might suggest it -- does that constitute proof? More relevantly, what attitude does this belief encourage? One hopes it encourages robust error checking and loads of sanity checking. (When programming, it's helpful to assume the program's environment is insane -- early Apple code was tested on an automated 'user from hell' program that just fed in endless random actions and was great at accidentally jockeying the victim program into 'insane' conditions that caused a crash. A toy version of that idea is sketched at the end of this comment.)

    However, every time I see 'There is no such thing as bug-free software' my first reaction is to think it's being said by some marketroid type making excuses for why they shipped early, shipped buggy, and don't actually plan to fix the broken bits. And thanks to the wonderful suggestibility of the human brain and the phenomenon of justification, those marketroids have heard this little mantra too, and to _them_ it means 'software by definition is all screwed up and non-deterministic, so we won't even try, we'll just plan PR campaigns'. _They_ don't have the technical background to put such a remark in context. They take it literally, as a license to produce untrustworthy crap without a bit of self-examination, with no attempt to do better -- and it becomes their argument to ship earlier and earlier, buggier and buggier. "You're not done fixing bugs? ...button it up, we ship it just as it is now!"

    *sigh*

    If you write software, write bug-free software.

    If you can't write bug-free software, try.

    If you won't try- please, quit writing the stuff. :P
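
    (The promised toy version of the 'user from hell' -- my sketch, not Apple's actual tool: hammer a victim routine with endless random actions and see whether any sequence drives it into an 'insane' state. The victim here is a hypothetical fixed-size stack with a missing guard; the random stream finds the hole almost instantly.)

    import java.util.Random;

    public class UserFromHell {
        public static void main(String[] args) {
            Random rng = new Random();
            int[] stack = new int[16]; // the victim: a toy fixed-size stack
            int depth = 0;
            for (int i = 0; i < 1000000; i++) {
                if (rng.nextBoolean()) {
                    if (depth < stack.length) {
                        stack[depth++] = rng.nextInt(); // push
                    }
                } else {
                    // Bug under test: no guard against popping an empty stack.
                    int top = stack[--depth]; // throws once depth hits 0
                }
            }
            System.out.println("survived a million random actions");
        }
    }
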

  • Uh well. For one, most of the Amish live on some of the most fertile soil in the United States. Secondly, they're not entirely independent; they sell much of their wares so they can buy certain goods and services. Thirdly, it's a mistake to compare the Amish, who support relatively small populations on large plots of land, to the even smaller populations on even larger plots, because those farmers are generating most of the food for people in cities. The Amish simply don't need to produce nearly as much food per acre.

    I don't quite get your argument about Kansas. Though I sincerely doubt the Indians were able to support high population densities given their primitive agricultural skills, the fact is that Kansas EXPORTS most of its product. So that's a bogus analogy too. In any event, if the Amish lifestyle is so ideal, why don't you join them? I'm sure they'd take you.
  • That's weird; according to the Epic Games Unreal Technology FAQ [epicgames.com], they coded the engine entirely in C++, not Java.

    As for your lag problems, who knows. I have a P-II 350 with an ancient Voodoo2 card that runs Unreal just fine. I'll grant you that the Microsoft Java Virtual Machine is damned slow to start, though.

  • You mean,

    the same Department of Defense that got rated a D+ [slashdot.org] in their security audit last week? And we're supposed to look to them for an example?

    heh.

    "Fertile soil is created, by nature and by good husbandry."
    Only to a very limited extent. Some land is simply too inhospitable to support extensive agriculture. What's more, water is finite. Some land needs to be externally irrigated, but there simply isn't enough water to go around. What you are saying is equivalent to saying that people can make themselves athletic. Sure, to a certain extent; you might see miraculous improvements, but only a few people have the necessary genetics to make it to the Olympics. The Amish are sitting on the Olympic athletes of land. They've done a good job preserving what they have, but they chose the most fertile land they could when they got here.

    "...and destroyed by bad farming practice."
    I don't disagree with this. One can do great damage to fertile land. But that doesn't mean the converse is true, that we can take any land and make it sufficiently arable.

    "This planet could support a population of 12,000,000,000, using essentially 19th century technology."
    I sincerely doubt that. Based on your earlier blatant misconstruing of the facts, I doubt your premises.

    "I know little of Kansas."
    I don't live in Kansas; I live outside Philly. But I know enough about agriculture and history to know most of the Indians lacked basic agricultural skills.

    "...but in the Australian outback one family often struggles to make a living (grow enough to feed themselves and earn enough to pay for other necessities), on land that once supported more than 200 Aboriginal Australians."
    Sounds awfully dubious to me. Though I'm not terribly familiar with the Australian Aborigines, I suspect they were more hunter-gatherers than they were agricultural, especially given the conditions. So what does it really mean to say "the same" land? They, most likely, took an entirely different approach to the land, exploiting and eating things that no Westerner would dream of. Furthermore, I doubt they confined their activities to a single acre... and even if they did, it's not as self-contained as, say, farming can be. You kill a wild animal on your land, and that has an impact on the surrounding area. Though there may have been clusters/tribes/city-states of these Aborigines, their overall lack of density would allow for certain practices that couldn't be practiced today.

    "As for the American Indians' 'primitive agricultural skills', we are talking about a people who lived in harmony with their land for thousands of years."
    We're also talking about a people whose population was curbed by the constant threat of infant mortality, starvation, sickness, etc. Yes, at their limited sizes I would agree that they didn't "rape" the land. Because they didn't farm any land, they couldn't easily damage any land. However, without farming they simply could never have sustained a large population. The analogy is a poor one.

    I think some agricultural problems today definitely need to be addressed, but it is foolish to assert that primitive cultures produced more food per acre [or rather that they could have sustained our populations]. You certainly have not made a convincing argument for it. You can't demonstrate that the vastly inhospitable lands in this world can suddenly be turned into an idyllic Amish community. And you certainly can't demonstrate that ALL [or even most] land can do this.
  • It seems that the myth of the "noble savage" has been replaced by the myth of the "ecological savage". But this myth has no basis in fact.

    Mammoths were hunted to extinction by my stone-age ancestors. American Indians used to drive whole herds of buffalo off cliffs merely to save the bother of isolating and killing the one they needed. They also wiped out a lot of the native American megafauna when they first arrived. The moa (a large flightless New Zealand bird) was hunted to extinction within around 160 years of the first humans arriving. Tell me your ethnic origin, and I'll find a list of the species your stone-age ancestors drove to extinction.

    Oh, and that beautiful speech by Chief Seattle which starts "How can you buy or sell the sky? The land? The idea is strange to us..." was never said by him. It was actually written by a film scriptwriter named Ted Perry for a 1972 documentary. Type their names into Google for more info.

    To be sure, many stone age peoples have lived in "harmony" (i.e. rough equilibrium) for many thousands of years. But this is not due to any special wisdom on their part, merely the fact that the ones who didn't went extinct, so we don't see them. And my 19th century ancestors who destroyed the dodo and introduced rats and rabbits to Australia are no better. We in the 20th century can claim exactly one point of superiority over any of them, which is that we have finally figured out what we are doing. Those who would have us forget that and go back to the stone age have seriously missed the point.

    Paul.
