Geekonomics

Ben Rothke writes "First the good news — in a fascinating and timely new book, Geekonomics: The Real Cost of Insecure Software, David Rice clearly and systematically shows how insecure software is a problem of epic proportions, from both an economic and a safety perspective. Currently, software buyers have very little protection against insecure software, and often the only recourse they have is the replacement cost of the media. For too long, software manufacturers have hidden behind a virtual shield that protects them from any sort of liability, accountability or responsibility. Geekonomics attempts to stop them, and can be deemed the software equivalent of Unsafe at Any Speed. That tome warned us against driving unsafe automobiles; Geekonomics does the same for insecure software." Read on for Ben's take on this book.
Geekonomics: The Real Cost of Insecure Software
author: David Rice
pages: 362
publisher: Addison-Wesley
rating: 9
reviewer: Ben Rothke
ISBN: 978-0321477897
summary: How insecure software costs money and lives
Now the bad news — we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Brittany Spears' antics than the national diabetes epidemic. Expecting the general public or politicians to somehow get concerned about abstract software concepts such as command injection, path manipulation, race conditions, coding errors, and myriad other software security errors is somewhat of a pipe dream.
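
For readers unfamiliar with those flaw classes, here is a minimal sketch (hypothetical C code, not an example from the book) of how easily one of them, command injection, creeps in:

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/wait.h>

    /* VULNERABLE: the host string is spliced into a shell command, so input
     * like "example.com; rm -rf ~" gets the attacker's command executed too. */
    void ping_host_unsafe(const char *host)
    {
        char cmd[256];
        snprintf(cmd, sizeof cmd, "ping -c 1 %s", host);
        system(cmd);                 /* the shell parses attacker-controlled text */
    }

    /* SAFER: bypass the shell and pass the host as a discrete argument, so
     * shell metacharacters are never interpreted as commands. */
    void ping_host_safer(const char *host)
    {
        pid_t pid = fork();
        if (pid == 0) {
            execlp("ping", "ping", "-c", "1", host, (char *)NULL);
            _exit(127);              /* exec failed */
        } else if (pid > 0) {
            waitpid(pid, NULL, 0);
        }
    }

Nothing in the language, the compiler, or the market forces a vendor to ship the second version rather than the first; that asymmetry is precisely the book's subject.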

Geekonomics is about the lack of consumer protection in the software market and how this impacts economic and national security. Author David Rice considers software consumers to be akin to the proverbial crash test dummy. This, combined with how little recourse consumers have for software-related errors and the lack of significant financial and legal liability for vendors, creates a scenario where computer security is failing.

Most books about software security tend to be about actual coding practices. Geekonomics focuses not on the code, but rather how insecurely written software is an infrastructure problem and an economic issue. Geekonomics has 3 main themes. First — software is becoming the foundation of modern civilization. Second — software is not sufficiently engineered to fulfill the role of foundation. And third — economic, legal and regulatory incentives are needed to change the state of insecure software.

The book notes that bad software cost the US roughly $180 billion in 2007 alone (Pete Lindstrom's take on that dollar figure). That $180 billion might even be on the low end, and the state of software security is getting worse, not better, according to the Software Engineering Institute. Additional research shows that 90% of security threats exploit known flaws in software, yet software manufacturers remain immune to almost all of the consequences of their poorly written software. Society tolerates these failure rates largely because it is unaware of the problem, and the sheer volume of software flaws entices attackers to take advantage of those vulnerabilities.

The book's seven chapters are systematically written and provide a compelling case for the need for secure software. The book tells of how Joseph Bazalgette, chief engineer of the city of London, used formal engineering practices in the mid-1800s to deal with the city's growing sewage problem. Cement was a crucial part of the project, and the book likens the development of secure software to that of cement, which can withstand decades of use and abuse.

One reason software has significant security vulnerabilities, as noted in chapter 2, is that software manufacturers are primarily focused on features, since each additional feature (whether it has real benefit or not) offers a compelling value proposition to the buyer. On the other side, the lack of software security functionality and controls imposes social costs on the rest of the populace.

Chapter 4 gets into the issues of oversight, standards, licensing and regulations. Other industries have lived under the watchful eyes of regulators (FAA, FDA, SEC, et al.) for decades, but software is written by unlicensed programmers, far removed from any such oversight. Regulations exist primarily to guard the health, safety and welfare of the populace, in addition to the environment. Yet oversight of software programmers is almost nil, and this lack of oversight and the resulting immunity breed irresponsibility. The book notes that software does not have to be perfect, but it must rise to the level of quality expected of something that is the foundation of an infrastructure. And the only way to remove the irresponsibility is to remove the immunity that the absence of regulation has created.

Chapter 5 gets into more detail about the need to impose liability on software manufacturers. The book's premise is that increased liability will lead to a decrease in software defects, will reward socially responsible software companies, and will redistribute the costs consumers have traditionally paid to protect software from exploitation, shifting them back to the software manufacturer, where they belong.

Since regulations and the like are likely years or decades away, chapter 7 notes that short of litigation, contracts are the best legal leverage software buyers have to address software security problems. Unfortunately, most companies do not use this contractual option to the degree they should, even though it could clearly benefit them.

Overall, Geekonomics is an excellent book that broaches a subject left uncharted for too long. The book does have its flaws, though; its analogies to physical security (bridges, cars, highways, etc.) and safety events don't always hold together with perfect logic. Also, the trite title may diminish the seriousness of the topic. As the book illustrates, insecure software kills people, and I am not sure a corny book title conveys the importance of the topic. But the book does bring to light significant topics about the state of software, from legal liability to the licensing of computer programmers to consumers' rights, that demand attention.

It is clear that regulation of the software industry is inevitable, and it is doubtful that Congress will do it right whenever it eventually gets around to it. Geekonomics shows the effects that such a lack of oversight has caused, and how beneficial it would have been had such oversight been there in the first place.

Someone reading this review may get the impression that Geekonomics is a polemic against the software industry. To a degree it is, but the reality is that it is a two-way street. Software is built for people who buy certain features. To date, security has not been one of those top features. Geekonomics notes that software manufacturers have little to no incentive to build security into their products. Post Geekonomics, let's hope that will change.

Geekonomics will create different feelings amongst different readers. The consumer may be angry and frustrated. The software vendors will know that their vacation from security is over. It's finally time for them to get to work on fixing the problem that Geekonomics has so eloquently written about.

Ben Rothke is a security consultant with BT INS and the author of Computer Security: 20 Things Every Employee Should Know.

You can purchase Geekonomics: The Real Cost of Insecure Software from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments:
  • It's spelled Britney Spears.
  • by jorghis ( 1000092 ) on Monday January 21, 2008 @04:43PM (#22130382)
    Software written for most industries where human lives could conceivably be on the line IS under the watchful eyes of regulators. As an example, if you are going to write software that goes into an airplane you can expect to have your work audited by the FAA. Similar circumstances exist for most other industries where a software failure could cause loss of human life or similar catastrophes.
    • by kebes ( 861706 ) on Monday January 21, 2008 @05:14PM (#22130670) Journal
      Indeed. Analogies to bridges and cars only make sense for software that can endanger lives: medical systems, bridge-designing systems, vehicle-control systems, etc. As you point out, in all those cases, the software (as well as any designs the software spits out) will be verified in detail and validated. The software vendor will usually be bound by stringent contracts and will indeed be contractually and legally responsible for defects.

      The rest of software, like word processors, and spreadsheets, and music apps, doesn't need that kind of stringent oversight. A better analogy in such cases is to other mundane things: books, binders, pencils. Poorly designed binders and pencils can lead to lost productivity in the same way that poorly designed software can. Those who care will go for the higher-quality product (which may require more money, either in initial expenditure or in staff expertise). Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?

      I submit that such oversight is not really necessary (again, except in issues of health and physical safety). Most people can tolerate the occasional annoyances of breaking pencils, typos in books, and crashes in software. Ideally people should be educated about risk (e.g. don't put important documents in a flimsy box, put them in a safe; similarly, don't put important data in a low-security computer, get a properly administered server), so they can make informed choices. But more laws and regulation? Not necessary.
      • by HappySmileMan ( 1088123 ) on Monday January 21, 2008 @05:19PM (#22130722)

        Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?
        I've yet to see a flaw in a book steal my, or anyone else's, credit card number, or delete all my other books, have you?
        • Flaws in books can have disastrous consequences if someone depends on them to be flawless.

          Imagine a repair manual for a gas stove that said "blow out pilot light, turn on gas, wait one hour, invite your friends over, and light a match." Sure, it might not steal credit card numbers but in the face of an ignorant and trusting user, it could prove fatal nonetheless.
        • by kebes ( 861706 ) on Monday January 21, 2008 @05:35PM (#22130890) Journal

          I've yet to see a flaw in a book steal my, or anyone else's, credit card number, or delete all my other books, have you?
          I mentioned 'books' as an example of a real-world object with errors, not a one-to-one mapping to software. (I'm always reticent to use analogies, since they inevitably break down so quickly.)

          There are of course meat-space analogies for identity theft and data loss arising from faulty products (locks, paper shredders, photocopiers) or services (shipping errors, clerical errors, corruption). The point is not the analogy per se... the point is that faulty products and services in the real world lead to losses (of time, money, data, personal information, etc.) and to crime. We could reduce these losses by spending more money and effort on higher-quality products and services, but there comes a point where people just don't care anymore (either because they are ignoring the risk, or because the risk is low enough that it isn't worth the additional cost).

          The same applies to software: we could make it much more robust, but is the added security worth the burden of more regulation, more overhead, and more money? In some cases, it is... but in many cases it really isn't. Software related to health, personal safety, and financial information should be regulated (in the same way that medicine and financial institutions are regulated). But over-riding laws mandating software security and software liability are not necessary. End-user education is overall more important (both to prevent real-world losses, and computer losses).
        • by syousef ( 465911 )
          Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?

          I've yet to see a flaw in a book steal my, or anyone else's, credit card number, or delete all my other books, have you?

          In addition there is book oversight and book regulation, in the form of existing negligence laws, advertising laws etc. Publish a textbook on bridge building that ha
          • Ob Disclosure: I wrote The dotCrime Manifesto: How to stop Internet crime, which is a companion book in the same series. My take is rather different, however.

            If books did steal credit card numbers, whose fault would that be? The authors, the publishers, the readers or the banks who use credit card numbers as an authentication mechanism rather than Chip and PIN smart cards?

            It is really easy to point fingers, but working out where the responsibility should lie is rather harder. I don't think that Microsoft,

            • by syousef ( 465911 )
              I think you'll find that while improving security is easy over the current situation, provided you're willing to wear the cost of doing so (which unfortunately may be the difference between a company or technology being technologically viable and not), the idea that a perfect attack-proof system could be built is a fallacy.

              It's way too easy to blame the initial inventors of the C language for not checking for buffer overflows, but that too is a mistake. They wrote the system in the 60s for machines that tod
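
              For anyone following the thread, the bug class in question is just an unchecked copy. A minimal hypothetical sketch (function names invented here, not anyone's real code):

              #include <stdio.h>
              #include <string.h>

              /* Classic unchecked copy: the C string routines trust the caller
               * to have sized the buffer, so 64 or more bytes of input overwrite
               * whatever sits past 'name' on the stack. */
              void greet_unsafe(const char *input)
              {
                  char name[64];
                  strcpy(name, input);            /* no bounds check at all */
                  printf("Hello, %s\n", name);
              }

              /* The bounded variant costs one extra argument; the language never
               * required it, which is the design choice being debated here. */
              void greet_safer(const char *input)
              {
                  char name[64];
                  snprintf(name, sizeof name, "%s", input);  /* truncates, NUL-terminates */
                  printf("Hello, %s\n", name);
              }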
              • I think you'll find that while improving security is easy over the current situation, provided you're willing to wear the cost of doing so (which unfortunately may be the difference between a company or technology being technologically viable and not), the idea that a perfect attack-proof system could be built is a fallacy.

                Every successful attack against Chip and PIN to date has been against the transition arrangements to support legacy systems. While smartcards are not invulnerable (Paul Kocher's timing a

                • by syousef ( 465911 )
                  Every successful attack against Chip and PIN to date has been against the transition arrangements to support legacy systems. ...because they're currently the weakest link.

                  Anyway that's a rather sweeping statement.

                  While smartcards are not invulnerable (Paul Kocher's timing attacks etc) they are more than sufficient to mitigate risk. ...until the entire infrastructure moves and there is no more "low hanging fruit" for the criminals to pick. Anyway you're assuming smartcards can be implemented perfectly. They
      • This sounds good in principle, but a lot of software that we programmers assume has crap correctness is actually being used in critical ways. Excel, for example, is in the pipeline for a lot of engineering and scientific calculations, even though it's riddled with bugs and usability problems. You can say that users should know better, but Microsoft doesn't make clear that Excel is really just a toy, not for use on things that matter.
    • by mrbooze ( 49713 )
      I once worked for a company that made software for blood banks, pharmacies, and surgical suites. I worked in the pharmacy division, and as far as I ever heard there was little to no government oversight of our product (this was back in the early 90s). However, the blood bank (and I believe the surgery) software packages were rigidly regulated. Even minor software patches had to be submitted to the government for auditing and approval.
    • The problem is that with the rise of 1) mass e-commerce, e-government and Internet banking, and 2) Internet-enabled desktops, now EVERY piece of conceivably internet-facing software installed on a consumer desktop carries the risk of exploitation, criminal intrusion and identity theft.

      Yes, a security hole in a web browser won't directly cause loss of *life*. However, what it *can* do by allowing a trojan in is:

      a) Drain all your life savings from your bank
      b) Place illegal pornography on your computer, leading to serious prison time
      • by lennier ( 44736 )
        I should have added:

        e) By installing a keylogger (if you're a telecommuter with a VPN, or if you reuse passwords between home and work systems), potentially gain access to internal proprietary corporate networks, with the ability to conduct industrial espionage or control enterprise automation systems or SCADA networks

      • b) Place illegal pornography on your computer, leading to serious prison time
        Has that ever actually happened? People keep bringing it out as a scary scenario, but I've yet to hear of an actual example of this being done to somebody.
    • "Software written for most industries where human lives could conceivably be on the line IS under the watchful eyes of regulators."

      I'm not so sure about that. Nobody checked Diebold's software, and now a whole bunch of people are dead in Iraq.

  • by Kalriath ( 849904 ) on Monday January 21, 2008 @04:43PM (#22130386)
    Just to get in the troll everyone is going to use, even though it's pretty much a load of bollocks:

    "This book could be summed up in three words: 'don't use windows'"

    I suppose that should be suffixed with some 'tard thing like "lol!!111!!1one"
    • by Guppy06 ( 410832 )
      ""This book could be summed up in three words: 'don't use windows'""

      Microsoft can afford to defend themselves against a few liability lawsuits. Can Linus?
  • by Bearhouse ( 1034238 ) on Monday January 21, 2008 @04:52PM (#22130438)
    Few people (rightly so) would tolerate Boeings or Airbuses that fell out of the sky through faulty software.

    And yet, as a former coder then vendor, I always found it hard to get people to pony up for better education for programmers, analysts, project managers, or better coding tools, exhaustive testing protocols, whatever.

    Now as a consultant, I face the same struggle getting people to be serious about backups, redundancy/eliminating single points of failure...

    As long as it's not their head on the block, even senior managers will most often favour commercial expediency over prudence, in the face of many high-profile disasters that cost far more to put right than doing things properly would have.
    • It's a gamble (Score:2, Insightful)

      by davidwr ( 791652 )
      If it costs 10x as much to fix a problem as to prevent it, but for every $100 you spend on prevention you only prevent one failure, you are in the hole $10. That's rational math at work.

      If you are a greedy-bastard manager and you expect to be in your position for only a few years, all you care about is the failures that will come back to haunt you. You don't care if spending $5M now will save $1M in expenses over the next 5 years but save an additional $20M 10 years down the road. By then you and your greed
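
      One way to make that arithmetic explicit (a sketch with assumed figures, reading the parent's numbers one plausible way):

      #include <stdio.h>

      int main(void)
      {
          /* Assumed figures: preventing a given defect costs $9 and fixing it
           * costs 10x that, but prevention money is spread over code that would
           * never have failed, so $100 of blanket prevention stops one defect. */
          double prevention_spend   = 100.0;  /* blanket prevention budget   */
          double failures_prevented = 1.0;    /* defects actually headed off */
          double fix_cost           = 90.0;   /* 10x the $9 prevention cost  */

          double savings = failures_prevented * fix_cost;
          printf("net: $%.2f\n", savings - prevention_spend);  /* net: $-10.00 */
          return 0;
      }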
  • by jejones ( 115979 ) on Monday January 21, 2008 @04:56PM (#22130476) Journal
    Regulation is a means by which the established companies keep possible competition from developing. MS can pay for that overhead from pocket change; can Open Source developers?
    • How did this get flagged as flamebait? This is exactly the kind of crap regulation does. Do you think for a minute that regulation in software is going to do squat against the giant coffers of corporate America that can afford to pay out fines and such? Now what happens when someone uses an Apache server for something critical, and it turns out an Apache error caused the failure...now who is going to get shafted? The regulation idea is a nightmare waiting to happen with a huge chilling effect. Let us a
      • "No other industry can say "you can't sue us if the product you purchased from us does not do what it was intended to do""

        And here's where the GPL comes in handy.

        There's the difference - you don't have to purchase free/libre open-source software; if you want to be indemnified, you can buy a distro from RedHat, Novell, Oracle, and (soon) Sun.

        A contract requires a "consideration", usually payment, in return for the "good" or "service". There is no contractual arrangement between someone who downloads a

        • by db32 ( 862117 )
          That is my point. This should be enforced through contract law, not regulation. If it is enforced through regulation you will see FOSS die a quick and bloody death as those "rogue programmers" get sued out of existence. To me it sounds like this book is simply using the very real problem of security and lack of liability as an argument for regulation that will give MS and crew exactly what they want, a playing field free from any competitor that can't just pony up the fines when things go wrong.

          To be h
    • by colmore ( 56499 )
      Commercial open source vendors could pay the overhead in order to sell verified software.

      This would be different from other regulations because it's totally impossible to make unregulated software unwritable or unrunnable. If legislators don't grasp that, then all bets are off, of course. The regulation would more likely come in the form of watching businesses over a certain size and making sure they used approved software where needed.

      Under that kind of regulation, open-source could flourish as in many a
  • as the review says (Score:5, Insightful)

    by ILongForDarkness ( 1134931 ) on Monday January 21, 2008 @04:57PM (#22130488)
    "Software is not sufficiently engineered to serve as a foundation" [for society] - I agree whole hardily. Things are getting better but we still have very little idea whether what we code "works" or not, let alone is secure. For example: a software vendor will say we have 80% path coverage. Great, now tell me: do you have 80% path coverage because only that 80% was deemed risky, or because writing tests for the remaining 20% was deemed too time consuming (or worse your test/dev team weren't skilled enough to write tests for those paths)?

    In my experience there is so much feature creep in software projects that there always seems to be that last feature that needs to get squeezed into the next release at the last moment, with no time left to test. "Let's just hope that 10k line module works and is secure. Even if it's not, we can always release a SP after we have the product on the market." It has even reached the point where major software companies (MS comes to mind) have a concept of Zero Bug Bounce (ZBB): the point where the rate of bugs getting fixed equals the rate of bugs being found. If there are no "major" bugs and you've reached ZBB, you ship. Now I can see you can't wait forever to ship, but there is an inherent acceptance of flaws in software that you won't see in, say, bridge building.
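
    To make the coverage point concrete, a hypothetical sketch: a test suite that only feeds this function well-formed input executes most of it and reports healthy coverage, yet the oversized-input branch, the one an attacker actually probes, is exactly the part nobody exercised.

    #include <string.h>
    #include <stddef.h>

    /* Hypothetical parser: happy-path tests cover the NULL check and the
     * normal copy, but the oversized-input branch (the risky 20%) is the
     * one that ships untested. */
    int parse_record(const char *line, char *out, size_t outlen)
    {
        if (line == NULL)
            return -1;                 /* covered by an error-handling test */

        size_t len = strlen(line);
        if (len >= outlen) {           /* the rarely tested branch...       */
            memcpy(out, line, outlen); /* BUG: leaves 'out' unterminated    */
            return (int)outlen;
        }

        memcpy(out, line, len + 1);    /* happy path, thoroughly exercised  */
        return (int)len;
    }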

    • by jorghis ( 1000092 ) on Monday January 21, 2008 @05:06PM (#22130586)
      I always thought the bridge building analogy was a little bogus.

      Bridge building isn't really all that complex; there is a hell of a lot more going on in a software product of any real magnitude than in a bridge. Sure, there are a few things like wind you have to take into account, but there really aren't as many variables in bridge building as there are in software development.

      In addition to that, software has to be exactly perfect; with a bridge you can just say "screw it, let's reinforce/add supports/whatever here, here, and there just to be safe" and you are good to go. (I know I am oversimplifying to some degree, but you see my point) It is possible to give yourself a lot more room for error.
      • Re: (Score:3, Insightful)

        by mcpkaaos ( 449561 )
        Bridge building isn't really all that complex

        (I know I am oversimplifying to some degree, but you see my point)

        Have you ever stopped to wonder if you are actually over-complicating software design rather than over-simplifying the analogy?
        • by moderatorrater ( 1095745 ) on Monday January 21, 2008 @05:49PM (#22131038)
          He's not, and here's why. In building and designing a bridge, you're not going to have your boss walk in halfway through the construction and tell you that you need to use this new concrete that only comes from LargeHard(c). You're not going to build the bridge so that you can take it from a two lane bicycle bridge to a 12 lane, double decker toll bridge with a minimum of work. You're never going to have someone walk over the bridge and promptly say, "sorry, this river is actually 50 feet wider, and I don't like the color, can you change that?" Feature creep is the biggest killer of productivity and security.

          Another reason is that too many people are involved in building a bridge for the majority of bridges to be badly built. You have the engineers, the construction company, the foremen and the workers all looking at the bridge. Are all these people going to be qualified to catch an error? No, but enough of them will be qualified enough to catch an error that it's unlikely to be a problem. On the other hand, we have software, where there are lines of code that have never been seen by anyone but the original programmer.
          • by Naturalis Philosopho ( 1160697 ) on Monday January 21, 2008 @06:23PM (#22131380)
            Oddly enough, you just made one of the best arguments I've heard to date for regulation and licensing of software designers and engineers. If we can't trust people to make rational decisions, then we may very well have to regulate them into it.
            • So you think having a bunch of civil servants running a bunch of tests is going to improve the quality of software developers? Have you worked with many people with MSFT certifications?
            • by dodobh ( 65811 )
              You could build the software to those specs. NASA does it. It just will cost you a few million dollars per line of code. Also, it will be certified to work on only specific hardware and software combinations.

              Design me a bridge which needs to be dropped in place across multiple places (which may be a rivulet, or the Grand Canyon, or the Bering Strait, or a span between Mt. Everest and Mt. Kilimanjaro), and just work out of the box. It needs to be capable of supporting any type of vehicle, including those
          • by Coryoth ( 254751 )

            In building and designing a bridge, you're not going to have your boss walk in halfway through the construction and tell you that you need to use this new concrete that only comes from LargeHard(c). You're not going to build the bridge so that you can take it from a two lane bicycle bridge to a 12 lane, double decker toll bridge with a minimum of work. You're never going to have someone walk over the bridge and promptly say, "sorry, this river is actually 50 feet wider, and I don't like the color, can you change that?"

            And the question you should be asking yourself is why civil engineering does not have radical and constant requirements changes through the entire design process. The answer is: often they actually do, though not to the extent that some software projects do; and also they tend to have contracts that rule out last-minute silliness -- or at least make it rather expensive. Software developers bring it on themselves to some extent by simply accepting all these changes. If someone comes to you with a last m

      • I work in the medical field (radiation treatments). The vendors have triple redundancy in the software (three workstations have to agree on the position of components), + hardware backup (analog computer anyone? :) ). Agreed, software is more complicated; it can even be said that software requires more intellectual capital (you get smart people sitting at a desk all day thinking), versus a lot of other engineering (where a vacuum cleaner salesman can come up with an idea and grab readily available parts to
      • by PitaBred ( 632671 ) <slashdot&pitabred,dyndns,org> on Monday January 21, 2008 @05:29PM (#22130840) Homepage
        I take it you've never actually taken any Engineering classes. A bridge really is pretty damn complex. It requires materials knowledge, static force calculations, dynamic force calculations, as well as weathering and other concerns, not to mention consideration of failure modes, etc. You don't give yourself any room for "error", you give safety tolerances for the people driving over the bridge and to account for imperfect materials, as well as exceptional conditions (earthquake, tornado, whatever).

        Designing a serious bridge is a LOT more difficult than 90% of software projects out there. You have a base you can build on of tried and true designs, but from scratch, it's not very easy.

        I say this as someone who works with computer administration, programming and database work professionally, but I got a minor in Engineering. I know what goes into it.
        • Yeah I was going to say much the same thing. You also have soil conditions, seasonal changes, etc. I can't count how many times I've heard software vendors say: what, you have program X version 1.45 installed? That's the problem, you need to roll back to 1.3. They mandate (especially with "complicated" software) the platform, hardware, and software right down to the patch level. You can't do that in engineering a lot of the time; sure, you can tell the customer that the location isn't the greatest etc etc. But y
        • Designing a serious bridge is a LOT more difficult than 90% of software projects out there. You have a base you can build on of tried and true designs, but from scratch, it's not very easy.

          This was also my thought. People expect a lot from an industry that has only been around for about 70 years. If we had the history of bridges, with all its successes, failures, and practical designs that came from them, maybe programming would be in a better state. But I think we're at the point, right now, where we're just starting to build the equivalent of bridges that are vital to major traffic, but still haven't formalized the rules for how best to do that. This is also why I somewhat fear regula

        • by sheldon ( 2322 )
          The difference is...

          When you build the bridge, you know how long it is and how many cars/trucks it will need to support.

          To take the bridge analogy to software. You start out building a bridge over the Mississippi, that will handle 10,000 cars an hour or somesuch. When you're done the client tries to place your bridge across the English Channel, and land 747's on top of it.

          It's all about the requirements.
        • Bridges are also all "open source". Everyone in the field can learn from any success or failure of any bridge in existence. Bridges are also of "modular" design. Good designs are proven over time and re-used. Good parts are mixed-and-matched. Software might benefit from this example.
      • by SAN1701 ( 537455 )
        Even more, any 3-year-old child can perfectly understand what a bridge does. It's obvious, unambiguous, clear. You only have to see it. Now, try to explain to the kid what an ERP does. Compare the functional requirements of a bridge to those of any medium-sized commercial software and find which one is more complex, or which one will have more changes during the project lifetime.

        Fact is, we have a distinct science/engineering/craft/whateveryoucallit here. Analogies are pointless.
        • by Kazrath ( 822492 )
          A three-year-old can also see the finished product of the next great Elmo game. What you have indicated does not even apply to the conversation. The finished, intended results of most applications are very simple and easy for a novice computer user to understand, just like everyone understands that a bridge provides solid "footing" over a waterway or cliff side.

          A comparison adequate to your statement would be to indicate "Hello World" is an application and sticking a popsicle stick over a rain ru
          • by SAN1701 ( 537455 )
            Sure. And, just to use your minimalistic example, one has at least to learn how to read and write to make a "Hello World", but even cavemen knew how to make simple bridges. Beavers know how to make dams.

            But the point is, software and civil engineering are such different disciplines that analogies are useless; their complexities are of different natures. Requirements of a bridge can't change drastically after its construction has begun. Big new features aren't asked to be implemented after it has been
      • by sholden ( 12227 )
        We have thousands of years of bridge building (and failing) worth of experience. And they still collapse - which might be some indication that it's not as simple as you imply.
      • Bridge building isn't really all that complex

        Yeah, what could possibly go wrong? [google.com]
      • by mihalis ( 28146 )

        Bridge building isn't really all that complex; there is a hell of a lot more going on in a software product of any real magnitude than in a bridge

        I'd really like to see someone prove this or even provide any evidence at all. A real bridge is a collection of thousands or even millions of parts. Each and every part is unique when considered in fine detail. The crystal structure of the metal, the exact surface detail, the exact overall shape, the stresses experienced during manufacture, the stresses experie

    • "Software is not sufficiently engineered to serve as a foundation" [for society] - I agree whole hardily. Things are getting better but we still have very little idea whether what we code "works" or not, let alone is secure.

      This is a complete load of crap. To use a car analogy, 50,000 people are killed on the roads every year. Traffic and accidents cost billions of dollars of manhours every year, not even considering the environmental factor. I'd say our transportation system isn't "engineered to serve as a foundation." How many lives did software kill last year? I'm guessing it's pretty far in the black, in that it saved far more than it cost. How many dollars in manhours did software cost companies last year? Well I'll tell y

  • Well, excuuuse me... (Score:5, Informative)

    by Chemisor ( 97276 ) on Monday January 21, 2008 @04:59PM (#22130498)
    Companies don't spend much time on security because features are what the customers want. If you want security and unlimited liability, by all means ask for it. Of course, it will cost you extra, due to the need for security audits and the outrageous cost of liability insurance, but you can certainly get it. If you pass a law to require perfect security and liability, the cost of software will rise even higher than it is today. Take your pick.
    • Some of the major software vendors (MS, Apple etc.) have high to very high margins and profits - WELL more than enough to make their software MUCH more secure if they wanted to, at a cost of only a minuscule fraction of their current profits. Nobody said anything about demanding "perfection"; that's a strawman or false dilemma (i.e. it's not "choose between 100% or no extra effort") --- the vast majority of the world's software security problems, and associated costs, could be drastically reduced with just
      • by Chemisor ( 97276 )
        > Some of the major software vendors (MS, Apple etc.) have high to very high margins
        > and profits - WELL more than enough to make their software MUCH more secure if they wanted to,

        So you now want to dictate to every company how much profit it can make? What a socialist attitude! Profit margins and quality are separate considerations. If you raise quality, you raise the price; that's how business works. If the company decides to spend more effort on security, it will raise the price. Not because it has t
        So you now want to dictate to every company how much profit it can make? What a socialist attitude!

          WTF!? Where did I say that? I'm a libertarian, you idiot, get some reading comprehension skills and read my post again. Seriously, are you just trolling, or are you really so thick that you are unable to comprehend basic English?

          I'm suggesting that FUCKING MARKET COMPETITION bring down prices, you dolt.

          If you think that raising the quality always raises the price, you are completely clueless as to how even

          • by Chemisor ( 97276 )
            > or are you really so thick that you are unable to comprehend basic English?
            > I'm suggesting that FUCKING MARKET COMPETITION bring down prices, you dolt.

            I must have trouble comprehending your basic English, since I can't seem to find where you suggest that :)

            > If you think that raising the quality always raises the price, you are completely
            > clueless as to how even the basics of business work. Where do you think profits go?

            They go to:
            1. Pay a bonus to the CEO
            2. Hire consultants
            3. Restructure the busines
        So you now want to dictate to every company how much profit it can make? What a socialist attitude!

          Here, since you are having trouble reading, let me repeat my very own words in bold for you again:

          "I'm certainly not advocating legally mandating anything, I'm in favor of free markets, and free markets can 'solve' this problem if the markets become more informed and start demanding better"

          Now how you got from that to socialism and "dictating how much profits" companies make, only God knows. Perhaps you mea

    • the cost of software will rise even higher than it is today

      Actually the point of that book is that the "real" cost of software is already much higher than what you see as the price because of externalities. A lousy manufacturer that pollutes a river forces some of the costs of production onto the downstream inhabitants; similarly, a software vendor that sells insecure software pushes external costs down onto its own users (e.g. the cost of antivirus and anti-spyware software, downtime from virus infection

      • by Chemisor ( 97276 )
        > "real" cost of software is already much higher than what you see as the price because of externalities.

        People always prefer to pay the externalities over the base cost. For example, most stores offer extended warranties on the stuff they sell, and yet most people choose not to buy them, even though the extra liability thus purchased would offset future costs of repair or replacement. I, for one, prefer cheap software that breaks occasionally to expensive software that never breaks, since the perceived
  • by Anonymous Coward on Monday January 21, 2008 @04:59PM (#22130510)
    To get a loan for my $50,000 PC which requires $300/month insurance to operate.

    I hope to pass my operators test so I can get my license.
  • what about OSS? (Score:5, Interesting)

    by quest(answer)ion ( 894426 ) <adminNO@SPAMmindofmetal.net> on Monday January 21, 2008 @05:00PM (#22130514)
    what, a whole book review on software development, and not a single mention of open source? how did this make it onto slashdot?

    OSS cracks aside, it would be nice to see if the book talks about that side of things at all; the impression i got from the review is that there's not much distinction drawn between software licensing and development models, and that it's all sorta lumped in together.

    so if, as the book seems to suggest, software development were regulated more closely, who would be accountable, or audited, or whatever, for an OSS project with heavy community involvement that's seeing commercial applications? or with an OSS project that gets implemented as part of a for-profit piece of software?

    i'm curious, because i have less than zero experience in how this stuff actually works, but it seems like it would be a weird situation. anyone have any insight?
  • by nullchar ( 446050 ) on Monday January 21, 2008 @05:01PM (#22130532)

    The software vendors will know that their vacation from security is over.
    It would be nice if a book like this could change the software industry. But realistically, what industry will lobby their respective governments for this change? Obviously the established software companies will not advocate change. And, IMO, obviously the open-source community has little to gain with extra regulation and imposed cost on a Free and often voluntarily produced product.

    I say the market itself will solve the problems with software security. New companies or new software products will only replace existing ones if the new ones are better. And like the book mentions, "better" is often measured in features. However, if enough damage is done with the current software flaws, some of the new features will include better security.

    Example: Company A is sued by Customer B when Attacker C exploits a hole in Company A's software resulting in a financial loss for Customer B. Like the book mentions, Customer B usually has no legal grounds to sue. However, if this happens multiple times, Customer B may get wise and ensure proper contracts when entering new agreements.

    These contracts could be required by customers when dealing with both closed source and open source companies. Buying a support contract from Sun for MySQL _could_ include certain software security requirements. And if Sun does not support this service, a business opportunity exists for another company.
    • >But realistically, what industry will lobby their respective governments for this change?

      An industry full of big incumbents who can afford the overhead of a regulatory compliance department, an industry afraid of small fast-moving competitors, competitors who could be mired in tar and crushed by the burden of regulation.
  • Oh Christ! I hate made up words like that. They make me think of Reaganomics and those "FUN" days.
  • Bad software costs us 180 billion dollars a year? That would be about $600 per person in the US. Per year. I call bullshit. Unless you are going to claim that Mozilla is costing my family money because it allows me to waste time on /., this just doesn't make a shred of sense.
    • Re: (Score:3, Funny)

      by zotz ( 3951 )
      Dude! You need to take remedial Geekonomics! ~;-)

      all the best,

      drew
        • I dunno. I took four years' worth of the real kind, with a healthy dose of statistics and accounting along the way. It seems hyperbolic to me to claim that an average family of 4 people, with a median income of somewhere around $65k a year (US Census), is contributing $2400 of that to bad software. That's about 5% of after-tax income.
        • by zotz ( 3951 )
          I agree with you, hence my joke. Or attempt at a joke in any case...

          It really is a curvy sort of statement to make, isn't it?

          all the best,

          drew
    • Re:Hm-m-m-m... (Score:4, Interesting)

      by blahplusplus ( 757119 ) on Monday January 21, 2008 @05:45PM (#22130980)
      "... Bad software costs us 180 billion dollars a year? That would be about $600 per person in the US. Per year. I call bullshit."

      I disagree. Add up all the time spent re-installing Windows, cleaning PCs, deleting or countering spam, etc. I think they are right on target: spam, spyware, buffer over-runs, worrying about your popular website being hacked and extorted by criminals.

      A few points:

      1. Organized crime takes advantage and exploits / extorts companies (the kid who made the milliondollarhomepage was threatened with extortion).
      2. The capacity for economic espionage is quite large.
      3. Then there is 'just for kicks' aspect of causing havoc.
      4. Bad people who don't like us attack our networks/software/etc.
      5. Orwellian trojans (governments, criminals, or corporations of the world infecting your computer with rootkits; we already have one example: Sony).

      There are also corporations that are criminals, such as MediaDefender, which was hacked:

      http://blog.wired.com/27bstroke6/2008/01/interview-with.html [wired.com]
  • OK, so now that doctors are operating at near-zero profits, malpractice lawyers need a new profession to plunder. Wonderful.
    Luckily, it's much easier to switch out of software than out of medicine (given that one has invested 8+ years in a career in medicine). So the smart folks switch out, leaving the weaker folks to create more buggy programs. A race to the bottom!
  • The book tells of how Joseph Bazalgette, chief engineer of the city of London, used formal engineering practices in the mid-1800s to deal with the city's growing sewage problem.
    Why is it that any time someone talks about software engineering they always bring up bridge/house/skyscraper building? Yes, Joseph Bazalgette used "formal engineering practices" to build London's sewers, but where did these formal practices come from? Why yes, through trial and error. Thousands of years of trial and error. Use
  • OT: Drunk driving (Score:4, Insightful)

    by operagost ( 62405 ) on Monday January 21, 2008 @05:30PM (#22130848) Homepage Journal

    Now the bad news -- we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Brittany Spears' antics than the national diabetes epidemic.
    I love analogies, but I'm going to have to go way OT here and set you straight. In the USA, drunk driving is NOT tolerated. After years of onerous regulations, infringements on drivers' (and sometimes passengers') rights in the form of sobriety checkpoints, and ridiculously low BAC requirements (now commonly .08), we still have fatalities due to drunk driving.

    But this isn't because we don't care.

    Obviously, all those things I listed show that people do care; however, they are doing the wrong things to address the problem. We have allowed special interests like MADD, who are modern-day temperance societies, to dictate these changes to us with little review or oversight. It has been statistically proven [ct.gov] that fatalities do not decrease with a .08 BAC law, yet 15 states have passed such laws and MADD continues to pressure more. Sobriety checkpoints were begrudgingly allowed by the courts in the 1980s and 1990s to address the drunk driving "emergency"; but since judicial decisions don't have a sunset, and no one wants to challenge a policy that protects "the children", this infringement on our personal rights continues. The federal government infringed on states' rights in order to force the drinking age to 21 in the USA, even though Canada (with age limits of 18 and 19) has shown that drunk driving could be greatly reduced without infringing on the rights of young adults. Now MADD wants to require breathalyser interlocks in all new motor vehicles, ignoring the privacy rights, expense, and technological issues raised by such draconian policies. Think about how many miles passenger cars travel in a year, and decide in practical terms how many fatalities are acceptable. Think about other oppressive regulations you could impose if safety were truly paramount: reducing the speed limit to 25 MPH, requiring 15 MPH bumpers, requiring driver retesting annually, etc. Rationalizing these kinds of laws in absolute terms such as "for the children" and "if it saves one life" makes no sense, as we deal in statistics and weigh everything in the balance every day. Life is truly precious, but we live in an evil, dangerous world -- not a rubber room.

    Maybe we need to do more. But remember that there will always be people who insist on doing the wrong thing, and finding a way to do it.
    • Re: (Score:3, Insightful)

      It's worse than that - 20,000 alcohol-related deaths doesn't really mean anything. If anyone involved in an accident has measurable alcohol in their system, it's "alcohol related". If you're looking for the number of DUI-style fatalities, it's probably around 3000/yr, but we don't know, because nobody tracks that. But yeah, everything else you said is right - the 3000 deaths are committed by people who blow .15 or more and often have multiple DUIs - lowering the BAC limit only serves MADD's agenda, which is pr
    • by homer_s ( 799572 )
      Some economist once said that a good way to reduce vehicle fatalities would be to require that every new car have a sharp steel spike mounted on the steering wheel, inches from the driver's eyes. This would make sure that everyone drives slowly and carefully.

      I hope MADD doesn't hear this and think he was being serious.
    • by zsau ( 266209 )
      A BAC of .08 is ridiculously low? What would you consider reasonable, considering .08 is about as high as legal limits get, and is only really common in North and South America? Many countries have a zero BAC tolerance, and limits between .02 and .05 are easily more common than limits above .05.
  • The /. summary talks about three completely different things: regulation, licensing, and liability. Regulation seems completely nutty to me; legislators don't have the expertise to do it right, and if they do it wrong they could easily, e.g., strangle the open-source movement in the cradle. Licensing historically has had very little to do with safety or quality; in my state, IIRC, hairdressers are licensed, and it's basically a way of reducing competition in the labor pool so that the hairdressers who lobbi

  • Here's another take that argues against liability laws: http://lwn.net/Articles/247933/ [lwn.net]
  • Farming and manufacturing have waste and other write-offs. What kind of percentages are we talking about for various industries? 5%? 10%?
  • by Helevius ( 456392 ) on Monday January 21, 2008 @06:03PM (#22131204) Homepage
    This Amazon.com review mentions Mr. Rice's opinions on open source:

    Geekonomics reviewed by Richard Bejtlich [amazon.com]:

    As far as open source goes (ch 6), the author makes several statements which show he does not understand the open source world. First, on p 247 the author states "While a binary is easy for a computer to read, it is tremendously difficult for a person -- even the original developer -- to understand." This is absolutely false, and the misunderstandings continue in the same paragraph. Reverse engineering techniques can determine how binaries operate, even to the point that organizations like the Zeroday Emergency Response Team (ZERT) provide patches for Microsoft vulnerabilities without having access to source code!

    Second, on p 248 the author states "The essence of open source software is the exact opposite of proprietary software. Open source software is largely an innovation after-the-fact; that is, open source software builds upon an idea already in the marketplace that can be easily replicated or copied." On what planet?

    Third, on p 263 the author states "[O]pen source projects are almost always threatened by foreclosure," meaning if the developer loses interest the users are doomed. That claim totally misses the power of open source. When a proprietary software vendor stops coding a product, the customers are out of luck. When an open source software developer stops coding a product, the customers are NOT out of luck. They can 1) hope someone else continues the project; 2) try continuing the project themselves; or 3) hire someone else to continue developing the product. Finally, if the author is worried about open source projects not having an organization upon which liability could be enforced, he should consider the many vendors who sell open source software.


    David Rice responds on his blog [geekonomicsbook.com].
    • I don't have any moderation points to add to Helevius' karma, but I can send my thanks for posting an informative article.

      It's pretty clear to anyone paying attention that the fact that software vendors like Microsoft pay no price for security failures in their products means that they don't have much incentive to fix them (see Why are there still e-mail viruses? [bearcave.com]). So I was inclined to agree with what seemed to be the theme of Geekonomics. But from looking at the quotes above and the author's blog, it

    • by Khelder ( 34398 )
      Thanks a lot for the link to the author's reply on his blog. I now know I need no longer consider buying this book.

      His argument that it is economically bad to be able to continue using (and improving and fixing bugs in) a product someone else decided wasn't worth their while to continue working on is totally bizarre. I also think his terminology is unnecessarily biased. A developer choosing to work for company A instead of company B does not "rob" company B of their efforts.

      I'd be kind of interested
    • by renoX ( 11677 )
      Thanks for the post; the "answers" given on the blog are so uninteresting that they clearly show this book isn't worth my time or money.
  • So, this book blows the problem completely out of proportion, demonizes the producers and calls for more attention from the clueless masses?

    Well, let's just get ready to welcome presidential candidate David Rice!
  • by jvkjvk ( 102057 ) on Monday January 21, 2008 @07:29PM (#22132058)
    Well, that may be true. How much is good software going to cost us if everyone is liable for the code they write?

    There are three avenues I can see that a company or individual doing development in the US could take if this becomes law:

    1) Pay the costs to develop bug free software.
    2) Stop developing software.
    3) Move to a country with a less onerous position.

    Of the three, the only one that is actually not feasible is 1! Why, you might ask? Because the company must make a profit, and thus must sell the software for more than it cost to develop.

    Yes, the shuttle software has ~0 bugs. The cost of that has also been estimated at $1000 per LOC. Apache, for example, might have around 81852 lines of code... $81,852,000, which is not bad, considering! The Linux kernel (2.6) is ~5.2M LOC. Hmm, $5.2B??? Not to mention the glacial pace at which shuttle software moves. The pace Hurd is moving at would look like light speed compared to changes to any medium-to-large-sized codebase.

    But, you might say, what about people who give their software away for free? After all, I just used Apache and Linux as examples of what it might cost if commercially developed, but they were not! We could just get all that work for free. Free!

    Well, show of hands - who wants to give some software away for free and be liable for the results? Put something up as an individual and one lawsuit (even if wrongly brought) is enough to bankrupt you. I guess there is always posting anonymously, but I assume any distributor of the software would then be liable. How many projects on SourceForge would be available if either the contributors (non-anonymous) or SourceForge (for anonymous projects) were liable? Likewise, distributors such as RedHat could be held responsible not only for code they wrote but for what they distribute, if it was anonymous code.

    Then there are shared objects like libraries. Is it misuse of the library by the end developer that caused the issue, or a bug in the library itself? Or should this have been caught by the QA of the end developer? Are both liable? It could get very entertaining.

    So, we may be experiencing a $180B loss from bad software, but I happen to think that we might lose much, much more if software liability were a reality.

    Not that MS, IBM, Oracle, Apple, Adobe, RedHat, etc... would ever allow this to happen.

    Please note: Nothing in the above states that I'm for buggy software being written. I believe that we simply don't yet have the tools to liability-proof these types of products in a cheap, fast way. We can write good software. We can even write great software. But that one bug you didn't catch is the one they will sue you for.
  • I really hope this doesn't inspire some federal policy maker to require some sort of licensing to write code.

    I don't even have a bachelor's degree - but I'm the best programmer in an office full of CS graduates, by their own admission.

  • Now the bad news -- we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Brittany Spears' antics than the national diabetes epidemic. Expecting the general public or politicians to somehow get concerned about abstract software concepts such as command injection, path manipulation, race conditions, coding errors, and myriad other software security errors, is somewhat of a pipe dream.

    Thank God. The last thing we need is someone in Washington writing the "SQL Injection Elimination Act of 2008," or some such nonsense. Even when the government has good intentions, it screws things up. For example, you mentioned diabetes. The rise of Type II diabetes can be linked most readily to the rise of corn-based products in our food supply, especially corn syrup. What's the most heavily subsidized food? Corn. The government is actually paying people to make us unhealthy.
