Geekonomics
Ben Rothke writes "First the good news — in a fascinating and timely new book Geekonomics: The
Real Cost of Insecure Software, David Rice clearly and systematically
shows how insecure software is a problem of epic proportions, both from an
economic and safety perspective. Currently, software buyers have very
little protection against insecure software and often the only recourse they
have is the replacement cost of the media. For too long, software
manufacturers have hidden behind a virtual shield that protects them from any
sort of liability, accountability or responsibility. Geekonomics
attempts to stop them and can be deemed the software equivalent of Unsafe at
Any Speed. That tome warned us against driving unsafe automobiles;
Geekonomics does the same for insecure software." Read on for Ben's take on this book.
Now the bad news — we live in a society that tolerates 20,000 annual
alcohol-related fatalities (40% of total traffic fatalities) and cares more
about Britney Spears' antics than the national diabetes epidemic.
Expecting the general public or politicians to somehow get concerned about
abstract software concepts such as command injection, path manipulation, race
conditions, coding errors, and myriad other software security errors, is
somewhat of a pipe dream.
Geekonomics: The Real Cost of Insecure Software
author | David Rice
pages | 362
publisher | Addison-Wesley
rating | 9
reviewer | Ben Rothke
ISBN | 978-0321477897
summary | How insecure software costs money and lives
Geekonomics is about the lack of consumer protection in the software market and how this impacts economic and national security. Author David Rice considers software consumers to be akin to the proverbial crash test dummy. This, combined with how little recourse consumers have for software-related errors and the lack of significant financial and legal liability for vendors, creates a scenario where computer security is failing.
Most books about software security tend to be about actual coding practices. Geekonomics focuses not on the code, but rather on how insecurely written software is an infrastructure problem and an economic issue. Geekonomics has three main themes. First — software is becoming the foundation of modern civilization. Second — software is not sufficiently engineered to fulfill the role of foundation. And third — economic, legal and regulatory incentives are needed to change the state of insecure software.
The book notes that bad software cost the US roughly $180 billion in 2007 alone (see Pete Lindstrom's take on that dollar figure). Not only that, the $180 billion might be on the low end, and the state of software security is getting worse, not better, according to the Software Engineering Institute. Additional research shows that 90% of security threats exploit known flaws in software, yet software manufacturers remain immune to almost all of the consequences of their poorly written software. Society tolerates 90% failure rates in software largely because it is unaware of the problem. The sheer volume of software flaws also entices attackers, who attempt to take advantage of those vulnerabilities.
The book's seven chapters are systematically written and provide a compelling case for the need for secure software. The book tells how Joseph Bazalgette, chief engineer of the city of London, used formal engineering practices in the mid-1800s to deal with the city's growing sewage problem. Cement was a crucial part of the project, and the book likens the development of secure software to that of cement, which can withstand decades of use and abuse.
One reason software has significant security vulnerabilities, as noted in chapter 2, is that software manufacturers are primarily focused on features, since each additional feature (whether it has real benefit or not) offers a compelling value proposition to the buyer. On the other side, the lack of software security functionality and controls imposes social costs on the rest of the populace.
Chapter 4 gets into the issues of oversight, standards, licensing and regulations. Other industries have lived under the watchful eyes of regulators (FAA, FDA, SEC, et al.) for decades. But software is written by unlicensed programmers, removed from oversight. Regulations exist primarily to guard the health, safety and welfare of the populace, in addition to the environment. Yet oversight of software programmers is almost nil, and this lack of oversight and immunity breeds irresponsibility. The book notes that software does not have to be perfect, but it must rise to the level of quality expected of something that serves as the foundation of an infrastructure. And the only way to remove the irresponsibility is to remove the immunity that the regulatory vacuum has created.
Chapter 5 gets into more detail about the need to impose liability on software manufacturers. The book's premise is that increased liability will lead to a decrease in software defects, will reward socially responsible software companies, and will redistribute the costs consumers have traditionally paid to protect software from exploitation, shifting them back to the software manufacturer, where they belong.
Since regulations and the like are likely years or decades away, chapter 7 notes that, short of litigation, contracts are the best legal leverage software buyers have to address software security problems. Unfortunately, most companies do not use this contractual option to the degree they should, even though doing so would benefit them.
Overall, Geekonomics is an excellent book that broaches a subject left uncharted for too long. The book does have its flaws, though; its analogies to physical security (bridges, cars, highways, etc.) and safety events don't always hold together with perfect logic. Also, the trite title may diminish the seriousness of the topic. As the book illustrates, insecure software kills people, and I am not sure a corny book title conveys the importance of the topic. But the book does bring to light significant topics about the state of software, from legal liability and licensing of computer programmers to consumers' rights and more, all of which are imperative.
It is clear that regulation of the software industry is inevitable, and it is doubtful that Congress will do it right whenever it eventually gets around to it. Geekonomics shows the effects that such lack of oversight has caused, and how beneficial it would have been had such oversight been there in the first place.
Readers of this review may get the impression that Geekonomics is a polemic against the software industry. To a degree it is, but the reality is that it is a two-way street. Software is built for people who buy certain features. To date, security has not been one of those top features. Geekonomics notes that software manufacturers have little to no incentive to build security into their products. Post-Geekonomics, let's hope that will change.
Geekonomics will create different feelings amongst different readers. The consumer may be angry and frustrated. The software vendors will know that their vacation from security is over. It's finally time for them to get to work on fixing the problem that Geekonomics has so eloquently written about.
Ben Rothke is a security consultant with BT INS and the author of Computer Security: 20 Things Every Employee Should Know.
You can purchase Geekonomics: The Real Cost of Insecure Software from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Factual Error! (Score:2, Funny)
Re:Factual Error! (Score:5, Funny)
Re:Factual Error! (Score:5, Funny)
Re:Factual Error! (Score:4, Funny)
But I'm just teasing anyhow, you can stalk all the pop starlets you want, I don't judge.
Re: (Score:2)
Except exposing poor editing and the resulting incredulity of anything said thereafter.
Re: (Score:2, Insightful)
you want the truth or pretty spelling?
Software is under the eyes of regulators (Score:5, Informative)
Re:Software is under the eyes of regulators (Score:4, Insightful)
The rest of software, like word processors, and spreadsheets, and music apps, doesn't need that kind of stringent oversight. A better analogy in such cases is to other mundane things: books, binders, pencils. Poorly designed binders and pencils can lead to lost productivity in the same way that poorly designed software can. Those who care will go for the higher-quality product (which may require more money, either in initial expenditure or in staff expertise). Again, errors in books can certainly lead to lost productivity, but is there really any need for more "book security" and "book oversight" and "book regulations" to make sure that the contents of books are robust and error-free?
I submit that such oversight is not really necessary (again, except in issues of health and physical safety). Most people can tolerate the occasional annoyances of breaking pencils, typos in books, and crashes in software. Ideally people should be educated about risk (e.g. don't put important documents in a flimsy box, put them in a safe; similarly, don't put important data in a low-security computer, get a properly administered server), so they can make informed choices. But more laws and regulation? Not necessary.
Re:Software is under the eyes of regulators (Score:5, Insightful)
Think bad repair manuals (Score:3, Insightful)
Imagine a repair manual for a gas stove that said "blow out pilot light, turn on gas, wait one hour, invite your friends over, and light a match." Sure, it might not steal credit card numbers but in the face of an ignorant and trusting user, it could prove fatal nonetheless.
Re:Software is under the eyes of regulators (Score:4, Interesting)
There are of course meat-space analogies for identity theft and data loss arising from faulty products (locks, paper shredders, photocopiers) or services (shipping errors, clerical errors, corruption). The point is not the analogy per se... the point is that faulty products and services in the real world lead to losses (of time, money, data, personal information, etc.) and to crime. We could reduce these losses by spending more money and effort on higher quality products and services, but there reaches a point where people just don't care anymore (either because they are ignoring the risk, or because the risk is low enough that it isn't worth the additional cost).
The same applies to software: we could make it much more robust, but is the added security worth the burden of more regulation, more overhead, and more money? In some cases, it is... but in many cases it really isn't. Software related to health, personal safety, and financial information should be regulated (in the same way that medicine and financial institutions are regulated). But over-riding laws mandating software security and software liability are not necessary. End-user education is overall more important (both to prevent real-world losses, and computer losses).
Re: (Score:2)
I've yet to see a flaw in a book steal my, or anyone else's, credit card number, or delete all my other books. Have you?
In addition there is book oversight and book regulation, in the form of existing negligence laws, advertising laws etc. Publish a textbook on bridge building that ha
Re: (Score:2)
If books did steal credit card numbers, whose fault would it be? The author, the publisher, the reader, or the banks that use credit card numbers as an authentication mechanism rather than Chip and PIN smart cards?
It is really easy to point fingers, but working out where the responsibility should lie is rather harder. I don't think that Microsoft,
Re: (Score:2)
It's way too easy to blame the initial inventors of the C language for not checking for buffer overflows, but that too is a mistake. They wrote the system in the 60s for machines that tod
Re: (Score:2)
Every successful attack against Chip and PIN to date has been against the transition arrangements to support legacy systems. While smartcards are not invulnerable (Paul Kocher's timing a
Re: (Score:2)
Anyway that's a rather sweeping statement.
While smartcards are not invulnerable (Paul Kocher's timing attacks etc.) they are more than sufficient to mitigate risk.
More Software Safety-Critical than You Think (Score:2)
Re: (Score:2)
But all desktop software is now identity-critical (Score:3, Insightful)
Yes, a security hole in a web browser won't directly cause loss of *life*. However, what it *can* do by allowing a trojan in is:
a) Drain all your life savings from your bank
b) Place illegal pornography on your computer, leadin
Re: (Score:2)
e) By installing a keylogger (if you're a telecommuter with a VPN, or if you reuse passwords between home and work systems), potentially gain access to internal proprietary corporate networks, with the ability to conduct industrial espionage or control enterprise automation systems or SCADA networks
Re:But all desktop software is now identity-critic (Score:2)
Re: (Score:2)
"Software written for most industries where human lives could conceivably be on the line IS under the watchful eyes of regulators."
I'm not so sure about that. Nobody checked Diebold's software, and now a whole bunch of people are dead in Iraq.
It varies by industry (Score:4, Informative)
Think FDA, FAA, NRC, etc.
Now, systems that are nominally non-critical but which in fact are used as infrastructure may be unregulated and subject to the very problems described in the book.
For example, most smart cell phones aren't engineered for ultra-security. If I am a terrorist and I know ACME Electric Company uses the Plinko 100(TM) cell phone to communicate with its field operators, I can hire some cyber-criminals to schedule an attack on their phones at the same time as I set off a few bombs that take out a few major transmission lines.
If ACME realized its phones were mission critical and used a hardened or at least fault-tolerant communications infrastructure, it would be a lot harder for me to knock out their communications when they need it most.
The problem isn't insecure computers per se. The problem is relying on them without understanding the risks and the consequences of failure.
Re: (Score:2, Insightful)
Should geeks really start shooting themselves in the foot over this? Should we really be screaming out: "Please fine me, jail me, and fire me because I wasn't writing code wit
Getting in ahead of the crowd... (Score:4, Funny)
"This book could be summed up in three words: 'don't use windows'"
I suppose that should be suffixed with some 'tard thing like "lol!!111!!1one"
Re: (Score:2)
Microsoft can afford to defend themselves against a few liability lawsuits. Can Linus?
Re: (Score:2)
A good coder can make windows secure. A slacker can ensure that SecureBSD is insecure.
This isn't a very useful way of looking at this, because "security" isn't a binary yes/no concept. This is akin to saying "all areas have crime, therefore all areas are equally risky to be in". Obviously that is false. In reality your underlying risk profile is effectively a probability (in matters of crime and computer security), and differs dramatically from one system / area to the next.
To continue the crime analogy,
Nothing's going to change (Score:5, Interesting)
And yet, as a former coder then vendor, I always found it hard to get people to pony up for better education for programmers, analysts, project managers, or better coding tools, exhaustive testing protocols, whatever.
Now as a consultant, I face the same struggle getting people to be serious about backups, redundancy/eliminating single points of failure...
As long as it's not their head on the block, even senior managers will most often favour commercial expediency over prudence. This in the face of many high-profile disasters that cost a lot more to put right than they would have cost to do properly.
It's a gamble (Score:2, Insightful)
If you are a greedy-bastard manager and you expect to be in your position for only a few years, all you care about is the failures that will come back to haunt you. You don't care if spending $5M now will save $1M in expenses over the next 5 years but save an additional $20M 10 years down the road. By then you and your greed
Go back and read _Free to Choose_... (Score:5, Insightful)
MOD PARENT UP (Score:2)
Re: (Score:2)
"No other industry can say "you can't sue us if the product you purchased from us does not do what it was intended to do""
And here's where the GPL comes in handy.
There's the difference - you don't have to purchase free/libre open-source software; if you want to be indemnified, you can buy a distro from RedHat, Novell, Oracle, and (soon) Sun.
A contract requires a "consideration", usually payment, in return for the "good" or "service". There is no contractual arrangement between someone who downloads a
Re: (Score:2)
To be h
Re: (Score:2)
"Leave law to the lawyers, kid."
Never. There are too many incompetent lawyers out there. Too many times I've had to fire them and finish the job myself (and I'm not the only one to notice this).
Funny how you mention doctors and lawyers - lawyers keep on trying to do the same thing, to polish their turd-like image, but the public rates them closer to used car salesmen than to doctors.
How about lawyers take responsibility for their bad advice for a change? Or better yet, lets replace them with softwa
Re: (Score:2)
This would be different from other regulations because it's totally impossible to make unregulated software unwritable or unrunnable. If legislators don't grasp that, then all bets are off, of course. The regulation would more likely come in the form of watching businesses over a certain size and making sure they used approved software where needed.
Under that kind of regulation, open-source could flourish as in many a
as the review says (Score:5, Insightful)
In my experience there is so much feature creep in software projects that there always seems to be that last feature that needs to get squeezed into the next release at the last moment, with no time left to test. "Let's just hope that 10k-line module works and is secure. Even if it's not, we can always release a SP after we have the product on the market." It has even gotten to the point where major software companies (MS comes to mind) have a concept of Zero Bug Bounce: the point where the rate of bugs getting fixed equals the rate of bugs being found. If there are no "major" bugs and you've reached ZBB, you ship. Now I can see you can't wait forever to ship, but there is an inherent acceptance of flaws in software that you won't see in, say, bridge building.
Re:as the review says (Score:5, Insightful)
Bridge building isn't really all that complex; there is a hell of a lot more going on in a software product of any real magnitude than in a bridge. Sure, there are a few things like wind you have to take into account, but there really aren't as many variables in bridge building as there are in software development.
In addition to that, software has to be exactly perfect; with a bridge you can just say "screw it, let's reinforce/add supports/whatever here, here, and there just to be safe" and you are good to go. (I know I am oversimplifying to some degree, but you see my point.) It is possible to give yourself a lot more room for error.
Re: (Score:3, Insightful)
(I know I am oversimplifying to some degree, but you see my point)
Have you ever stopped to wonder if you are actually over-complicating software design rather than over-simplifying the analogy?
Re:as the review says (Score:5, Insightful)
Another reason is that too many people are involved in building a bridge for the majority of bridges to be badly built. The engineers, the construction company, the foremen and the workers are all looking at the bridge. Are all these people qualified to catch an error? No, but enough of them will be qualified enough to catch an error that it's unlikely to be a problem. On the other hand, we have software, where there are lines of code that have never been seen by anyone but the original programmer.
Re:as the review says (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Design me a bridge which needs to be dropped in place across multiple places (which may be a rivulet, or the Grand Canyon, or the Bering Straits, or a bridge between Mt. Everest and Mt. Kiliminjaro), and just work out of the box. It needs to be capable of supporting any type of vehicle, including those
Re: (Score:2)
In building and designing a bridge, you're not going to have your boss walk in halfway through the construction and tell you that you need to use this new concrete that only comes from LargeHard(c). You're not going to build the bridge so that you can take it from a two lane bicycle bridge to a 12 lane, double decker toll bridge with a minimum of work. You're never going to have someone walk over the bridge and promptly say, "sorry, this river is actually 50 feet wider, and I don't like the color, can you change that?"
And the question you should be asking yourself is why civil engineering does not have radical and constant requirements changes through the entire design process. The answer is: often they actually do, though not to the extent that some software projects do; and also they tend to have contracts that rule out last-minute silliness -- or at the least make it rather expensive. Software developers bring it on themselves to some extent by simply accepting all these changes. If someone comes to you with a last m
Re: (Score:2)
Still, if you want to build a bridge and insure yourself against bad design, you can often take the "brick shit house" approach and just overdesign it with only an increase in materials cost. This would be roughly analogous to just using more memory or storage in software-land.
The I-35 bridge collapse was a design flaw, so bridges aren't immune to "coding problems", either.
Re: (Score:2)
My sister-in-law's brother worked for a very large state agency, and she left in agony over such bureaucracy, project mismanagement, and more.
Then he is wrong
Re: (Score:2)
Re:as the review says (Score:5, Interesting)
Designing a serious bridge is a LOT more difficult than 90% of software projects out there. You have a base you can build on of tried and true designs, but from scratch, it's not very easy.
I say this as someone who works in computer administration, programming and database work professionally, but I got a minor in Engineering. I know what goes into it.
Re: (Score:2)
Re: (Score:2)
Designing a serious bridge is a LOT more difficult than 90% of software projects out there. You have a base you can build on of tried and true designs, but from scratch, it's not very easy.
This was also my thought. People expect a lot from an industry that has only been around for about 70 years. If we had the history of bridges, with all its successes, failures, and the practical designs that came from them, maybe programming would be in a better state. But I think we're at the point, right now, where we're just starting to build the equivalent of bridges that are vital to major traffic, but still haven't formalized the rules for how best to do that. This is also why I somewhat fear regula
Re: (Score:2)
When you build the bridge, you know how long it is and how many cars/trucks it will need to support.
To take the bridge analogy to software. You start out building a bridge over the Mississippi, that will handle 10,000 cars an hour or somesuch. When you're done the client tries to place your bridge across the English Channel, and land 747's on top of it.
It's all about the requirements.
Re: (Score:2)
Re: (Score:2)
Fact is, we have a distinct science/engineering/craft/whateveryoucallit here. Analogies are pointless.
Re: (Score:2)
A comparison that is adequate to your statement would be to indicate "Hello World" is an application and sticking a popsicle stick over a rain ru
Re: (Score:2)
But the point is, software and civil engineering are such different disciplines that analogies are useless; their complexities are of different natures. Requirements for a bridge can't change drastically after its construction has begun. Big new features aren't asked to be implemented after it has been
Re: (Score:2)
Bridge building (Score:2)
Yeah, what could possibly go wrong? [google.com]
Re: (Score:2)
Bridge building isnt really all that complex, there is a hell of a lot more going on in a software product of any real magnitude than in a bridge
I'd really like to see someone prove this or even provide any evidence at all. A real bridge is a collection of thousands or even millions of parts. Each and every part is unique when considered in fine detail. The crystal structure of the metal, the exact surface detail, the exact overall shape, the stresses experienced during manufacture, the stresses experie
Re: (Score:2)
"Software is not sufficiently engineered to serve as a foundation" [for society] - I agree wholeheartedly. Things are getting better, but we still have very little idea whether what we code "works" or not, let alone is secure.
This is a complete load of crap. To use a car analogy, 50,000 people are killed on the roads every year. Traffic and accidents cost billions of dollars in man-hours every year, not even considering the environmental factor. I'd say our transportation system isn't "engineered to serve as a foundation." How many lives did software kill last year? I'm guessing it's pretty far in the black, in that it saved far more than it cost. How many dollars in man-hours did software cost companies last year? Well I'll tell y
Re: (Score:2)
Well, excuuuse me... (Score:5, Informative)
Not necessarily (Score:2)
Re: (Score:2)
> and profits - WELL more than enough to make their software MUCH more secure if they wanted to,
So you now want to dictate every company how much profits it can make? What a socialist attitude! Profit margins and quality are separate considerations. If you raise quality, you raise the price; that's how business works. If the company decides to spend more effort on security, it will raise the price. Not because it has t
Re: (Score:2)
So you now want to dictate every company how much profits it can make? What a socialist attitude!
WTF!? Where did I say that? I'm a libertarian, you idiot, get some reading comprehension skills and read my post again. Seriously, are you just trolling, or are you really so thick that you are unable to comprehend basic English?
I'm suggesting that FUCKING MARKET COMPETITION bring down prices, you dolt.
If you think that raising the quality always raises the price, you are completely clueless as to how even
Re: (Score:2)
> I'm suggesting that FUCKING MARKET COMPETITION bring down prices, you dolt.
I must have trouble comprehending your basic English, since I can't seem to find where you suggest that
> If you think that raising the quality always raises the price, you are completely
> clueless as to how even the basics of business work. Where do you think profits go?
They go to:
Re: (Score:2)
So you now want to dictate every company how much profits it can make? What a socialist attitude!
Here, since you are having trouble reading, let me repeat my very own words in bold for you again:
"I'm certainly not advocating legally mandating anything, I'm in favor of free markets, and free markets can 'solve' this problem if the markets become more informed and start demanding better"
Now how you got from that to socialism and "dictating how much profits" companies make, only God knows. Perhaps you mea
Re: (Score:2)
the cost of software will rise even higher than it is today
Actually the point of that book is that the "real" cost of software is already much higher than what you see as the price because of externalities. A lousy manufacturer that pollutes a river forces some of the costs of production onto the downstream inhabitants; similarly, a software vendor that sells insecure software pushes external costs down onto its own users (e.g. the cost of antivirus and anti-spyware software, downtime from virus infection
Re: (Score:2)
People always prefer to pay the externalities over the base cost. For example, most stores offer extended warranties on the stuff they sell, and yet most people choose not to buy them, even though the extra liability thus purchased would offset future costs of repair or replacement. I, for one, prefer cheap software that breaks occasionally to expensive software that never breaks, since the perceived
I can't wait (Score:3, Funny)
I hope to pass my operators test so I can get my license.
what about OSS? (Score:5, Interesting)
OSS cracks aside, it would be nice to see if the book talks about that side of things at all; the impression i got from the review is that there's not much distinction drawn between software licensing and development models, and that it's all sorta lumped in together.
so if, as the book seems to suggest, software development were regulated more closely, who would be accountable, or audited, or whatever, for an OSS project with heavy community involvement that's seeing commercial applications? or with an OSS project that gets implemented as part of a for-profit piece of software?
i'm curious, because i have less than zero experience in how this stuff actually works, but it seems like it would be a weird situation. anyone have any insight?
Who will advocate change? (Score:5, Interesting)
I say the market itself will solve the problems with software security. New companies or new software products will only replace existing ones if the new ones are better. And like the book mentions, "better" is often measured in features. However, if enough damage is done with the current software flaws, some of the new features will include better security.
Example: Company A is sued by Customer B when Attacker C exploits a hole in Company A's software resulting in a financial loss for Customer B. Like the book mentions, Customer B usually has no legal grounds to sue. However, if this happens multiple times, Customer B may get wise and ensure proper contracts when entering new agreements.
These contracts could be required by customers when dealing with both closed source and open source companies. Buying a support contract from Sun for MySQL _could_ include certain software security requirements. And if Sun does not support this service, a business opportunity exists for another company.
Re: (Score:2)
An industry full of big incumbents who can afford the overhead of a regulatory compliance department, an industry afraid of small fast-moving competitors, competitors who could be mired in tar and crushed by the burden of regulation.
Re: (Score:2, Insightful)
What-anomics? (Score:2)
Hm-m-m-m... (Score:2)
Re: (Score:3, Funny)
all the best,
drew
Re: (Score:2)
Re: (Score:2)
It really is a curvy sort of statement to make isn't it?
all the best,
drew
Re: (Score:2)
Re:Hm-m-m-m... (Score:4, Interesting)
I disagree. Add up all the time spent re-installing Windows, cleaning PCs, deleting or countering spam, etc., etc. I think they are right on target: spam, spyware, buffer overruns, worrying about your popular website being hacked and extorted by criminals.
A few points:
1. Organized crime takes advantage and exploits / extorts companies (the kid who made the milliondollarhomepage was threatened with extortion).
2. The capacity for economic espionage is quite large.
3. Then there is 'just for kicks' aspect of causing havoc.
4. Bad people who don't like us attack our networks/software/etc.
5. Orwellian trojans (i.e. governments, criminals, or corporations of the world infecting your computer with rootkits, i.e. we already have one example: Sony).
There are also corporations that are criminals, such as MediaDefender, which was hacked:
http://blog.wired.com/27bstroke6/2008/01/interview-with.html [wired.com]
The Next Medical Malpractice? (Score:2)
Luckily, it's much easier to switch out of software than out of medicine (given that one has invested 8+ years in a career in medicine). So the smart folks switch out, leaving the weaker folks to create more buggy programs. A race to the bottom!
My favorite example... (Score:2, Insightful)
Why is it that any time someone talks about software engineering, they always bring up bridge/house/skyscraper building? Yes, Joseph Bazalgette used "formal engineering practices" to build London's sewers, but where did those formal practices come from? Why yes, through trial and error. Thousands of years of trial and error. Use
OT: Drunk driving (Score:4, Insightful)
But this isn't because we don't care.
Obviously, all those things I listed show that people do care; however, they are doing the wrong things to address the problem. We have allowed special interests like MADD, which are modern-day temperance societies, to dictate these changes to us with little review or oversight. It has been statistically proven [ct.gov] that fatalities do not decrease with a
Maybe we need to do more. But remember that there will always be people who insist on doing the wrong thing, and finding a way to do it.
Re: (Score:3, Insightful)
Re: (Score:2)
I hope MADD doesn't hear this and think he was being serious.
Re: (Score:2)
Re: (Score:2)
regulation, licensing, liability; choices (Score:2)
The /. summary talks about three completely different things: regulation, licensing, and liability. Regulation seems completely nutty to me; legislators don't have the expertise to do it right, and if they do it wrong they could easily, e.g., strangle the open-source movement in the cradle. Licensing historically has had very little to do with safety or quality; in my state, IIRC, hairdressers are licensed, and it's basically a way of reducing competition in the labor pool so that the hairdressers who lobbi
Liability laws are insane -- another take (Score:2)
$180B compared to how much productivity (Score:2)
You will love Mr Rice's opinions on open source (Score:5, Informative)
Geekonomics reviewed by Richard Bejtlich [amazon.com]:
As far as open source goes (ch 6), the author makes several statements which show he does not understand the open source world. First, on p 247 the author states "While a binary is easy for a computer to read, it is tremendously difficult for a person -- even the original developer -- to understand." This is absolutely false, and the misunderstandings continue in the same paragraph. Reverse engineering techniques can determine how binaries operate, even to the point that organizations like the Zeroday Emergency Response Team (ZERT) provide patches for Microsoft vulnerabilities without having access to source code!
Second, on p 248 the author states "The essence of open source software is the exact opposite of proprietary software. Open source software is largely an innovation after-the-fact; that is, open source software builds upon an idea already in the marketplace that can be easily replicated or copied." On what planet?
Third, on p 263 the author states "[O]pen source projects are almost always threatened by foreclosure," meaning if the developer loses interest the users are doomed. That claim totally misses the power of open source. When a proprietary software vendor stops coding a product, the customers are out of luck. When an open source software developer stops coding a product, the customers are NOT out of luck. They can 1) hope someone else continues the project; 2) try continuing the project themselves; or 3) hire someone else to continue developing the product. Finally, if the author is worried about open source projects not having an organization upon which liability could be enforced, he should consider the many vendors who sell open source software.
David Rice responds on his blog [geekonomicsbook.com].
Thanks for the informative post (Score:2)
I don't have any moderation points to add to Helevius' karma, but I can offer my thanks for posting an informative article.
It's pretty clear to anyone paying attention that software vendors like Microsoft pay no price for security failures in their products, which means they don't have much incentive to fix them (see Why are there still e-mail viruses? [bearcave.com]). So I was inclined to agree with what seemed to be the theme of Geekonomics. But from looking at the quotes above and the author's blog, it
Re: (Score:2)
His argument that the ability to continue to use (and improve and fix bugs in) a product someone else decided wasn't worth their while to continue working on is an economically bad thing is totally bizarre. I also think his terminology is unnecessarily biased. A developer choosing to work for company A instead of company B does not "rob" company B of their efforts.
I'd be kind of interested
Re: (Score:2)
Software equivalent of Unsafe at Any Speed? (Score:2)
Well, let's just get ready to welcome presidential candidate David Rice!
Bad software costs $180B! (Score:4, Insightful)
There are three avenues I can see that a company or individual doing development in the US could take if this becomes law:
1) Pay the costs to develop bug free software.
2) Stop developing software.
3) Move to a country with a less onerous position.
Of the three, the only one that is actually not feasible is 1! Why, you might ask? Because the company must make a profit, and thus must sell the software for more than it cost to develop.
Yes, the shuttle software has ~0 bugs. The cost of that has also been estimated at $1000 per LOC. Apache, for example, might have around 81,852 lines of code: $81,852,000, which is not bad, considering! The Linux kernel (2.6) has ~5.2M LOC. Hmm, $5.2B??? Not to mention the glacial pace at which shuttle software moves. The pace Hurd is moving at would look like light speed compared to changes to any medium or large codebase.
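The back-of-envelope arithmetic above is easy to sanity-check. A throwaway sketch (the $1000/LOC shuttle-grade figure and both LOC counts are the commenter's rough estimates, not measured values):

```python
# Sanity-check the per-LOC cost estimates from the comment above.
# COST_PER_LOC and the line counts are the commenter's rough figures.
COST_PER_LOC = 1_000  # dollars per line, shuttle-grade estimate

projects = {
    "Apache httpd": 81_852,
    "Linux kernel 2.6": 5_200_000,
}

for name, loc in projects.items():
    cost = loc * COST_PER_LOC
    print(f"{name}: {loc:,} LOC -> ${cost:,}")
# Apache httpd: 81,852 LOC -> $81,852,000
# Linux kernel 2.6: 5,200,000 LOC -> $5,200,000,000
```

That is $5.2 billion for the kernel alone at shuttle-grade cost, which is the commenter's point: nobody is going to pay that for ordinary software.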
But, you might say, what about people who give their software away for free? After all, I just used Apache and linux as examples of what it might cost if commercially developed but they were not! We could just get all that work for free. Free!
Well, show of hands: who wants to give some software away for free and be liable for the results? Put something up as an individual, and one lawsuit (even if wrongly brought) is enough to bankrupt you. I guess there is always posting anonymously, but I assume any distributor of the software would then be liable. How many projects on SourceForge would be available if either the contributors (non-anonymous) or SourceForge (for anonymous projects) were liable? Likewise, distributors such as Red Hat could be held responsible not only for code they wrote but for anonymous code they distribute.
Then there are shared objects like libraries. Is it misuse of the library by the end developer that caused the issue, or a bug in the library itself? Or should this have been caught by the end developer's QA? Are both liable? It could get very entertaining.
So, we may be experiencing $180B loss for bad software, but I happen to think that we might lose much much more if software liability were a reality.
Not that MS, IBM, Oracle, Apple, Adobe, RedHat, etc... would ever allow this to happen.
Please note: nothing in the above says that I'm for buggy software being written. I believe we simply don't have the tools to liability-proof these types of products in a cheap, fast way. We can write good software. We can even write great software. But that one bug you didn't catch is the one they will sue you for.
Oh, dear god... (Score:2)
I don't even have a bachelor's degree - but I'm the best programmer in an office full of CS graduates, by their own admission.
Please Remain Calm (Score:2)
Now the bad news -- we live in a society that tolerates 20,000 annual alcohol-related fatalities (40% of total traffic fatalities) and cares more about Brittany Spears' antics than the national diabetes epidemic. Expecting the general public or politicians to somehow get concerned about abstract software concepts such as command injection, path manipulation, race conditions, coding errors, and myriad other software security errors, is somewhat of a pipe dream.
Thank God. The last thing we need is someone in Washington writing the "SQL Injection Elimination Act of 2008," or some such nonsense. Even when the government has good intentions, it screws things up. For example, you mentioned diabetes. The rise of Type II diabetes can be linked most readily to the rise of corn based products in our food supply, especially corn syrup. What's the most heavily subsidized food? Corn. The government is actually paying people to make us unhealthy.
Re: (Score:2)
TZ
Re: (Score:2)
There's just one reason why insecure software abounds: because doing it right is expensive, and few people want to pay. Those that do want it (aviation systems, nuclear reactors, etc) do pay for it, and do get it.
Re: (Score:2)
All hail the new Slashdot, just the same as the old Slashdot!