The Myths of Security 216

brothke writes "The Myths of Security: What the Computer Security Industry Doesn't Want You to Know is an interesting and thought-provoking book. Ultimately, the state of information security can be summed up in the book's final three sentences, in which John Viega writes that 'real, timely improvement is possible, but it requires people to care a lot more [about security] than they do. I'm not sure that's going to happen anytime soon. But I hope it does.'" Read on for the rest of Ben's review.
The Myths of Security: What the Computer Security Industry Doesn't Want You to Know
author John Viega
pages 260
publisher O'Reilly Media
rating 8
reviewer Ben Rothke
ISBN 978-0596523022
summary A contrarian provides an interesting look at the information security industry
The reality is that while security evangelists such as Viega write valuable books such as this, the message is for the most part falling on deaf ears. Most people don't understand computer security and its risks, and therefore place themselves and the systems they work on in danger. Malware finds computers to load onto, thanks in part to users who are oblivious to the many threats.

Much of the book is made up of Viega's often contrarian views of the security industry. With so much hype in the industry, his skeptical analysis shows that much of what many perceive as information security truths are in fact security myths.

From the title of the book, one might think that there is indeed a conspiracy in the computer security industry to keep users dumb and insecure. But as the author notes in chapter 45 — An Open Security Industry, the various players in the computer security industry all work in their own fiefdoms. This is especially true when it comes to anti-virus, with each vendor to a degree reinventing the anti-virus wheel. The chapter shows how badly sharing amongst these companies is needed. With that, the book's title of What the Computer Security Industry Doesn't Want You to Know is clearly meant to be provocative, not taken literally.

The book is made up of 48 chapters on various so-called myths. Most of the chapters are 2-3 pages in length and tackle each of these myths. The range of topics spans the entire security industry, from security technologies to issues, risks, and people.

While not every chapter is a myth per se, many are. Perhaps the most evocative of the security myths are chapter 10 — Four Minutes to Infection and chapter 22 — Do Antivirus Vendors Write Their Own Viruses?. But the bulk of the book is not about myths per se, but rather an overview of the state of information security, and why it is in such a state.

In chapter 16, The Cult of Schneier [full disclosure — Bruce Schneier and I work for the same company], Viega takes Schneier to task for the fact that many people still rely on his book Applied Cryptography, even though it has not been updated in over a decade. It is hardly fair to blame Schneier for that. While Viega admits that he holds Schneier in high esteem, the chapter reads as if the author is somehow jealous of Schneier's security rock star status.

Chapter 18 is on the topic of security snake oil, ironically a topic Schneier has long been at the forefront of exposing. The chapter gives readers the sage advice to do their homework on the security products they buy and to make sure they have at least a high-level understanding of the technical merits and drawbacks of the security product at hand. The problem, though, is that the vast majority of end users clearly don't have the technical wherewithal to do that. It is precisely that scenario that gives rise to far too many security snake-oil vendors.

Perhaps the best chapter in the book, and the one to likely get the most comments, is chapter 24 — Open Source Security: A Red Herring. Viega takes on Eric Raymond's theory of open source security that "given enough eyeballs, all bugs are shallow." Viega notes that a large challenge with security and open source is that a lot of the things that make for secure systems are not well defined. Viega closes with the argument that one can argue open versus closed source forever, but there isn't strong evidence to suggest that it is the right question to be asking in the first place.

Overall, The Myths of Security: What the Computer Security Industry Doesn't Want You to Know is a good introduction to information security. While well written and thought-provoking, the book may be too conceptual and unstructured for the average end user, and too basic for many experienced information security professionals. But for those who are interested, the book covers the entire gamut of information security, and the reader, whether security pro or novice, comes out much better informed.

While the author makes it clear that he works for McAfee, and at times takes the company to task, the book references McAfee far too many times. At times the book seems like an advertisement for the company.

Viega does give interesting and often entertaining overviews of what we often take for granted. Some of the book's arguments are debatable, but many more are a refreshing look at the dynamic information security industry. Viega has sat down and written his observations of what is going on. They are worth perusing, and the book is definitely worth reading.

Ben Rothke is the author of Computer Security: 20 Things Every Employee Should Know.

You can purchase The Myths of Security: What the Computer Security Industry Doesn't Want You to Know from amazon.com. Slashdot welcomes readers' book reviews — to see your own review here, read the book review guidelines, then visit the submission page.


This discussion has been archived. No new comments can be posted.

The Myths of Security

  • by oldspewey ( 1303305 ) on Monday August 31, 2009 @03:24PM (#29265021)
    Lots of friends and family - people who are otherwise thoughtful, intelligent, and clueful - simply don't think about security. That will always be the weak link. You can't "design around" the casual negligence of hundreds of millions of users.
  • Common Problem (Score:3, Insightful)

    by SilverHatHacker ( 1381259 ) on Monday August 31, 2009 @03:26PM (#29265057)
    Security is only one of many issues that could be vastly improved if people cared more than they currently do.
  • by castironpigeon ( 1056188 ) on Monday August 31, 2009 @03:31PM (#29265133)
    If the book can be summarized in those last three sentences is it really worth the read? I think /.ers will realize before turning the first page that even the most ridiculously complex security system can be thwarted by stickies posted to people's monitors.
  • I try to educate people carefully and non-confrontationally every chance I get. It's an uphill battle, but one I think is worth fighting.

  • by fuzzyfuzzyfungus ( 1223518 ) on Monday August 31, 2009 @03:36PM (#29265215) Journal
    You might well be able to, actually. You just can't preserve the user's freedom while doing so.
  • by mraudigy ( 1193551 ) on Monday August 31, 2009 @03:37PM (#29265221)
    The biggest problem and risk with computer security is ultimately the users. And, unfortunately, you just can't fix stupid...
  • by fuzzyfuzzyfungus ( 1223518 ) on Monday August 31, 2009 @03:46PM (#29265327) Journal
    On the minus side, while your car may be safe, having to get one of the keys replaced will make you feel like your wallet has been stolen. Obviously, that isn't intrinsic to the technology, a similar system could have been implemented as a cheap industry standard; but that moment of technological change(while it did increase security) also allowed the vendors to strengthen their positions.
  • It can protect you (Score:5, Insightful)

    by davidwr ( 791652 ) on Monday August 31, 2009 @03:55PM (#29265425) Homepage Journal

    If it raises the cost of hurting you to higher than the adversary is willing to spend, it protects you.

    The trick is knowing how much security is worth paying for.

    If the adversary is willing to spend $1000 to attack you, and you have to spend $100 a month to raise the cost of an attack to $1001, and if a successful attack will cost you $1 and the number of successful attacks will be 1 per decade because face it, you don't have much to offer, then it's not cost-effective. On the other hand, if an adversary is willing to spend the same $1000 and it will cost you the same $100 a month to make yourself too expensive to attack, but each breach will cost you $500 and there will be about 1 breach per month if you don't invest, then suddenly things look different.
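The parent's arithmetic can be sketched as a simple expected-loss comparison. This is a hypothetical model using the dollar figures from the comment above; the function name and structure are illustrative, not from the book.

```python
# Expected monthly cost of each choice: pay for defense, or absorb breaches.
# Assumes defense fully prices out the adversary (as in the parent's example).

def monthly_cost(defend, breach_cost, breaches_per_month, defense_cost=100):
    """Defense spend if defending; expected breach losses otherwise."""
    if defend:
        return defense_cost  # attacks priced out, no breach losses
    return breach_cost * breaches_per_month

# Scenario 1: $1 per breach, ~1 breach per decade (1/120 per month)
print(monthly_cost(True, 1, 1 / 120))   # 100
print(monthly_cost(False, 1, 1 / 120))  # ~0.008 -> defending is not worth it

# Scenario 2: $500 per breach, ~1 breach per month
print(monthly_cost(True, 500, 1))   # 100
print(monthly_cost(False, 500, 1))  # 500 -> defending is clearly worth it
```

The point of the sketch is that the same $100/month control is a bad buy in one scenario and an obvious win in the other; only the expected-loss side of the ledger changed.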

  • Re:Common Problem (Score:3, Insightful)

    by Chris Mattern ( 191822 ) on Monday August 31, 2009 @04:02PM (#29265519)

    The problem is that when computers get to that point, they won't do what you want, they'll do what *they* (and the people who made them) want.

  • Re:Common Problem (Score:4, Insightful)

    by bberens ( 965711 ) on Monday August 31, 2009 @04:04PM (#29265535)
    I'm sure I'll be modded down for this, but I don't see why a company or person SHOULD concern themselves more with security than they do currently. A simple cost/benefit analysis of what it actually entails to become "secure" shows that it's simply not worth it. It's the same math that goes into determining whether to do a vehicle recall and whether or not to install a home security system. If you look at it in those terms, you'll see we're dramatically over-spending on security.

    And yet... I'm often considered paranoid by my peers (IT and otherwise) with respect to my personal information.
  • by smartr ( 1035324 ) on Monday August 31, 2009 @04:05PM (#29265549)
    There's plenty of monetary incentive for math to come forth and reverse things. For all we know, P = NP and public key encryption is broken as a pure concept. But we don't, and no one is able to step up and take tons of money to prove one way or the other.
  • Re:Common Problem (Score:3, Insightful)

    by Meshach ( 578918 ) on Monday August 31, 2009 @04:10PM (#29265623)

    The problem is that when computers get to that point, they won't do what you want, they'll do what *they* (and the people who made them) want.

    I think that is one of the big hurdles for Linux adoption in mainstream society. People don't want an O(1) scheduler. They don't want nifty commands. They don't want to fiddle with things. They just want it to work with the least effort on their part.

  • by cusco ( 717999 ) <brian.bixby@gmail . c om> on Monday August 31, 2009 @04:20PM (#29265763)
    Wow, just imagine the uproar if M$ tried something like that. I can't think of a single Windows user who wishes that Microsoft controlled access to every piece of hardware or software that would ever plug into a Windows machine, or who would be happy to pay Microsoft for that right. All I can say is, "Wow".
  • by fuzzyfuzzyfungus ( 1223518 ) on Monday August 31, 2009 @04:29PM (#29265913) Journal
    I'm sure MS would never do that (directly) to Windows; but that is basically the XBox360.

    Now, getting people to cheer them for it is something that only one of the Steves can manage.
  • Re:Common Problem (Score:3, Insightful)

    by plopez ( 54068 ) on Monday August 31, 2009 @04:34PM (#29265995) Journal

    Part of the problem is building it in from the beginning. There is much more fun and/or marketing appeal in building in eye candy, supporting the latest games, multi-media capabilities, mobile device support, etc. than in designing in security.

    A vendor or kernel programmer group should design it in from the ground up. But there isn't really any money in it for vendors and few programmers think of it as fun. With the exception of these guys maybe http://www.openbsd.org/security.html [openbsd.org]

    So in other words, many people are dropping the ball for a variety of reasons, commercial interest, lack of skill or plain disinterest.

    Security should be "plug and play". The user shouldn't have to think about it at all, other than put in the correct key (physical or virtual). Which I think is also part of your point.

  • Re:Thanks! (Score:4, Insightful)

    by kevjava ( 259717 ) on Monday August 31, 2009 @04:40PM (#29266107)

    But, the Schneier chapter isn't meant to piss him off, I have no beef with him whatsoever. I just think the fanboys do the world a disservice by not thinking for themselves, especially when they draw from material that's a decade old.

    The thing is, you're not convincing me that the book is out of date. There is plenty of material on the Internet that is over a decade old and is still relatively current. I read the Cathedral and the Bazaar [catb.org] for the first time last month, and drew a good amount of benefit from its words, even if I'm not ready to swallow it whole. The Mythical Man Month [wikipedia.org] shed quite a bit of perspective on project management, a field in which our industry has fifty or so years of experience and yet still does terribly.

    The principles of cryptography are still the same today as they were in the days of the Roman Empire and the Caesar Cipher, with all the bits about Alice and Bob with Mallory in the middle. Our toys are much more advanced today, and their rate of advance continues to increase, but just what is it that makes our pulling of information from a 10+-year-old book harmful?

    I'm no Schneier "fanboy", and haven't actually read the book; I just genuinely want to know.
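The Caesar cipher the parent invokes can be shown in a few lines; this is a toy illustration of the substitution principle, not a secure cipher, and the function name is my own.

```python
def caesar(text, shift):
    """Shift each letter by `shift` positions, wrapping within the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation untouched
    return ''.join(out)

msg = "Attack at dawn"
enc = caesar(msg, 3)
print(enc)              # Dwwdfn dw gdzq
print(caesar(enc, -3))  # decrypting with the negative shift round-trips
```

The principle (a shared secret transforms plaintext into something an eavesdropper can't read) is exactly the one modern ciphers still rely on; only the strength of the transformation has changed.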

  • by quickOnTheUptake ( 1450889 ) on Monday August 31, 2009 @04:46PM (#29266197)
    Yes, but with the car you still have trust issues. As in, when I give my keys to the valet, I have to trust that he actually works for the hotel and isn't just going to go for a joyride when I step in the door. Or when I give my keys to a friend I have to trust that he has good judgment and at least basic driving skills.
    Many of the run-of-the-mill infections are based as much on misplaced trust ("I wanna see dancing bunnies") as they are on weaknesses in the system itself. And trust isn't something a computer can judge (although systems can reduce the number of times we need to trust, e.g., by using the principle of least privilege, centralized software distributions, etc). At the end of the day you will always have to choose between severely limiting what the user is able to do and opening the door to social engineering and user error.
  • by Forge ( 2456 ) <kevinforge@@@gmail...com> on Monday August 31, 2009 @04:59PM (#29266383) Homepage Journal

    There are no myths of security, just the myth of security itself. Modern computer security is based on the fact that there are algorithms that no one knows how to reverse quickly. Doesn't mean that they can't be reversed however...

    I disagree.

    There are many security myths that have made it into company policy etc...

    For instance, the idea that forcing all staff in a mid-sized to large company to update their passwords every month or two is somehow more secure than allowing them to keep the same password indefinitely.

    In practice, this causes them to use simpler passwords that just barely meet whatever limits are imposed (i.e., a single number and one capital letter) and to rotate through slight modifications of this weak password:
    Password#1
    Password#2
    Password#3

    Etc...

    Or worse yet. Some just write down the password in a place that's easy to find.
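The rotation pattern above is mechanical enough that even a trivial check can flag it. A sketch (hypothetical check, my own function name) that treats a "new" password differing from the old one in at most one position as a rotation:

```python
def trivially_rotated(old, new):
    """True if `new` has the same length as `old` and differs in at most one position."""
    if len(old) != len(new):
        return False
    return sum(a != b for a, b in zip(old, new)) <= 1

print(trivially_rotated("Password#1", "Password#2"))  # True: classic rotation
print(trivially_rotated("Password#2", "7k$Qz!mW4p"))  # False: genuinely new
```

An attacker who has seen one expired password can run the same kind of search in reverse, which is the commenter's point: forced rotation often shrinks the effective search space instead of growing it.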

    As for those Algorithms. Sure they can be broken. As long as you update them faster than the old ones are broken you should be fine. What bugs me though is when a single bug in an OS is exploited by a thousand different bits of malware and instead of fixing the bug we have a dozen antivirus vendors producing a detector for each of the thousand bits of malware.

  • by Gverig ( 691181 ) on Monday August 31, 2009 @05:17PM (#29266635)

    Your statement is itself a myth, one of many. Sure, there is no ABSOLUTE security, but nobody claims that. There is no absolute physical security either; with enough resources anything can be stolen and anybody can be killed. It's the understanding of how secure you are in any given situation, and of how to improve your chances of staying safe (in virtual or real worlds), that defines security, and surely that exists.

  • by binary paladin ( 684759 ) <binarypaladin&gmail,com> on Monday August 31, 2009 @11:06PM (#29269523)

    "What bugs me though is when a single bug in an OS is exploited by a thousand different bits of malware and instead of fixing the bug we have a dozen antivirus vendors producing a detector for each of the thousand bits of malware."

    Which in turn makes my machine run like it's running malware and requires an additional core just to handle all the "security" software I have installed.

  • Re:Thanks! (Score:2, Insightful)

    by Anonymous Coward on Tuesday September 01, 2009 @03:10AM (#29270973)

    This book would have been better off as a series of blog posts. At least then people wouldn't expect things like internal consistency.

    Seriously, was it intentional irony to publicly disclose what you consider to be a harmful vulnerability two chapters after your rant about how bad full disclosure is? Or did you just not proofread your own book?
