How to Cheat at Managing Information Security

Ben Rothke writes "Mark Osborne doesn't like auditors. In fact, after reading this book, one gets the feeling he despises them. Perhaps he should have titled this book 'How I learned to stop worrying and hate auditors'. Of course, that is not the main theme of How to Cheat at Managing Information Security, but Osborne never hides his feeling about auditors, which is not necessarily a bad thing. In fact, the auditor jokes start in the preface, and continue throughout the book." Read the rest of Ben's review.
How to Cheat at Managing Information Security
Author: Mark Osborne
Pages: 302
Publisher: Syngress
Rating: 8
Reviewer: Ben Rothke
ISBN: 1597491101
Summary: The adventures of an information security professional and his efforts to secure corporate networks


The subtitle of the book is 'Straight talk from the loud-fat-bloke who protected Buckingham Palace and ran KPMG's security practice'. Essentially, the book is Osborne's reminiscence of his years in information security, including the good, the bad, and, more often than not, the ugly.

The book is written for someone looking to develop an information security program, or strengthen an existing program, to ensure that all of the critical technology areas are covered.

The thirteen chapters of the book cover the main topics an information security manager needs to know to do the job. The author candidly notes that this is not the most comprehensive security book ever written, but it contains most of what a security manager needs to get the job done. He also observes that information security differs from other disciplines in that there are many good books on its individual, disconnected subjects; the real challenge is acquiring breadth of knowledge across those many areas and operating effectively across all of them.

Chapters 1 and 2 deal with the information security organization as a whole, and the need for information security policy. Chapter 1 details the various places in an organization where a security group can be situated, and describes the pros and cons of each scenario. Of the scenario that places information security under the head of audit, Osborne notes that 'if you have any sort of life, you don't want to spend it with the auditors, I promise you'.

Wherever the security group is placed in an organization, its ultimate success or failure is likely to be determined by its level of autonomy and independence. Unfortunately, in far too many organizations, information security is not given that liberty. It is often placed in a subservient role to groups with opposing interests. Any security group or security manager placed in such a situation should likely start working on their resume.

The scenario is described in 'Practical Unix and Internet Security' where author Professor Gene Spafford spells out Spaf's first principle of security administration. This principle states that 'if you have responsibility for security but have no authority to set rules or punish violators, your own role in the organization is to take the blame when something big goes wrong'. Spaf's principle is a cruel reality faced by many of those responsible for information security.

Between those chapters and a few more auditor jokes, Osborne makes the blatantly obvious observation that wherever possible, one should eradicate single points of failure. As a corollary, he notes that while trying to eliminate these failure points, companies will often build redundant systems, partly in the hope that the redundancy will simultaneously reduce performance bottlenecks. What these companies fail to realize is that the routers, firewalls and switches are not the bottleneck; the software application is.

Osborne plays the role of contrarian in chapter 8 when he asks why we need firewalls at all. He notes that if every database maker, operating system programmer and CRM/ERM vendor put as much effort into security as the firewall vendors do, there would be no need for firewalls. Furthermore, if each system administrator worked as hard on security as the typical firewall administrator does, and devoted as much time to hardening their servers and laptops, then centralized firewalls would likely not be needed. Given that this firewall-free reality is not arriving any time soon, chapter 8 provides a lot of good information on everything you need to know about firewalls.

Chapter 9 is about one of the most maligned security tools, the IDS. After an anecdote about a network manager who did not understand the fundamentals of how DHCP operates, and how Snort was used to debug the problem, Osborne offers a meaningful piece of security wisdom: an IDS can help any network or security person understand network traffic. These devices can even give you information on new attacks and how they can be mitigated. But for an IDS (or any security hardware or software, for that matter) to be truly useful, a security professional needs to understand their IT infrastructure, the mechanics of networks and applications, and the risks involved. Those who don't understand those three things will get only minimal benefit from these security technologies.

Overall, How to Cheat at Managing Information Security is an informative and often entertaining introduction to information security. Those who want a good overview of the core elements of the field, or who want to strengthen their existing knowledge base, will find it a valuable read."


You can purchase How to Cheat at Managing Information Security from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

How to Cheat at Managing Information Security

  • Osborne makes the blatently obvious observation that wherever possible,


    It is blatantly obvious that my remark on the survey about unneeded editors was correct.

    • spelled correclty.

      I believe it's Syngress (note the extra "s" at the end)

      http://www.syngress.com/ [syngress.com]

    • And what about the redundancy of "blatantly obvious"? "Blatant" means "obviously"; therefore, it says "obviously obvious" which is redundant.
      • by Nutria ( 679911 )
        And what about the redundancy of "blatantly obvious"? "Blatant" means "obviously"; therefore, it says "obviously obvious" which is redundant.

        And emphasises the point even more firmly.

      • by lucerin ( 952392 )
        thank you Dr. Pedantic and the previous posters who, it is blatantly obvious, are all members of the Anal Retentive Guild
        • by 3dr ( 169908 )
          thank you Dr. Pedantic and the previous posters who, it is blatantly obvious, are all members of the Anal Retentive Guild[sic]

          Corrected:
          thank you Dr. Pedantic and the previous posters who, it is blatantly obvious, are all members of the Anal-Retentive Guild

          Oh yes, I'm very serious.

  • by growse ( 928427 ) on Wednesday September 27, 2006 @03:01PM (#16219699) Homepage

    I'm not sure the comments about firewalls are accurate. Sure, if every software maker paid proper attention to security, then we'd be in a lot better position than we are now, but I'm not necessarily sure that building firewall-level security into every application is a good thing.

    For example, if I want to restrict the access to a particular service to a few ip addresses, I'm more likely to do this on my firewall than on the service myself. Sure, the people who make the service could include that functionality, but I like the separation of security out away from the application. I like the fact that I control all my access in one place, and not across hundreds of application-specific config files. I believe modern firewalls can do all sorts of clever things such as coping with DoS attacks, stateful examination of network traffic etc etc etc. Can you imagine what it would be like if every single service had that functionality built in, but implemented slightly differently and with slightly different types of weakness in each one? Think of the duplicated functionality and bloat!

    There's no such thing as software which is immune to malicious attack, but I like to keep my security weaknesses all in one place, and minimize them by buying my firewalls from a company that has a track record and experience in security issues, rather than a company that makes an ftp server for a living.
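
    To be concrete about the "one place" bit: the mental model is just a central rule table that every inbound connection gets checked against. A rough Python sketch of that idea (addresses and ports invented for illustration; obviously not how any real firewall is implemented):

        # Central allow-list: service port -> source prefixes allowed to reach it.
        # One table to review and audit, instead of hundreds of per-app config files.
        ALLOWED = {
            22:  ["10.0.0."],    # SSH only from a made-up internal admin net
            443: [""],           # HTTPS open to everyone ("" matches any address)
        }

        def permit(src_ip: str, dst_port: int) -> bool:
            """Default-deny: allow only if some prefix rule for the port matches."""
            prefixes = ALLOWED.get(dst_port, [])
            return any(src_ip.startswith(p) for p in prefixes)

        print(permit("10.0.0.7", 22))      # True  - admin host reaching SSH
        print(permit("203.0.113.9", 22))   # False - outsider hitting SSH is dropped
        print(permit("203.0.113.9", 443))  # True  - anyone may reach the web front end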

    • Defense in depth. (Score:5, Insightful)

      by khasim ( 1285 ) <brandioch.conner@gmail.com> on Wednesday September 27, 2006 @03:12PM (#16219871)
      The only reason I'd like to see decent firewalls on the workstations is for more "depth" to my security model.

      With a firewall, it is a single point that can be cracked. If that is your only security point, you'll be wide open if it is ever cracked. And "cracked" also includes "someone brings in an infected laptop".

      Now, on the workstation level ...
      #1. No services running that aren't absolutely necessary.

      #2. No open ports that aren't absolutely necessary.

      #3. Any open ports/running services will ONLY accept connections from my servers / admin workstations. Anything else is logged and I am alerted.

      Most of this can be accomplished with an IDS. I'd like the workstation firewalls AND the IDS. Having multiple checks is good. (and the firewall, you need the firewall)
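
      If it helps, rule #3 in code form is roughly this (addresses made up; a sketch of the policy, not a real host-firewall agent):

          # Workstation-level default deny with alerting, per point #3 above.
          TRUSTED = {"10.1.0.5", "10.1.0.6"}   # my servers / admin workstations (hypothetical)

          def accept(src_ip: str, dst_port: int, alert=print) -> bool:
              """Accept only trusted sources; log and alert on everything else."""
              if src_ip in TRUSTED:
                  return True
              alert(f"ALERT: blocked {src_ip} -> port {dst_port}")
              return False

          accept("10.1.0.5", 3389)        # allowed: admin workstation
          accept("192.168.7.23", 3389)    # dropped, and I get an alert
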
      • Re: (Score:2, Insightful)

        by growse ( 928427 )

        Of course, you're absolutely correct. Anyone who thinks that a single security device/solution will solve all their problems is barking. I was thinking from a more datacenter-oriented point of view, whereby I have lots of boxes, which may all only run a couple of services each (I have webservers, FTP servers, DNS servers, DB servers etc). The rules governing what data can go from A to B tend to get quite complex, and a centralised firewall solution to managing this is the most secure and maintainable.

        Of co

    • by Anonymous Coward
      I'm not sure the comments about firewalls are accurate. Sure, if every software maker paid proper attention to security, then we'd be in a lot better position than we are now, but I'm not necessarily sure that building firewall-level security into every application is a good thing.

      I'm not sure it's possible, let alone feasible.

      How is my application supposed to know how my network is configured? Do I have to let every application snoop into my network config? That doesn't sound secure to me...

      My firewall k
    • The point the author is trying to make is that if vendors spent the effort to ensure that their applications, databases, operating systems, servers, etc. are coded securely, then the need for a firewall goes away. He is not endorsing putting firewall-type security on every node. Plain and simple, firewalls have become the crutch for poor security within an infrastructure. We hide behind them, rather than address the inherent security issues at hand. If an application had a good security model, strong aut
      • by growse ( 928427 )
        A firewall really isn't redundant. At home, maybe, but what percentage of the firewall market do you think home users make up? Compared to corporates running huge datacenters?
        • by asdavis ( 24671 ) on Wednesday September 27, 2006 @04:38PM (#16221163) Homepage
          Ok, lets assume that there is a huge datacenter behind the firewall. What does the firewall do to protect the datacenter? Generally, you do not allow direct inward access from the Internet into a DC proper. Rather, you use a DMZ to host exposed nodes. So in the end, for the DC, the firewall is just a router. It allows traffic from select DMZ nodes to access hosts inside the network. That's really the function of a router. However, we often filter as well to ensure that only the minimum ports and services that are required are passed. Why do we do this? Because we are concerned that the DMZ nodes might get compromised and be used as a gateway into the environment to compromise nodes on the internal network. Why are we concerned about this? Because we have come to accept that the vendors of server platforms, operating systems, middleware, databases, etc ship fundamentally flawed products. They are buggy, exploitable and are not carefully coded to prevent compromise. We trust firewalls, because they are very carefully coded and great pains are taken to ensure that they cannot (generally) be compromised. That is the author's point. Let's spend the time ensuring that products are as well coded as the firewalls and we do not need a firewall. Is this likely to happen? Probably not, but it is a valid point.
          • Re: (Score:3, Interesting)

            by growse ( 928427 )

            I disagree. In a datacenter, you'll probably have each service you provide divided up into various 'cells'. Each one of these cells may connect to the outside world in some way, either through the internet, or some large MPLS cloud, or whatever. Each cell will probably be split up in a number of different ways, traditionally a core and a DMZ. You probably have some sort of management lan infrastructure behind the whole thing as well. You might also want to have some of the cells communicate with each other

            • Re: (Score:2, Insightful)

              by Xaria ( 630117 )
              You are STILL missing the point. If applications were written so they couldn't BE compromised, then it wouldn't be necessary to have firewalls. And there's always a way in - via the VPN that the systems administrators use if nothing else. Hack their workstation at home perhaps. Firewalls give a false sense of security. You *think* no one can get at your applications, so you get sloppy about other things such as IDS/Tripwire and so forth. I've seen it in my current workplace, and it's just the wrong attitude
              • by growse ( 928427 )

                We'll just have to agree to disagree. The problem with applications that can't be compromised is that any user of that software has to trust the people who made it. Or do a complete code audit which is a) expensive and b) impossible. If a company could completely audit every single piece of software it used to make sure it was completely secure, it would *still* use firewalls. Why? Because you can't necessarily trust the guy who's doing the auditing.

                Of course, you can't trust the people who make the firewa

                • by Nonesuch ( 90847 ) *
                  The way I deploy firewalls is as a component of "defense in depth", in part to ensure that one mistake or intentional act by one trusted individual cannot compromise the entire network. If you take that into account, then before your thought experiment could conclude that firewalls are not necessary, you have to postulate not only perfectly secure operating systems and protocols and applications, but also perfectly secure people.

                  At that point, suspension of disbelief goes out the window :)

                  • People should stop:

                    1. Buying faulty software.
                    2. Hiring incompetent system administrators.

                    The fact that both of the above are the norm rather than the exception
                    says something about the software industry that I don't like.
        • It depends on what you consider the "market". If you ignore the fact that every copy of Windows ships with a personal firewall, maybe, but more and more software companies are advertising to home users that "having a firewall makes you safe, because Windows doesn't."

          Of course, at this point, most people are behind a router anyways, which has a firewall...

          So yeah, if you look in terms of "specifically bought as firewalls", then yes, I'm sure the corporate sector wins out. But in general, if you consider
    • by Isao ( 153092 )
      Depends. You mentioned "I like the fact that I control all my access in one place...". That may be nice from a management perspective, but when the network behind the firewall becomes complex, the firewalls with a complex ruleset typically can't keep up with the load. Also, a firewall with several hundred (or thousand) rules can end up with rule conflicts in subtle ways, making rule integration time-consuming. Adding a separate firewall per subnet may be the answer, but then you end up with a distribute
    • I know that in our marketing-driven world it's hard to believe but I agree that strategically, firewalls aren't preferable. I cut my teeth in security at a major government-funded computing infrastructure site and the head of security there didn't believe in firewalls either. I was initially dubious but eventually was convinced. This book [amazon.com] touches on it I think.
    • "if I want to restrict the access to a particular service to a few ip addresses, I'm more likely to do this on my firewall than on the service myself."

      You win the "wrong tool for the job" award! Unless IP addresses follow your users and you have lots of anti-spoofing technology, you're biffin it, bud.
      • by growse ( 928427 )
        It was an example. If you've got a huge DoS attack coming from a large botnet, 80% of which is AOL, you can provide a good temporary measure of just blocking AOL's entire network. Sure, people can spoof their IP, but that's not the point. A firewall gives you more control over access to your network than the individual services on that network ever could.
    • I agree with your premise, but not your supporting arguments, which is a rather unusual state of affairs, I believe.

      While the "security as an onion" has been pretty well trodden into the ground, the principle is valid, and therefore, I agree that a perimeter firewall is a necessity. However, I maintain that your internal hosts should be firewalled/ACL'd as well.

      On networks that I administer, I build a firewall into every host I put on my network. My Linux boxes all run iptables to limit tr
      • by growse ( 928427 )
        I see where you're coming from - I don't think that just having one perimeter as a firewall is a good idea. I think it's an insane idea. I also think that relying on the quality of code in your applications alone for security as well is an insane idea, which was the point I was trying to make. The author was saying that you could throw away your firewalls if you have perfect code, and I was saying that throwing away anything that makes an intruder's life difficult is insane.
    • Microsoft's recent push into the consumer security market with its OneCare service, plus the upcoming release of its next-gen Windows Vista scheduled for the first quarter of next year, promises a load of security enhancements; however, it ships without outbound protection enabled. This will most probably be a repeat of the situation with the Windows XP built-in firewall, where many functionalities were missing.... The objective of a firewall is to provide comprehensive protection against threats, both i
    • Basically, a firewall is a barrier to keep destructive forces away from your property. In fact, that's why it's called a firewall: its job is similar to that of a physical firewall that keeps a fire from spreading from one area to the next. Firewalls use one or more of three methods to control traffic flowing in and out of the network: 1) Packet filtering - packets (small chunks of data) are analyzed against a set of filters. Packets that make it through the filters are sent to the requesting system and all others a
    • All these things you mention are symptoms and solutions to poor application design.

      Why do you want to limit a service to a few IPs? The service should securely challenge the identity of the user. An IP address is a poor identity mechanism.

      Why do you want to add functionality for DoS protection and stateful examination of network traffic to the application? The application should not be vulnerable to these things in the first place.

  • by advocate_one ( 662832 ) on Wednesday September 27, 2006 @03:02PM (#16219717)
    those who can, do... those who can't, teach... and those who can't teach, audit...
    • Re: (Score:1, Informative)

      Well, the consciousness of security has grown today, but still not many understand what it means and how far it should go. No one loves security, not even those who have it as their job... but most people (managers, system administrators and users alike) at some point in their careers feel that they'd better learn it, or at least try to understand it... and some who learn it also learn to teach it.
    • Good books, if we can get our hands on them... books on 'how to cheat my wife' or books on 'how to cheat tax'...
    • by Aussie ( 10167 )
      What is the definition of an auditor ?

      They are the people that go around after a battle and shoot the wounded.
      • Roles - Definitions (s/w):
        Project Manager - Person who thinks nine women can deliver a baby in one month.
        Developer - Person who thinks it will take 18 months to deliver a baby.
        Onsite Coordinator - One who thinks a single woman can deliver nine babies in one month.
        Client - The one who doesn't know why he wants a baby.
        Marketing Manager - Person who thinks he can deliver a baby even if no man and woman are available.
        Resource Optimization Team - Thinks they don't need a man or woman; they'll produ
  • Is anyone in a position to compare this book to the following?

    http://www.cl.cam.ac.uk/~rja14/book.html [cam.ac.uk]
    • by whizistic ( 33541 ) on Wednesday September 27, 2006 @03:24PM (#16220051) Homepage
      Yes. How to Cheat at Managing Information Security is to Security Engineering as reading about morse code is to designing a fiber optic network. Hope that helps.
      • Re: (Score:3, Funny)

        God, I wish I had mod points for this.

        Mod +4 Good Analogy
                70% Funny
                30% Good Analogy
      • by KC7JHO ( 919247 )
        Wow, that basic, eh? Fiber optics is all about flashing lights at the correct speed and duration down a long "tube", after all. Guess you could even send the data as Morse code if you really wanted to.
        And here I was thinking of actually taking a look at this book. Thanks for the heads up!
        And here I was thinking of actually taking a look at this book. Thanks for the heads up!
        • Actually, Senator Ted Stevens has released a good introduction to the basics [wikipedia.org] into the public domain. If you're having trouble with the link above, let me know and I'll see if I can send you an internet.

      • Maybe you are right, but I mostly prefer How to Cheat at Managing Information Security for its concepts of security, its non-technical principles and practices, and the basic information it provides about the technical details of many of the products... real products, not just theory.
    • Ross Anderson's Security Engineering is a great book. It is a classic.

      One of the five best infosec books ever.

      From what I have seen of this book, you can't compare the two.
    • by u38cg ( 607297 )
      Having just ploughed through it, yes. Security Engineering is a pretty powerful introduction to not just network security, but how to approach security at just about every level, from international politics, to commercial entities, to physical protection, internal policies, through hardware down to the nitty-gritty of how the bits can be moved around securely. Extremely densely referenced and incredibly wide ranging but never impenetrable. I'm not a computer scientist by any means, but I learnt an awful
  • by Billosaur ( 927319 ) * <wgrother AT optonline DOT net> on Wednesday September 27, 2006 @03:07PM (#16219805) Journal
    The scenario is described in 'Practical Unix and Internet Security' where author Professor Gene Spafford spells out Spaf's first principle of security administration. This principle states that 'if you have responsibility for security but have no authority to set rules or punish violators, your own role in the organization is to take the blame when something big goes wrong'. Spaf's principle is a cruel reality faced by many of those responsible for information security.

    This same principle applies to a great number of jobs in IT. If it's your job to create content for display on the Internet/intranet and you aren't given the proper access and tools to get the job done, the failure is often pinned on you, even though you're at the mercy of others. The same goes for bad project management: if a project is slow or fails, it's not because the project manager was an ignorant troll, but was in fact due to the "inability of programmers to meet their goals," even though the goals and timelines were unreasonable and ultimately futile.

    • Re: (Score:3, Insightful)

      by MeNeXT ( 200840 )
      I would advise you to run if you work at a company like the one you described. First and foremost, it shows that completely incompetent people are managing the business administration side. If they can blame one individual for all the security problems at the company, can you imagine what their financials look like?

      People need to work as a team and be evaluated as a team. If upper management accepts the scapegoat, then they probably created the problem in the first place. Otherwise they need to resolve the issues as a group.
    • by Karellen ( 104380 ) on Wednesday September 27, 2006 @05:08PM (#16221485) Homepage
      "Authority and responsibility must be equal - else a balancing takes place as surely as current flows between points of unequal potential. To permit irresponsible authority is to sow disaster; to hold a man responsible for anything he does not control is to behave with blind idiocy."

      -- Robert A. Heinlein, _Starship Troopers_
  • by King_TJ ( 85913 ) on Wednesday September 27, 2006 @03:11PM (#16219857) Journal
    This may be a little off-topic, but I can't help but feel that the job title of "information security specialist/officer/manager/etc." is generally bogus from the start, at least as it pertains to "end users" of technology.

    I'm *not* saying that we don't need or shouldn't respect people who make a point of studying information security. But rather, that these people are most effective when they're working to build security appliances, hardware, and software that will eventually be purchased by I.T. staff. Or perhaps, when they have a specific task related to tracking down fraud in a telecommunications environment.

    In most corporations, it seems like the person or people appointed as "information security" are really just getting paid to be the fall guy(s) if and when something goes wrong. They want someone to point a finger at. The "infosec specialists" I've run across rarely have very many useful computer skills to offer a business. Rather, they're mainly good at writing up policies and procedures they insist everyone should follow for "safe computing". They can go into great detail about why a particular update patch for a router or TCP/IP stack is important for preventing a theoretical attack - yet they can't even troubleshoot a single hardware failure due to bad RAM or a failing hard disk in a workstation.

    The "rank and file" I.T. staff and management probably have just about as good a track record of keeping a given computing enviroment reasonably "secure", as long as they're diligent about keeping things updated and patched, and following some common sense procedures. They may not know (or care!) about all the technical details of why a given patch is effective, but it doesn't end up making much difference.
    • Re: (Score:3, Informative)

      by Grimfaire ( 856043 )
      Then you really don't understand how a good secop person works, or you have only worked with bad ones. A good one will not only help write the policy (yes, help... it takes an entire IS staff and many others outside of that area to come up with a good Security Policy) but will audit and help fix everything throughout the network. Gone are the days when the security person was just the guy who managed the firewall. The real security IS people are in charge of the entire onion. Each layer of the network needs to be har
      • by King_TJ ( 85913 )
        Well, it may be quite true that I've "only worked with the bad ones" ... but I guess I question the "value" they really add to the typical business workplace.

        For starters, I'm of the opinion that the user *must* come first - so maybe I'm fundamentally at odds with the basic premise of their work. But from 15+ years of experience with computers and I.T., I've become convinced that usability MUST trump security, or else you've wasted your money. The trick is to make things as secure as possible, without cro
        • It's not a case of usability vs. security... it's a case of security needing to be designed into the system first. Then make it usable for the users. There is no longer any case where you can toss one or the other out. But basing everything on the user first is a sure way to ignore security, or to make securing an application/appliance/network/etc. harder if not impossible.

          I've been doing one part or another of IT for close to 30 years and I've seen it all. Even in today's age, I'm working with a program where the fir

          • by cmarkn ( 31706 )

            There is a very simple rule for figuring how secure a system needs to be: It should cost more to break into the system than the information it contains is worth.

            A lot of day to day information doesn't need much security because it will be obsolete at the end of the day. On the other end, there are secrets that are the entire basis of your business, and these have to have real security. An example of this would be the formula for Coca-Cola. There is no way it belongs on any networked computer, because the e
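
            In toy form, that rule is just an inequality (all numbers invented for illustration):

                # "Secure enough" = breaking in costs more than the data is worth.
                info_value  = 50_000    # rough value of the data to an attacker
                attack_cost = 80_000    # rough cost/effort for an attacker to get it
                secure_enough = attack_cost > info_value
                print(secure_enough)    # True: not worth an attacker's while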

        • Re: (Score:3, Insightful)

          by pmc ( 40532 )
          For starters, I'm of the opinion that the user *must* come first - so maybe I'm fundamentally at odds with the basic premise of their work.

          Yes, you are at odds with the basic premise. What comes first is the risk analysis. What are you trying to protect? What are the threats? Who are the agents attacking it? Are you trying to keep something confidential, are you trying to preserve integrity, or are you trying to maintain availability of the system?

          Then, when you know what you are trying to achieve, you can then
          • I agree with your first statement. Yes, a risk analysis is integral to setting up a new environment. But that should be a given. Any decent systems administrator is going to read up on the pros and cons of implementing a new package, or making a change to the network infrastructure. If the "word on the street" is, package A is really insecure or doesn't "play well" with package B without turning off some of the security features, then that's a huge red flag to avoid buying package A.

            This also ties into
            • by pmc ( 40532 )
              I simply feel that by the time you're talking about the "end users" of a computing environment, the users should come first.

              Nope - the assurance of the system comes first. You've seen what happens when users are put first - see Microsoft security flaws ad nauseam. Then you allow the users as much freedom as you think they should be allowed. I know that sounds arrogant but it's not really - every freedom you give users has to be counterbalanced by some other security measure. Too much freedom and you cannot s
          • I agree heartily about security policy being, in any rational scheme, a product of principled risk management. But it is worthwhile to observe that the principal risk involved is damage to business processes -- and that risk does not come only from intrusions, but also from the security policies themselves. To put it starkly: Security policies are no less potentially damaging to an enterprise than intrusions are. Both, in the worst case, have the ability to damage the enterprise fatally. While we genera
        • by RMH101 ( 636144 )

          "I'm of the opinion that the user *must* come first - so maybe I'm fundamentally at odds with the basic premise of their work. But from 15+ years of experience with computers and I.T., I've become convinced that usability MUST trump security, or else you've wasted your money. The trick is to make things as secure as possible, without crossing the line and damaging usability/usefulness of the environment."

          You're coming at this wrong. For large companies, yes, the users come first. But there are a great

    • by Soko ( 17987 )
      Lemme guess - you're a sysadmin, right?

      I agree that most security officers don't know jack about what patches are truly needed and why, just that they are needed to reduce the risk to business continuance. IOW, they are liaisons to the people who have other things to do besides deploy updates and who need to learn why sysadmins would ban those bloody sticky notes from any office with a PC in it.

      They are business people, not IT people, so stop treating them as such.

      Soko
    • by Nonesuch ( 90847 ) *
      King_TJ writes:

      yet they can't even troubleshoot a single hardware failure due to bad RAM or a failing hard disk in a workstation.

      Really? That's weird. Where I work, the only staff who actually have the skills and methodical nature to effectively troubleshoot problems are the "infosec specialists". The desktop support people, the network analyst, the system admins, they all just automatically say "It must be a firewall/IPS/AV/ACL problem" and don't bother to do any sort of fault isolation, or if that

  • Memories (Score:5, Interesting)

    by Aqua_boy17 ( 962670 ) on Wednesday September 27, 2006 @03:14PM (#16219917)
    Osborne never hides his feeling about auditors
    I had to smile when I read this as it took me back to my first financial audit years back. I nervously awaited our internal auditor who had a reputation for being completely ruthless in his approach and did not give a fig if heads rolled as a result of his findings. When he first met with me, he began with a story: "You know, we auditors are often compared to soldiers, and your brothers-in-arms in the field. The only difference with us is that we fix bayonets to our rifles, and go around stabbing our own troops while they lay wounded on the ground. Now, with that out of the way let's begin your audit." I suppose that would have made most people nervous, but I was charmed by his candor, and actually wound up getting along with him quite well in the end.

    I suppose my point in telling that story is that you can look at auditors in two different ways. Either they're there to help you, or they are there to get in your way and point fingers. I believe that most genuinely good auditors try to be more like the former and less like the latter, and you can learn a lot from them if you remain objective and cooperative. God help you if you get the other kind, though, as they are usually nothing but self-promoting, tattle-telling toadies.
    • by kfg ( 145172 ) *
      "You know, we auditors are often compared to soldiers, and your brothers-in-arms in the field. The only difference with us is that we fix bayonets to our rifles, and go around stabbing our own troops while they lay wounded on the ground."

      "Keeeeeeeeeeeeewl! Can ya do me a favor?"

      "Yeah, what's that?"

      "Go stab that bastard in the next cublicle over? He's got it coming."

      KFG
    • Re:Memories (Score:5, Interesting)

      by Darth Muffin ( 781947 ) on Wednesday September 27, 2006 @03:36PM (#16220219) Homepage
      That reminds me of my experience a few months ago. We were in for our Sarbanes-Oxley (SOX) audit. One of the policies to comply with SOX is not to allow any non-company machines on the network (finally! Been wanting that for years.).

      Of course the first thing the auditors want to do is plug into our network so they can get their email. I said no, because if we do, it violates SOX and we fail their audit. They asked how they're supposed to audit us then if they can't use their e-mail? Not my problem, refer up to management :)

      I actually won this round. We ended up isolating a portion of the network so they could have access straight to the Internet.

      • Re:Memories (Score:5, Funny)

        by gosand ( 234100 ) on Wednesday September 27, 2006 @03:57PM (#16220563)
        Of course the first thing the auditors want to do is plug into our network so they can get their email. I said no, because if we do, it violates SOX and we fail their audit. They asked how they're supposed to audit us then if they can't use their e-mail? Not my problem, refer up to management :)

        We had a similar discussion with our auditors. It wasn't SOX, it was SAS70, but still a process audit. What I thought was hilarious was when I walked past the conference room that the 3 auditors had occupied, and there sat their 3 laptops, screens unlocked and nobody in the room. The urge to set their background image to goatse was almost overwhelming, but I thought better of it.

        • by RMH101 ( 636144 )
          I am loving this and the parent comment. Suspect if I started pointing this out to our auditors that we'd lose our licence to operate, but it's very tempting...!
        • As an IT auditor, I completely agree with you. In fact, if my colleagues forget to lock screens or are anal about their own procedures, you bet I let them know, as do most of my colleagues. You'll always have anal auditors and ones that don't know the stuff they should be auditing. That said, every single audit I have participated in has resulted in at least one or two major findings on things that any competent IT department should never have allowed in the first place. Sadly, I am convinced there still is a
      • They asked how they're supposed to audit us then if they can't use their e-mail?
        At that point I would have asked them what they needed to send via email and how they planned to secure it. This reminds me of a case I saw with a Big accounting firm where they wanted to post some unencrypted network security results on an Internet-facing web server...
    • I would have a little more compassion for auditors if they weren't fscking idiots. I'm fed up with the morons that PriceWaterhouseCoopers, Accenture, KPMG, IBM GS, etc. send us.
  • by Petersko ( 564140 ) on Wednesday September 27, 2006 @03:45PM (#16220379)
    Auditors are necessary because IT workers often can't be trusted.

    I'm not saying they are crooked... I'm saying a lot of them rebel against structure, employ "fly by the seat of the pants" methodology, refuse to participate in process tracking, avoid completing paperwork, and think that the "art" of their business means they should be able to do things their way.

    I think IT workers in general (probably including me) need to be watched like hawks. Otherwise we end up with broken chains of approval, unmaintainable code, and important things resting on the shoulders of "the guy in the room". You know, that guy who never provides status reports and vanishes for months at a time, emerging with a completed product that may or may not do what is intended.
    • by bzipitidoo ( 647217 ) <bzipitidoo@yahoo.com> on Wednesday September 27, 2006 @04:54PM (#16221327) Journal

      To the contrary. Necessary structure is good, and IT workers know that. A lot of managers are bad. They try to impose methodologies that do not fit the problems, and demand excessive structure and paperwork. They demand schedules, and then superficially alter them until what might have been a reasonable best guess is now a death march. They think things should be done "their" way, and get in the way. That is extremely irritating when they demonstrably don't know what the heck they're doing, and can't or won't see that. IT workers don't want to hand those guys any more rope than they must; they know it's only going to be used to hang everyone. What you see as rebelling against structure itself, I see as rebelling against the abuse of structure, and against those who think the "art" of management means they don't need to know anything about the technology or science, let alone the boring technical details. They only need to know how to make engineers be productive, avoid being blamed for problems, and get the credit.

      Of course a classic way to avoid blame is to make stupid rules and then point to the engineering geeks' supposed lack of discipline for not following those rules. Really great when there's a handy stereotype available. Most people aren't going to provide accurate status reports with useful content if the main use of it will be against them. I'd say not giving a status report at all is more honest than giving a status report that's nothing but evasions, fluff, misdirection, boilerplate, and such garbage.

    • Re: (Score:3, Interesting)

      by Panaflex ( 13191 )
      Well, I call bologna..

      I worked as a developer at a LARGE company - as their credit card server admin & developer (actually a small part of my other tasks).

      My first two years, auditors wanted to get copies of the credit card records - and I refused. I told them I could allow them access to review them at the location and they still wanted to have copies. Nope.

      I left the company - and less than a year later, well, you know the story. Auditor gets copies on laptop, laptop gets stolen. Big news story.

      Audi
    • by kfg ( 145172 ) *
      . . .that guy who never provides status reports and vanishes for months at a time, emerging with. . .

      . . .UNIX!

      KFG
    • Ever work in a small business IT Shop? Check, Check, Check, Check and yes...Check.
    • IMHO, the problem mostly lies in understaffing when it comes to keeping things running, updating old things, installing new things, helping users, recovering broken things, etc.... and then there is audit work...

      DCAA audits, SOX audits, DSS audits, quarterly internal audits for SOX, etc...

      You could meet all the paperwork, process, etc. for all of the above if there were 28 hours in a day, or a properly staffed IT department. Unfortunately, most places have neither.
    • I call BS. I am the administrator of two SOX controls for our workstation environment. One is a "Workstation Hardening" control (what a joke), and the other is some cluster about having to hit Ctrl-Alt-Delete to see if a user that doesn't exist on the machine or domain can log onto the computer (yes, I'm serious).

      I've tried to help the auditors understand that my two controls make absolutely no sense, which basically fell on deaf ears. From an IT person's perspective, auditors are there to try to tell us h
      • The comment from this poster about "IT workers often can't be trusted"... Well... I'm really disappointed that someone who claims to be an IT person would actually say that.

        I work in a SOx-affected company as well, although we are on the Canadian side. Every year we go through auditing.

        So yesterday I'm looking into an open problem issue, one that has to go through several channels, and that must be tracked. And you know what the UNIX guy has put for the sum total of information relating to his part
    • by dbIII ( 701233 )
      If you are in the situation where you are determining the process, you can't stick to it. The solution to this is to have areas where the rules apply, some sort of isolated development setup where almost no rules apply, and to make sure the rules are never tighter than they absolutely have to be. As an example, in a former career in engineering I had to throw out the carefully crafted and verbose rules devised by the QA guy: over fifty pages which even specified using Brand X sandpaper of a specific grit at
  • by xxxJonBoyxxx ( 565205 ) on Wednesday September 27, 2006 @03:46PM (#16220385)
    "...if you have responsibility for security but have no authority to set rules or punish violators, your own role in the organization is to take the blame when something big goes wrong."
    Take out "security" and fill in "hiring". Now you can probably see why the root problem is isolation of departments, not the fact that you can't taser people who don't change their passwords. Besides, who wants to listen to a on-high "security" department that says "do X or else" (often without explaining why "X" is bad). Learn to talk to people, use a few carrots and maybe fewer people (including your own employees) will think you suck.
  • by Anonymous Coward
    Save yourself $5.59 by buying the book here: How to Cheat at Managing Information Security [amazon.com]. And if you use the "secret" A9.com discount [amazon.com], you can save an extra 1.57%!
  • Security Bricks (Score:2, Informative)

    by Anonymous Coward
    This reminds me of a presentation I saw a while back from this guy Andrew Plato. He runs a security firm named Anitian (I think that is the right spelling.) He gives this hilarious presentation on all the stupid things companies and security vendors do. One of the funniest parts is at the end of his presentation. He does a "Ron Popeil" impersonation about the "Greatest Security Technology Ever Invented" while pointing to some mysterious item underneath a black sheet. He says something like "if you buy this
  • The aim of the book is to help develop an information security program, or strengthen an existing one, to ensure that all of the critical technology areas are covered. Nowadays, attacks on corporate information systems by hackers, viruses, worms, and the occasional disgruntled employee are increasing dramatically, forcing companies to invest more in information security. An information security manager should therefore watch for known vulnerabilities and continuously monitor their network products.
  • Security isn't just something you "turn on". Security is a mindset, a set of systems and practices that affect all aspects of your work environment. And implementing security practices--especially in an organization devoid of such--is a daunting task. Firewalls, Intrusion Detection Systems, and the like are only as good as the policies that govern them. The first step in implementing security is to define an information security policy.
  • The art of managing is through:
    1) Timely obtained
    2) Repeatable
    3) Effective
    4) Evaluate
    5) Systematic

    Information security is managed through:
    1) Monitoring
    2) Risk assessment
    3) Patching
    4) Tracking (asset)
    5) Coordination
  • then [reference.com]

    than [reference.com]

    At the very least, could book reviewers and submitters please learn the difference between these two words!
    • Re: (Score:2, Funny)

      Proper spelling/grammar on teh Interweb?! That's unpossible, you cant expect everyone to write that goodly!!!1!
  • Security is in the mindset... tight security can be created only by thinking like a criminal!! Get a hacker to invent your security software! That may help :)
    • Tight security is only feasible by thinking like a criminal. Pleasant drugs are criminal. Therefore, good security engineering is best done stoned?
  • In any good organisation, management plays a vital role in the development and sustainment of the organization in the modern era. In the software world, good management systems play an even more vital role. The book gives a good insight into MIS, and for those who are unaware of the inner workings of MIS, this book is a good start. I agree with the author's comment about systems administrators not doing much to protect personal computers. Coming from an organization that values MIS, problems like virus att
  • Who really cares about security? Where do our emails go before arriving in our inbox? Who is tracing through them, or scanning their contents? Are you telling me that no one out there is monitoring information? Is it as secure as they claim?
  • Today, most business leaders currently pay as little attention to the issue of information security as they once did to technology. But just as technology now stands higher on the chief executive officer's agenda and gets a lot of attention in annual corporate strategic-planning reviews, so too will information security increasingly demand the attention of the top team. In a networked world, when hackers steal proprietary information and damage data, the companies at risk can no longer afford to dismiss suc
  • Security isn't just something you "turn on". Security is a mindset, a set of systems and practices that affect all aspects of your work environment. And implementing security practices--especially in an organization devoid of such--is a daunting task. I found this to be an excellent book in that the author obviously understands security. He's dedicated his life to keeping privileged information safe. More importantly, this book is laid out in such a way that it will lead the uninitiated, newly appointed securi
    • I also agree with you, Siva. This book is very good and gives us an excellent and clear example of the benefits and advantages of security. Keep it up! :)
  • Really??
  • Security... today's prime worry. A naturalist perspective is needed for information security personnel: nature is the field we need to look at in order to fill those security gaps. Picture a spider web; isn't that a security web (firewall, scanning) which filters those fat bugs? So come on, let's get back to basics!
  • This book fits the bill for me!! And it is enjoyable. I have a number of other handbook-style books; one cost nearly six times more but was really a collection of articles written by a dozen different people (some with obviously conflicting views) bound under the same cover. What I liked: this book simply sets out the things I need to know about organisations, strategies and audits, then progresses into firewall design and security testing. And it is so funny - the cover is right, this man does make s
  • He makes some money by sharing his knowledge with us; we just need to thank him for the guidance by buying the book.
