The 10 CAG-egorically Wrong Ways to Introduce Standards

Posted February 20th, 2009 by

The Consensus Audit Guidelines (CAG) appear, at this point, to be a reasonable set of guidelines for mitigating some human threats. I’m looking forward to seeing what CAG offers and have no doubt there will be worthwhile and actionable controls in the document. That said, there are significant reasons to approach CAG with skepticism and to assess it critically.

The motivation for CAG is described in a set of slides at the Gilligan Group site. It starts with a focus on what CIOs fear most: attacks, reduced operational capability, public criticism, data loss, etc. Then it rightly questions whether FISMA is adequately addressing those problems. It doesn’t, and that is the genesis of the CAG.

Consensus photo by Eirik Newth.

Unfortunately, CAG then builds on this valid premise by pairing it with a set of false ones. These propositions are drawn from slides at gilligangroupinc.com, attributed to John Gilligan or Alan Paller:

  1. All that matters are attacks. The central tenet of Bush’s Comprehensive National Cybersecurity Initiative (CNCI) is adopted as the CAG theme: “Defense Must Be Informed by the Offense”. CAG envisions security as defense against penetration attacks. As any seasoned security practitioner knows, attacks are a limited subset of the threats to confidentiality, integrity, and availability that information and information systems face.
  2. Security through obscurity. CAG seems to have taken the unspoken CNCI theme to heart too: “The most effective security is not exposed to public criticism.” Since its very public December 11th announcement, no drafts have been made publicly available for comment.
  3. False dichotomy. CAG has been promoted as an alternative to the OMB/NIST approach to FISMA. It isn’t. A true alternative would target a fuller range of threats to information and information system security. CAG should be considered a complement to NIST guidance, an addendum of security controls focused on defense against penetration by hackers. NIST has even acted on this approach by incorporating some CAG controls into the SP 800-53 Rev. 3 catalog of controls.
  4. There is too much NIST guidance! This is the implication of one CAG slide that lists 1,200 pages of guidance, 15 FIPS documents, and the assorted Special Publications not related to FISMA as detriments to security. That’s like complaining that Wikipedia has too many articles to be useful for learning. Speaking as someone who scrambled to secure Federal systems before FISMA and NIST’s extensive guidance existed, having that documentation greatly improves my ability to secure systems efficiently and effectively.
  5. NIST guidance doesn’t tell me how to secure my systems! NIST’s FISMA guidance doesn’t step you through securing your SQL Server. The Chairman of the Joint Chiefs also doesn’t deliver your milk. Why not? It’s not their job. NIST’s FISMA guidance helps you to assess the risks to the system, decide how to secure it, secure it accordingly, check that a minimum of controls are in place and then accept responsibility for operating the system. NIST also provides documents, checklists, repositories, standards, working groups and validation of automated tools that help with the actual security implementation.
  6. Automated security controls negate human errors. If you accept the premise that all threats are attacks, this is nearly plausible. But not all security is technical, and not all threats come from the Internet. DHS, NIST, MITRE, and their partners have pursued automated enforcement and auditing of security controls for years, but automation can only go so far. Human errors, glitches, unexpected conflicts, and operational requirements will always factor into the implementation of security.
  7. Audit compatibility as a hallmark of good security. There is a conflict of focus at the heart of the CAG: it seeks both to improve its subset of security and to improve audit compatibility. For technical controls this is somewhat achievable using automation, something NIST has pursued for years with government and industry partners. For operational and management controls it results in audit checklists. But audits are fundamentally concerned with testing the particular and repeatable, while security requires evaluating the whole to ensure the necessary results. An audit sees whether antivirus software is installed; an evaluation sees whether the antivirus software is effective (a small sketch of that distinction follows this list).
  8. Metrics, but only these metrics over here. When the current crop of CAG controls was selected, decisions on what to include were reportedly based on metrics of the highest threats. Great idea: a quantitative approach often discovers counter-intuitive facts. Only the metrics were cherry-picked. Instead of looking at all realized threats or actual threat impacts, only a count of common penetration attacks was considered.
  9. With a sample of 1. As a basis for determining what security should focus on, the whole breadth of the security profession was queried, so long as they were penetration testers. Yes, penetration testers are some very smart and talented people, but penetration testing is to security what HUMINT is to intelligence services: important players and expert practitioners, but limited in scope and best used in conjunction with other intelligence assets.
  10. Assessments rely on paper artifacts. The NIST guidance does not require paper artifacts. The first line in the NIST SP 800-53A preface is, “Security control assessments are not about checklists, simple pass-fail results, or generating paperwork to pass inspections or audits-rather, security controls assessments are the principal vehicle used to verify that the implementers and operators of information systems are meeting their stated security goals and objectives.” NIST SP 800-37 specifically and repeatedly states, “Security accreditation packages can be submitted in either paper or electronic format.”
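
To make the audit-versus-evaluation distinction in item 7 concrete, here is a minimal, hypothetical Python sketch; the asset record, field names, and thresholds are all invented for illustration and are not drawn from CAG or NIST. An audit-style check only confirms the control is present, while an evaluation-style check asks whether it is actually working.

```python
from datetime import date, timedelta

# Hypothetical asset record; in practice this would come from an inventory
# or endpoint-management system.
asset = {
    "hostname": "fileserver01",
    "antivirus_installed": True,
    "signature_date": date(2009, 1, 2),
    "last_successful_scan": date(2008, 12, 15),
    "realtime_protection_enabled": False,
}

def audit_antivirus(asset):
    """Audit-style check: is the control present? Simple pass/fail."""
    return asset["antivirus_installed"]

def evaluate_antivirus(asset, today=date(2009, 2, 20)):
    """Evaluation-style check: is the control effective? Returns findings."""
    if not asset["antivirus_installed"]:
        return ["no antivirus installed"]
    findings = []
    if today - asset["signature_date"] > timedelta(days=7):
        findings.append("signatures more than a week old")
    if today - asset["last_successful_scan"] > timedelta(days=30):
        findings.append("no successful scan in the last 30 days")
    if not asset["realtime_protection_enabled"]:
        findings.append("real-time protection disabled")
    return findings

print("Audit:", "PASS" if audit_antivirus(asset) else "FAIL")
print("Evaluation findings:", evaluate_antivirus(asset))
```

The audit passes; the evaluation does not. That gap is the difference between checking a box and checking the security result.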

CAG is a missed opportunity. Addressing the myriad problems with our current FISMA regime could achieve a lot of good. The problems with the guidance have many causes, but they can be addressed through cooperative development of best practices outside of NIST. The Assessment Cases for SP 800-53A are an example of how cooperative development can achieve great results and provide clear guidance. Other problems exist and can be addressed with better training and community development.

My hope is that the Consensus Audit Guidelines will move towards a more open, collaborative development environment. The first release is sure to deliver useful security controls against penetration attacks. As with all good security practices it will likely need to go through a few iterations and lots of critical assessment to mature. An open environment would help foster a more complete consensus.

Consensus photo by mugley.



Posted in BSOFH, FISMA, Rants, Technical, What Doesn't Work, What Works | 9 Comments »

Everything I know about security, I learned from Ghostbusters…

Posted February 17th, 2009 by

(Well maybe not everything…)
I’ve been the de facto security officer at a government agency for going on two years now; it’s been quite a challenge. Without getting too deeply into how this happened (since I’m a contractor), I’d like to share some of the insights, horror stories, tips, and interesting anecdotes I’ve gathered over the past 22+ months.

If nothing else, many of my “preconceived notions” about managing an effective security program at a federal agency have been confirmed. Many others have been changed in ways I would never have suspected. I’m going to attempt to explain these in what I hope is an insightful, if not humorous way.

Ghostbusters works for me… At the time (1984), it was, hands-down, the funniest movie I had ever seen, and it left its mark. It sure beats “Dude, Where’s My Car?” for quotes that can be applied to security. But then, some may say I’ve either set the bar a bit low or that I need to expand my movie viewing habits. Hey, work with me on this one, people!!!

So, here are several quotes from the movie and their application to my philosophy on information security. I hope you enjoy it!


Ecto-1 photo by chad davis.

I’m from security, and I’m ready to believe you.
Listen. Foster discussion. Then draw upon your experience and make your decision. Do not enter into a discussion with a mandate (unless it comes from above). Mandates do not foster discussion, especially in areas where policy is absent or not so explicit. Most importantly, this is an invitation for the person you’re talking with to begin their side of the story.
Important Safety Tip: As the security professional, remember – this is the time for you to begin listening!

“Next time, if <someone> asks whether you’re a GOD, you say YES!”
Face it. Many of us security folks are humble. We all may even know what it is we don’t know. We might be a little gun-shy in our first few weeks on the job. However, don’t let your humility or shyness overcome you…

Like it or not, you are your organization’s security expert: “The Shell Answer Man,” the “Pro from Dover,” the “Go-to Guy/Gal.” While you may not have committed the processes contained within the IKE negotiation phases to memory, and may not be able to quote RFC 3514 off the top of your head, you probably DO know where to find the information… “I don’t know” should never roll off your lips.

When you’re hired as the subject matter expert on security, you need to be confident, whether you’re knocking a soft-toss question out of the park or, especially, when you’re telling folks that you’ll research the topic and get back to them. Come back with the facts, and your credibility will be strengthened.

Likewise, when you have reservations about a particular situation, let folks know why you’re not jumping on board their crazy train. Invite discussion. State your case plainly and propose solutions, or, if you can’t suggest an alternative, discuss it offline in another meeting focused on solutions. While your mission is to guard the organization’s interests, you can’t do so at the expense of the organization’s mission. Working closely with client service or engineering teams shows that security can be an integral part of solution development, not an impediment. Think of this as guiding others to the solution without telling them the “right” answer. This allows others to “own” the solution; their help may be valuable, if not necessary, in socializing a potentially contentious (or expensive) solution.

“Don’t cross the streams…”
I love this one. I get to use this at least twice a day while speaking to engineering, operations, management or other folks at my agency. It’s gotten so that people have heard it so many times, they’re using it. Best part is, they are using the phrase correctly!

So what does this mean exactly? Generally, the following things should never be directly connected to one another (a small sketch of checking a proposed connection against this list follows):

  • Classified and Unclassified Networks
  • The Internet and a Classified Network
  • Networks classified at different levels
  • Development, Test, and Production Networks/Environments
  • Accredited or trusted networks and less-trusted networks
  • Management and Production Networks
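
As a toy illustration only, here is a minimal sketch of treating that list as a machine-checkable policy; the zone names and forbidden pairs below are assumptions made up for this example, not anyone’s real architecture.

```python
# Hypothetical network zones; adjust to match your own environment.
FORBIDDEN_PAIRS = {
    frozenset({"classified", "unclassified"}),
    frozenset({"classified", "internet"}),
    frozenset({"development", "production"}),
    frozenset({"test", "production"}),
    frozenset({"management", "production"}),
    frozenset({"trusted", "untrusted"}),
}

def crosses_the_streams(zone_a: str, zone_b: str) -> bool:
    """Return True if directly connecting these two zones violates policy."""
    return frozenset({zone_a.lower(), zone_b.lower()}) in FORBIDDEN_PAIRS

# A proposed dev-to-prod link gets flagged for review instead of being
# quietly approved; a test-to-dev link does not.
for a, b in [("development", "production"), ("test", "development")]:
    verdict = "needs a waiver" if crosses_the_streams(a, b) else "normally allowed"
    print(f"{a} <-> {b}: {verdict}")
```

The point isn’t the code; it’s that “never connect these” is something you can write down and check before the cable gets plugged in, and anything that fails the check becomes a candidate for the waiver process discussed next.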

“Wait! I thought you said crossing the streams was BAD?!”
So, what does this Ghostbusters quote mean to us security folk?
Every policy, however rigidly enforced, needs a waiver process.

So what do I really mean? When you understand and can quantify the risk of a particular practice or a particular action, you can develop compensating controls to make otherwise unthinkable practices (e.g., connecting unclassified networks to classified networks) less risky. In this example, it can be done using one-way guard technology, or some other similar trusted, manual process.

Face it, jumping off a bridge can be dangerous, if not suicidal. However, when the jumper attaches themselves to a bungee cord or uses a parasail, the act of jumping off a bridge can be reduced from a Darwin-qualifying stunt to thrilling fun, or an awesome opening scene (like the first XXX movie, starring Vin Diesel as Xander Cage). It may not be for everyone, but given the right safety equipment, some of us might even consider taking the leap.

There’s an even better example. Let’s say your network security policy forbids use of USB memory devices. Anyone seen with one is given a stern talking-to, if not killed outright. Well, maybe not killed… the first time. Let’s say a virus or worm gets into your network. Hey – it happens. As a precautionary measure, your response to this type of incident requires you to sever your network connections to your business partners as well as the Internet. So… How do you get the new virus definition file and virus engine from your Platinum Support Provider and install it on your server? It just so happens that in this case, you downloaded a copy using your uninfected laptop via your home internet connection… onto a USB memory stick. So, how do you reconcile what needs to be done against your policy? Obviously, an exception to the policy needs to be made.

As a matter of fact, every organization needs a policy that allows exceptions to be made to existing policy. This may sound like doublespeak, and the above may not be the best example, but it certainly does illustrate the point.
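
Here is a minimal, hypothetical sketch of what such an exception policy might capture for each waiver; the fields and example values are assumptions for illustration, not any agency’s actual format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PolicyWaiver:
    """Hypothetical record for a time-boxed exception to policy."""
    policy: str                   # which policy is being waived
    justification: str            # why the exception is needed
    compensating_controls: list   # what reduces the residual risk
    approved_by: str              # who accepted the risk
    expires: date                 # waivers should never be open-ended

    def is_active(self, today: date) -> bool:
        return today <= self.expires

# Example: the USB scenario above, captured as a formal exception.
usb_waiver = PolicyWaiver(
    policy="No USB memory devices on the network",
    justification="Deliver AV signatures while external links are severed",
    compensating_controls=["dedicated, freshly formatted USB stick",
                           "two-person handling", "stick wiped after use"],
    approved_by="CISO",
    expires=date(2009, 2, 28),
)

print(usb_waiver.is_active(date(2009, 2, 20)))  # True while the incident lasts
```

The essential parts are the compensating controls, a named risk acceptor, and an expiration date, so the exception doesn’t quietly become the new policy.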

“What about the Twinkie?  Tell him about the Twinkie?!”
Never hide stuff from superiors. They don’t like surprises.
Never hide stuff from auditors. They have less of a sense of humor than your superiors.

“Human sacrifice, dogs and cats living together… MASS HYSTERIA.”
FUD doesn’t work. Don’t try it!

I hope these good-natured examples have gotten you to laugh (minimally), or possibly gotten the aspiring CISOs among you to think about how you might use humor in your day-to-day existence. I’d like to leave you with one more thought:
If you’re not having fun, you’re doing it wrong!

Cheers,
Vlad

FUD Fighter photo by cote.



Posted in BSOFH | 4 Comments »

The Authorizing Official and the Lolcat

Posted February 12th, 2009 by

Hot on the heels of our DAA presentation, the Guerilla CISO is proud to present our lolcat Authorizing Official.

Yes, it’s all about the budget. If you do not have a budget, you do not have the ability to change things. We have a word for people who have all the authority of an Authorizing Official but none of the budget: scapegoat.

[lolcat image]

And since I’m in Toronto for an extended stay thanks to the weather, today is a 2-fer:
[lolcat image]



Posted in IKANHAZFIZMA | No Comments »

The Accreditation Decision and the Authorizing Official

Posted February 10th, 2009 by

The accreditation decision is one of the key activities in how the US Government secures its systems. It’s also one of the most misunderstood. This slideshow aims to explain the role of the Authorizing Official and to give you some insight into why and how accreditation decisions are made.

I would like to give a big thanks to Joe Faraone and Graydon McKee who helped out.

The presentation is licensed under Creative Commons, so feel free to download it, email it, and use it in your own training.



Posted in FISMA, NIST, Risk Management, Speaking | 5 Comments »

Database Activity Monitoring for the Government

Posted November 11th, 2008 by

I’ve always wondered why I have yet to meet anyone in the Government using Database Activity Monitoring (DAM) solutions, even though the Government has some of the largest, most sensitive databases around.  I’m going to try to lay out why I think it’s a great idea for the Government to court the DAM vendors.

Volume of PII: The Government owns huge databases that are usually authoritative sources.  While the private sector laments the leaks of Social Security Numbers, let’s stop and think for a minute.  There is A database inside the Social Security Administration that holds everybody’s number and is THE database where SSNs are assigned.  DAM can help here by flagging queries that retrieve large sets of data.

Targeted Privacy Information:  Remember the news reports about people looking at the presidential candidates’ passport information?  Because of the depth of PII that the Government holds about any one individual, it provides a phenomenal opportunity for invasion of someone’s privacy.  DAM can help here by flagging VIPs and sending an alert anytime one of them is searched for. (DHS guys, there’s an opportunity for you to host the list under LoB)

Sensitive Information: Some Government databases come from classified sources.  If you were to look at all that information in aggregate, you could determine the classified version of events.  And then there are the classified databases themselves.  Think about Robert Hanssen attacking the Automated Case Support system at the FBI; a proper DAM implementation would have noticed the activity.  One interesting DAM rule here:  queries where the user is also the subject of the query.

Financial Data:  The Government moves huge amounts of money, well into the trillions of dollars.  We’re not just talking internal purchasing controls; it’s usually programs where the Government buys something or… I dunno… “loans” $700B to the financial industry to stay solvent.  All that data is stored in databases.

HR Data:  Being one of the largest employers in the world, the Government is sitting on one of the largest repositories of employee data anywhere.  That’s in a database; DAM can help.

Guys, DAM in the Government just makes sense.
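
To make those use cases concrete, here is a minimal, hypothetical sketch of the kinds of rules a DAM product evaluates against database audit events. The event fields, thresholds, and the VIP watch list are invented for illustration; any real product has its own rule language.

```python
# Hypothetical database audit event, e.g. pulled from native auditing
# or a network tap in front of the database server.
event = {
    "db_user": "jsmith",
    "subject_of_query": "jsmith",   # whose record was looked up
    "rows_returned": 250_000,
    "table": "ssn_master",
}

VIP_WATCHLIST = {"candidate_one", "senator_x"}  # made-up names
LARGE_RESULT_THRESHOLD = 10_000

def dam_alerts(event):
    """Return the list of rule names this event trips."""
    alerts = []
    if event["rows_returned"] > LARGE_RESULT_THRESHOLD:
        alerts.append("bulk-extraction: unusually large result set")
    if event["subject_of_query"] in VIP_WATCHLIST:
        alerts.append("vip-lookup: watch-listed individual queried")
    if event["db_user"] == event["subject_of_query"]:
        alerts.append("self-query: user queried their own record")
    return alerts

print(dam_alerts(event))  # trips the bulk-extraction and self-query rules
```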

 

Problems with the Government adopting/using DAM solutions:

DAM is not in the catalog of controls: I’ve mentioned this before; the double-edged nature of a catalog of controls is that it’s hard to justify any kind of security that isn’t explicitly stated in the catalog.

Newness of DAM:  If it’s new, I can’t justify it to my management and my auditors.  This will get fixed in time; let the hype cycle run itself out.

Historical DAM Customer Base:  It’s the “Look, I’m not a friggin’ bank” problem again.  DAM vendors don’t actively pursue or understand Government clients; they’re usually looking for customers who need help with SOX and PCI-DSS controls.

London is in Our Database photo by Roger Lancefield.



Posted in Rants, Risk Management, Technical, What Works | 2 Comments »

In Which Our Protagonist Discovers We Need More Good Public Policy People Who Understand Security

Posted November 4th, 2008 by

Note the emphasis on good.  Note the emphasis on public policy.

Yes, folks, we need good policy people.  Think about the state of security and public policy today:

  • We have FISMA, which is a law.  It’s everybody’s whipping boy, but it’s exactly where it needs to be to enable risk-based management of IT security.
  • We have a framework for implementing FISMA.  It’s a pretty good set of processes, policies, and standards that have spilled over into the private sector.
  • You need a crowbar to get good, smart security people to deal with politics; it takes a death ray to get them to deal with public policy.
  • We don’t have high-level policy-makers who understand risk management, so they co-opt the compliance model instead.
  • Public policy is the upstream neighbor of information security and what public policy people do influences what we do.
  • If we want to succeed in security at the operational and tactical level, we need to have the right decisions made at the strategic level, and that includes public policy.
  • I’m not just talking about security and the Government; this also applies to things like breach laws, compliance frameworks (PCI, HIPAA), and how unpatched, zombified desktops hurt everybody else.

So in true Guerilla CISO style, I’m doing something about it.  Armed with my favorite govie (who is actually the lead on this, I’m just a straphanger), The New School of Information Security (Hi Adam and Andrew), some government policy directives, and the National Strategy to Secure Cyberspace, I am teaching an Information Security Management and Public Policy class for Carnegie Mellon’s Heinz School.

The more I work with the Masters of Science in Public Policy Management program, the more I’m sold on it.  Basically, the students do a year on campus in Pittsburgh, then they have the option of staying there or coming to DC.  The students who come to DC work a 32-hour week (some do more), take 2 night classes, and have class for most of Friday.  Our information security class fits in as a sector-specific deep dive, the other one being healthcare (which needs smart public policy people, too).

Which is where we need some help.  It’s a little behind the game, but we’re constantly looking for Government agencies, NGOs/NPOs, and contractors who are interested in taking on interns.  Even better if you have jobs that don’t have a US citizenship requirement.  If you want to be linked up, just drop me a line.

And oh yeah, my blogging has slowed down because I’m working 2 new projects and traveling to Tennessee and teaching Thursday nights and my life just got way busy.  =)

 

Alexander Hamilton Statue photo by dbking.



Posted in The Guerilla CISO, What Works | No Comments »


