Comments on the Annual OMB Security Report to Congress

Posted March 11th, 2009 by

While you were looking the other way, OMB released their Fiscal Year 2008 Report to Congress on Implementation of The Federal Information Security Management Act of 2002.  Mostly it’s just the verbatim responses from the agencies and a rollup of the numbers with scarcely any analysis.

It’s interesting to contrast this with last year’s report which had a huge chunk of analysis.  In my cynical hours, I like to mentally replace “analysis” with “spin”, but not today.  =)

Another interesting thing is that since they published the actual responses, you can get some analysis like Angela Gunn of BetaNews provides.

My opinion: metrics are good, raw data is better.

Government transparency in action?  Maybe.  New staffers at OMB? Also likely.

Another interesting and related article is this one from Federal Computer News on Government security metrics. Yes, they need to be reconsidered, but for the most part the existing metrics are aimed at the major provisions of FISMA the LAW, which is very high-level and very management-centric.  But hey, that’s what the law is supposed to provide; more on that later.




Posted in FISMA | No Comments »

FIPS and the Linux Kernel

Posted March 5th, 2009 by

Recently I was building a new kernel for my firewall and noticed an interesting new option in the Cryptographic API: “FIPS 200 compliance”.

You can imagine how very interesting and somewhat confusing this is to a stalwart FISMA practitioner. Reading through FIPS 200, it’s hard to find mention of cryptography, much less a technical specification that could be implemented in the Linux kernel. The FIPS 140, FIPS 197, FIPS 186, FIPS 46 and FIPS 180 standards would be natural fits in the Cryptographic API, but FIPS 200? The kernel help description didn’t clear things up:

CONFIG_CRYPTO_FIPS:

This options enables the fips boot option which is
required if you want to system to operate in a FIPS 200
certification. You should say no unless you know what
this is.

Symbol: CRYPTO_FIPS [=n]
Prompt: FIPS 200 compliance
Defined at crypto/Kconfig:24
Depends on: CRYPTO
Location:
-> Cryptographic API (CRYPTO [=y])
Selected by: CRYPTO_ANSI_CPRNG && CRYPTO

Given that examining the kernel code was a little beyond my ken and I couldn’t test to discover what it did, I turned to the third of the 800-53A assessment methods: interview. A little digging on kernel.org turned up the man behind this kernel magic, Neil Horman. He was able to shed some light on what is called the fips_enabled flag.

As it turns out the FIPS 200 compliance function wasn’t as exciting as I’d hoped but it does point to interesting future possibilities.

So what does it do? In the words of Neil Horman, it is a “flag for determining if we need to be operating in some fips_compliant mode (without regard to the specific criteria)”. This means it is sort of a placeholder for future developments so the kernel can operate in a mode that uses a FIPS 140-2 cryptographic module.

Did you notice the word that wasn’t included in the last paragraph? Validated. Yes, there are no validated cryptographic modules in the Linux upstream kernel. If you look at the kernel’s Cryptographic API you will find listed the “AES cipher algorithms” and “DES and Triple DES EDE cipher algorithms”. These may be compliant with FIPS standards but they are not validated.
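If you want to see what the stock Cryptographic API has registered on your own kernel, you can read /proc/crypto, where the kernel lists every algorithm it knows about. Here’s a minimal sketch in Python; it assumes the traditional blank-line-separated “field : value” layout of that file, and the algorithm names “aes” and “des3_ede” are my assumption for the registrations behind the “AES cipher algorithms” and “DES and Triple DES EDE cipher algorithms” menu entries.

#!/usr/bin/env python3
"""List the algorithm names registered with the kernel Cryptographic API."""
from pathlib import Path

def registered_algorithms(proc_crypto: str = "/proc/crypto") -> set:
    # /proc/crypto is a series of blocks such as:
    #   name         : aes
    #   driver       : aes-generic
    #   ...
    # separated by blank lines; only the "name" fields matter here.
    names = set()
    for line in Path(proc_crypto).read_text().splitlines():
        if line.startswith("name"):
            names.add(line.split(":", 1)[1].strip())
    return names

if __name__ == "__main__":
    algos = registered_algorithms()
    for wanted in ("aes", "des3_ede"):
        status = "registered" if wanted in algos else "not registered"
        print(wanted + ": " + status)

None of this says anything about validation, of course; it only shows what the kernel will offer to callers of the Cryptographic API.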

This begs the question: why have a FIPS 200 compliance flag if you can’t meet the FIPS 140-2 requirement? This is the interesting part. Let’s say a distro decides it wants to become very FISMA friendly and get its kernel’s FIPS 140-2 cryptographic module validated. Well, if the validation of the OpenSSL VCM is an apt example, the distro’s Linux kernel will need to operate in a FIPS-compliant mode to verifiably load the cryptographic module. So the inclusion of the fips_enabled flag enables future compliance.

Sadly, any single Linux distro getting its cryptographic module validated will not translate to the upstream kernel having a validated cryptographic module. If you look at the catalog of FIPS 140-2 VCMs, the modules are only validated for particular code versions and operating modes. As the upstream kernel won’t likely see the revisions made by the downstream distro to achieve the VCM until after the VCM is issued, it doesn’t inherit the validation.

Polyester Resin Kernel photo by  Marshall Astor – Food Pornographer.

Two possible scenarios were discussed with Neil to allow for upstream Linux kernel incorporation of a VCM.

The first scenario would be for the upstream kernel to take all the revisions made by the downstream distro to gain the VCM designation, and then go through the process to gain the VCM itself. Unfortunately, since the code is under constant revision and can’t be locked, the VCM would be invalidated as soon as a revision was committed to the code base. Only a particular build of the Linux kernel could claim to be validated.

The second scenario would be a revision to the Linux kernel that allowed the downstream Linux distro’s VCM to be loaded in place of the standard Linux Cryptographic API. When asked about this scenario, Neil had this to say:

“That said, theres no reason the crypto api couldn’t be ripped out and replaced with a different implementation, one that is maintained independently and its certification kept up. Of course, anyone so doing would need to keep up with the pace of kernel development, and that in turn brings the need for recertification, so its rather a lost effort in my opinion. I certainly wouldn’t recommend doing so, its just too much work.”

So the solution would either be short-lived and costly or long-lived and insecure.

Sadly this means that there is no easy way to include a FIPS 140-2 VCM within the upstream Linux kernel. But each distro can modify its Cryptographic API and validate a cryptographic module to allow for FIPS 200 compliance. With the FIPS 200 compliance flag now in the Linux kernel, it is possible for this to be verified. And that’s a happy thought for Federal Linux users.
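For what it’s worth, verifying the flag from userspace is straightforward. Here’s a minimal sketch in Python, assuming the kernel exposes the fips_enabled flag at /proc/sys/crypto/fips_enabled once it has been booted with the fips boot option (fips=1); that path is how later kernels surface the flag, so treat it as an assumption rather than something tested for this post.

#!/usr/bin/env python3
"""Report whether the running kernel claims to be in FIPS mode."""
from pathlib import Path

# Assumed location of the fips_enabled flag; absent on kernels built
# without CONFIG_CRYPTO_FIPS.
FIPS_FLAG = Path("/proc/sys/crypto/fips_enabled")

def fips_mode_enabled() -> bool:
    # The file holds "1" when the kernel was booted with fips=1 and
    # "0" otherwise.
    try:
        return FIPS_FLAG.read_text().strip() == "1"
    except FileNotFoundError:
        return False

if __name__ == "__main__":
    state = "enabled" if fips_mode_enabled() else "disabled"
    print("Kernel FIPS mode: " + state)

A quick check like this is the sort of thing an assessor could use as the “test” method to complement the interview above.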

My many thanks to Neil Horman, without whom I’d have nothing to write.




Posted in FISMA, Technical | No Comments »

The 10 CAG-egorically Wrong Ways to Introduce Standards

Posted February 20th, 2009 by

The Consensus Audit Guidelines (CAG) appear, at this point, to be a reasonable set of guidelines for mediating some human threats. I’m looking forward to seeing what CAG offers and have no doubt there will be worthwhile and actionable controls in the document. That said, there are significant reasons to approach CAG with skepticism and assess it critically.

The motivation for CAG is described in a set of slides at the Gilligan Group site. It starts with a focus on what CIOs fear most: attacks, reduced operational capability, public criticism, data loss, etc. Then it rightly questions whether FISMA is adequately addressing those problems. It doesn’t, and this is the genesis of the CAG.

Consensus photo by Eirik Newth.

Unfortunately CAG subsequently develops by pairing this first valid premise with a set of false premises.  These propositions are drawn from slides at gilligangroupinc.com, attributed to John Gilligan or Alan Paller:

  1. All that matters are attacks. The central tenet of Bush’s Comprehensive National Cyber Initiative (CNCI) is adopted as the CAG theme: “Defense Must Be Informed by the Offense”. CAG envisions security as defense against penetration attacks. As any seasoned security practitioner knows, attacks are a limited subset of the threats to confidentiality, integrity and availability that information and information systems face.
  2. Security through obscurity. CAG seems to have taken the unspoken CNCI theme to heart too, “The most effective security is not exposed to public criticism.” Since its very public December 11th announcement no drafts have been made publicly available for comment.
  3. False dichotomy. CAG has been promoted as an alternative to the OMB/NIST approach to FISMA. It isn’t. An alternative would target a fuller range of threats to information and information system security. CAG should be considered a complement to NIST guidance, an addendum of security controls focused on defense against penetration by hackers. NIST has even acted on this approach by including some CAG controls into the 800-53 Rev. 3 catalog of controls.
  4. There is too much NIST guidance! This is the implication of one CAG slide that lists 1200 pages of guidance, 15 FIPS docs and the assorted Special Publications not related to FISMA as detriments to security. It’s like complaining that Wikipedia has too many articles to contribute to improved learning. Speaking as someone who scrambled to secure Federal systems before FISMA and NIST’s extensive guidance, having that documentation greatly improves my ability to efficiently and effectively secure systems.
  5. NIST guidance doesn’t tell me how to secure my systems! NIST’s FISMA guidance doesn’t step you through securing your SQL Server. The Chairman of the Joint Chiefs also doesn’t deliver your milk. Why not? It’s not their job. NIST’s FISMA guidance helps you to assess the risks to the system, decide how to secure it, secure it accordingly, check that a minimum of controls are in place and then accept responsibility for operating the system. NIST also provides documents, checklists, repositories, standards, working groups and validation of automated tools that help with the actual security implementation.
  6. Automated security controls negate human errors. If all threats were attacks, this would be a nearly plausible premise. But not all security is technical. Not all threats come from the Internet. DHS, NIST, Mitre, and their partners have pursued automated security controls to enforce and audit security controls for years, but automated security controls can only go so far. Human errors, glitches, unexpected conflicts and operational requirements will always factor into the implementation of security.
  7. Audit compatibility as a hallmark of good security. There is a conflict of focus at the heart of the CAG: it seeks both to improve its subset of security and to improve audit compatibility. For technical controls this is somewhat achievable using automation, something NIST has pursued for years with government and industry partners. For operational and management controls it results in audit checklists. But audits are fundamentally concerned with testing the particular and repeatable, while security needs to focus on evaluating the whole to ensure the necessary results. An audit sees if antivirus software is installed; an evaluation sees if the antivirus software is effective.
  8. Metrics, but only these metrics over here. When selecting the current crop of CAG controls, decisions on what to include were reportedly based on metrics of the highest threats. Great idea: a quantitative approach often discovers counter-intuitive facts. Only, the metrics were cherry-picked. Instead of looking at all realized threats or real threat impacts, only a count of common penetration attacks was considered.
  9. With a sample of 1. As a basis for determining what security should focus on the whole breadth of the security profession was queried, so long as they were penetration testers. Yes, penetration testers are some very smart and talented people but penetration testing is to security what HUMINT is to intelligence services. Important players, expert practitioners but limited in scope and best used in conjunction with other intelligence assets.
  10. Assessments rely on paper artifacts. The NIST guidance does not require paper artifacts. The first line in the NIST SP 800-53A preface is, “Security control assessments are not about checklists, simple pass-fail results, or generating paperwork to pass inspections or audits-rather, security controls assessments are the principal vehicle used to verify that the implementers and operators of information systems are meeting their stated security goals and objectives.” NIST SP 800-37 specifically and repeatedly states, “Security accreditation packages can be submitted in either paper or electronic format.”

CAG is a missed opportunity. Addressing the myriad problems with our current FISMA regime could achieve a lot of good. The problems with guidance have many causes but can be addressed through cooperative development of best practices outside of NIST. The Assessment Cases for SP 800-53A are an example of how cooperative development can achieve great results and provide clear guidance. Other problems exist and can be addressed with better training and community development.

My hope is that the Consensus Audit Guidelines will move towards a more open, collaborative development environment. The first release is sure to deliver useful security controls against penetration attacks. As with all good security practices it will likely need to go through a few iterations and lots of critical assessment to mature. An open environment would help foster a more complete consensus.

Consensus photo by mugley.




Posted in BSOFH, FISMA, Rants, Technical, What Doesn't Work, What Works | 9 Comments »

The Accreditation Decision and the Authorizing Official

Posted February 10th, 2009 by

The accreditation decision is one of the key activities in how the US Government secures its systems. It’s also one of the most misunderstood. This slideshow aims to explain the role of the Authorizing Official and to give you some insight into why and how accreditation decisions are made.

I would like to give a big thanks to Joe Faraone and Graydon McKee who helped out.

The presentation is licensed under Creative Commons, so feel free to download it, email it, and use it in your own training.




Posted in FISMA, NIST, Risk Management, Speaking | 5 Comments »

Could the Titanic have changed course?

Posted January 6th, 2009 by

Rybolov really struck a note with me (as he usually does) with his blog entry on his decision that S.3474 was a bad thing. It reminds me of a conversation I had with a friend recently. Basically she asked me why bad things happen even after smart people put their heads together and try to deal with the problem before facing a crisis. Intrigued with her question, I asked her what specifically she was asking about. She shared that she had been thinking about the tragedy of the Titanic sinking.

Of course she was referring to the sinking of the passenger ship RMS Titanic on the evening of 14 April 1912. She made two points: first, that the experts had declared the ship “unsinkable”, so how could they be so wrong? Second, she wondered how the ship could be so poorly equipped with boats and safety equipment that there was such great loss of life.

The Titanic’s Disaster photo by bobster1985.

Little did she know that I have had an odd fascination with the Titanic disaster since childhood and have read much of the common public material about the event. So, I replied that no expert had ever declared her unsinkable; that was basically something made up by the press and the dark spineless things that hang around the press. However, I added that the designers and owners of the ship had made much of her advanced safety features when she was launched. A critical feature was the inclusion of water-tight bulkheads in her design, something of an advanced and novel feature at the time. What it meant was that you could poke a pretty big hole in the ship, and as long as the hole was not spread over several of these water-tight compartments, she would stay afloat. The problem was that the iceberg that she hit (the Titanic, not my friend) ignored all of this and tore a big gash along about a third of the length of the ship.

So, my friend pressed again about the lack of safety equipment, especially lifeboats. I told her that the problem here was that the Titanic did indeed meet all of the safety requirements of the time. A big part of the problem was that the safety requirements had been drafted in 1894, at a time of rapid changes in the size and design of ships of this kind. Those regulations indicated that all passenger ships over 10,000 tons required 16 lifeboats, and that’s how many the Titanic had. At the time the regulations were written there were hardly any ships over 10,000 tons in size. However, when Titanic was launched she was designed to be over 50,000 tons when fully loaded. The fact was that even fully loaded, the lifeboats could barely hold half of the passengers and crew of a full ship. What is worse, when the ship did sink, not all of the boats were usable because of the speed and angle at which the ship went down.

So, the bottom line was that when the Titanic was reviewed by the safety accountants, they took out their checklist and went over the ship with a fine-toothed comb. When the day was done, the ship fully met all the safety criteria and was certified as safe.

This is where I see the parallels between the root causes of the Titanic disaster and the odd situation we find ourselves in today in terms of IT security. Security by checklist, especially by out-of-date checklists, simply doesn’t work. Moreover, the entire mental framework that mixes up accounting practices and thinking with security discipline and research is an utter failure. Audits only uncover the most egregious security failures. And they uncover them at a point in time. The result is that audits can be gamed, and even ignored. On the other hand, formal reviews by experienced security professionals are rarely ignored. Sometimes not all of the resources are available to mitigate the vulnerabilities pointed out by the professionals. And sometimes there is debate about the validity of specific observations made by security professionals. But they are rarely ignored.

Interestingly enough, because of the mixed IT security record of many government agencies, Congress is proposing more audits! It seems to me what they should be considering is strengthening the management of IT security and moving from security audits, often performed by unqualified individuals and teams, toward security assessments conducted by security professionals. And since professionals would be conducting these proposed assessments, they should be required to comment on the seriousness of deficiencies and possible mitigation actions. They should also be required to report on the adequacy of funding, staffing and higher management support. I don’t really see any point in giving a security program a failing grade if the existing program is well managed but subverted and underfunded by the department’s leadership.




Posted in FISMA, NIST, Risk Management, The Guerilla CISO | 4 Comments »

In Other News, I’m Saying “Nyet” on S.3474

Posted December 15th, 2008 by

It’s probably a shocker to most people, but I’m going to recommend that S.3474 be amended or die on the Senate floor like Caesar.

I’ve spent many hours reading over S.3474.  I’ve read the press releases and articles about it.  I’ve had some very difficult conversations with my very smart friends.

I’ve come to the conclusion that S.3474 as written/proposed by Senators Carper and Lieberman is not the answer to information security in the Government that it has repeatedly been publicized as, and that anyone who believes the hype is in for a rude surprise next fall if the bill is passed and signed.

My thoughts on the matter:

  • S.3474 is not what it is being publicized as.  The people who write the press releases and the articles would have us believe that S.3474 is a rewrite of FISMA 2002 and that it focuses on continuous monitoring of the security of IT systems, both of which are a good thing.  First and foremost, it does not repeal FISMA 2002, and anyone saying that is simply trying to deceive you.  S.3474 adds to the FISMA 2002 requirements and codifies the role and responsibility of the agency CISO.
  • S.3474 does not solve the core problem.  The core problem with security in the Government is a lack of a skilled workforce.  This is a strategic issue that a bill aimed at execution of security programs cannot solve by itself.
  • S.3474 adds to the existing checklists.  People have been talking about how S.3474 will end the days of checklists and auditors.  No, it doesn’t work that way, nor is the bill written to reduce the audits and checklists.  When you introduce new legislation that adds to existing legislation, it means that you have added more items to the existing checklists.  In particular, the provisions pertaining to the CISO’s responsibilities are audit nightmares–for instance, “How do you maintain a network disconnect capability as required by FISMA 2008” opens up a whole Pandora’s Box worth of “audit requirements” which are exactly what’s wrong with the way FISMA 2002 has been implemented.
  • S.3474 puts too much of the responsibilities on the CISO.  It’s backwards thought, people.  The true responsibility for security inside of an agency falls upon that political appointee who is the agency head.  Those are the people who make the decisions to do “unsafe acts”.
  • S.3474 does not solve any problems that need a solution.  Plain and simple, it just enumerates the perceived failings of FISMA 2002.  It’s more like a post-divorce transition lover who is everything that your ex-spouse is not.  Let’s see… technical controls?  Already got them.  Requirements for network monitoring?  Already got them.  2nd party audits?  Already got them.  Requirements for contractors?  Already got them.  Food for thought is that these exist in the form of guidance, does the security community as a whole feel that we need to take these and turn them into law that takes years to get changed to keep up with the pace of technology?  There is some kind of segue there into Ranum talking about how one day we will all work for the lawyers.

Of course, this is all my opinion and you can feel free to disagree.  In fact, please do, I want to hear your opinion.  But first and foremost, go read the bill.

i haz a veto pen photo by silas216




Posted in FISMA, Rants, The Guerilla CISO, What Doesn't Work | 3 Comments »


