Clouds of CAG Confusion

Posted February 26th, 2009 by

Did you know that the US Department of Defense published the Consensus Audit Guidelines?  Yes, it’s true!  At least according to the title of a ZDNet UK article, “US Dept of Defense lists top 20 security controls”.

There is a haze of confusion settling around the Consensus Audit Guidelines’ origins.  The text of the CAG press release (pdf) is clear that it was developed by a consortium of federal agencies and private organizations.  It further states that CAG is part of the Center for Strategic and International Studies (CSIS) work on the CSIS Commission report on Cybersecurity for the 44th Presidency.  The title of the CAG press release is equally clear that it is from a “Consortium of US Federal Cybersecurity Experts,” which is substantively different from a consortium of federal agencies and private organizations.

The press release relates that CAG was initiated when a team discovered similarities between massive data losses by the US defense industrial base (DIB) and attacks on Federal agencies.  The project then grew as more agencies agreed to become involved.  Following the current public review of CAG, the next steps for development are listed as pilot implementations at government agencies, a CIO Council review, and an IG review.  The clear inference from this origin story and enumeration of steps is that the project has official Federal backing.

Let’s test that inference.  Click here for a Google search of the entire *.gov hierarchy for “Consensus Audit Guidelines”.  As I write this there is exactly one result, from oregon.gov.  A search using usa.gov (which uses live.com) returns the same result.  Looking around the various organizations listed as contributors doesn’t yield any official announcements.

So why the confusion in the press?  Why does it appear from the news articles that this is a Federal project?  I wouldn’t speculate.

On a slightly different topic, I’ve been reading through the Consensus Audit Guidelines themselves and enjoying the guidance they provide.  I’ll write up a more complete analysis once I have finished my read-through.  My initial impression is that the CAG controls provide worthwhile recommendations, but the framework for implementation needs development.

All Aboard the Astroturfmobile photo by andydr.  Perhaps an explanation is in order….




Posted in Technical, What Doesn't Work | 7 Comments »

The 10 CAG-egorically Wrong Ways to Introduce Standards

Posted February 20th, 2009 by

The Consensus Audit Guidelines (CAG) appear, at this point, to be a reasonable set of guidelines for mitigating some human threats. I’m looking forward to seeing what CAG offers and have no doubt there will be worthwhile and actionable controls in the document. That said, there are significant reasons to approach CAG with skepticism and assess it critically.

The motivation for CAG is described in a set of slides at the Gilligan Group site. It starts with a focus on what CIOs fear most: attacks, reduced operational capability, public criticism, data loss, etc. Then it rightly questions whether FISMA is adequately addressing those problems. It doesn’t, and that gap is the genesis of the CAG.

Consensus photo by Eirik Newth.

Unfortunately, CAG then develops by pairing this first valid premise with a set of false premises.  These propositions are drawn from slides at gilligangroupinc.com, attributed to John Gilligan or Alan Paller:

  1. All that matters are attacks. The central tenet of Bush’s Comprehensive National Cyber Initiative (CNCI) is adopted as the CAG theme: “Defense Must Be Informed by the Offense”. CAG envisions security as defense against penetration attacks. As any seasoned security practitioner knows, attacks are a limited subset of the threats to confidentiality, integrity and availability that information and information systems face.
  2. Security through obscurity. CAG seems to have taken the unspoken CNCI theme to heart too: “The most effective security is not exposed to public criticism.” Since its very public December 11th announcement, no drafts have been made publicly available for comment.
  3. False dichotomy. CAG has been promoted as an alternative to the OMB/NIST approach to FISMA. It isn’t. An alternative would target a fuller range of threats to information and information system security. CAG should be considered a complement to NIST guidance, an addendum of security controls focused on defense against penetration by hackers. NIST has even acted on this approach by incorporating some CAG controls into the 800-53 Rev. 3 catalog of controls.
  4. There is too much NIST guidance! This is the implication of one CAG slide that lists 1200 pages of guidance, 15 FIPS docs and the assorted Special Publications not related to FISMA as detriments to security. It’s like complaining that Wikipedia has too many articles for it to contribute to improved learning. Speaking as someone who scrambled to secure Federal systems before FISMA and NIST’s extensive guidance, having that documentation greatly improves my ability to efficiently and effectively secure systems.
  5. NIST guidance doesn’t tell me how to secure my systems! NIST’s FISMA guidance doesn’t step you through securing your SQL Server. The Chairman of the Joint Chiefs also doesn’t deliver your milk. Why not? It’s not their job. NIST’s FISMA guidance helps you to assess the risks to the system, decide how to secure it, secure it accordingly, check that a minimum of controls are in place and then accept responsibility for operating the system. NIST also provides documents, checklists, repositories, standards, working groups and validation of automated tools that help with the actual security implementation.
  6. Automated security controls negate human errors. Given the premise that all threats are attacks, this is nearly plausible. But not all security is technical. Not all threats come from the Internet. DHS, NIST, Mitre, and their partners have pursued automated mechanisms to enforce and audit security controls for years, but automation can only go so far. Human errors, glitches, unexpected conflicts and operational requirements will always factor into the implementation of security.
  7. Audit compatibility as a hallmark of good security. There is a conflict of focus at the heart of the CAG: it seeks to both improve its subset of security and improve audit compatibility. For technical controls this is somewhat achievable using automation, something NIST has pursued for years with government and industry partners. For operational and management controls it results in audit checklists. But audits are fundamentally concerned with testing the particular and repeatable; security needs to focus on evaluating the whole to ensure the necessary security results. An audit sees if antivirus software is installed; an evaluation sees if the antivirus software is effective.
  8. Metrics, but only these metrics over here. When selecting the current crop of CAG controls, decisions on what to include were reportedly based on metrics of the highest threats. Great idea: a quantitative approach often discovers counter-intuitive facts. Only here the metrics were cherry-picked. Instead of looking at all realized threats or real threat impacts, only a count of common penetration attacks was considered.
  9. With a sample of 1. As a basis for determining what security should focus on the whole breadth of the security profession was queried, so long as they were penetration testers. Yes, penetration testers are some very smart and talented people but penetration testing is to security what HUMINT is to intelligence services. Important players, expert practitioners but limited in scope and best used in conjunction with other intelligence assets.
  10. Assessments rely on paper artifacts. The NIST guidance does not require paper artifacts. The first line in the NIST SP 800-53A preface is, “Security control assessments are not about checklists, simple pass-fail results, or generating paperwork to pass inspections or audits-rather, security controls assessments are the principal vehicle used to verify that the implementers and operators of information systems are meeting their stated security goals and objectives.” NIST SP 800-37 specifically and repeatedly states, “Security accreditation packages can be submitted in either paper or electronic format.”
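
The audit-versus-evaluation distinction in point 7 can be sketched in a few lines of code. This is a purely illustrative sketch with hypothetical function names, package names, and result strings; it is not any real audit tool’s API:

```python
# Illustrative only: the package names and the "detected"/"missed"
# result strings are hypothetical, not any real tool's API.

def audit_antivirus(installed_packages):
    """Audit-style check: passes if a known AV package is merely present."""
    known_av = {"clamav", "defender", "mcafee"}
    return bool(known_av & set(installed_packages))

def evaluate_antivirus(scan_result_for_test_sample):
    """Evaluation-style check: passes only if the AV actually detected
    a harmless test sample (e.g. the EICAR test string)."""
    return scan_result_for_test_sample == "detected"

# A host can pass the audit while failing the evaluation:
host_packages = ["openssh", "clamav"]
print(audit_antivirus(host_packages))   # True: AV software is installed
print(evaluate_antivirus("missed"))     # False: it isn't effective
```

The audit answers a yes/no checklist question; the evaluation requires observing the control actually working, which is exactly what checklist-driven compliance tends to miss.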

CAG is a missed opportunity. A lot of good could be achieved by addressing the myriad problems with our current FISMA regime. The problems with guidance have many causes but can be addressed through cooperative development of best practices outside of NIST. The Assessment Cases for SP 800-53A are an example of how cooperative development can achieve great results and provide clear guidance. Other problems exist and can be addressed with better training and community development.

My hope is that the Consensus Audit Guidelines will move towards a more open, collaborative development environment. The first release is sure to deliver useful security controls against penetration attacks. As with all good security practices it will likely need to go through a few iterations and lots of critical assessment to mature. An open environment would help foster a more complete consensus.

Consensus photo by mugley.




Posted in BSOFH, FISMA, Rants, Technical, What Doesn't Work, What Works | 9 Comments »

In Other News, I’m Saying “Nyet” on S.3474

Posted December 15th, 2008 by

It’s probably a shocker to most people, but I’m going to recommend that S.3474 be amended or die on the Senate floor like Caesar.

I’ve spent many hours reading over S.3474.  I’ve read the press releases and articles about it.  I’ve had some very difficult conversations with my very smart friends.

I’ve come to the conclusion that S.3474, as written/proposed by Senators Carper and Lieberman, is not the answer to information security in the Government that it has been repeatedly publicized as, and that anyone who believes the hype is in for a rude surprise next fall if the bill is passed and signed.

My thoughts on the matter:

  • S.3474 is not what it is being publicized as.  The people who write the press releases and the articles would have us believe that S.3474 is a rewrite of FISMA 2002 and that it focuses on continuous monitoring of the security of IT systems, both of which are a good thing.  First and foremost, it does not repeal FISMA 2002, and anyone saying that is simply trying to deceive you.  S.3474 adds to the FISMA 2002 requirements and codifies the role and responsibility of the agency CISO.
  • S.3474 does not solve the core problem.  The core problem with security in the Government is the lack of a skilled workforce.  This is a strategic issue that a bill aimed at execution of security programs cannot solve by itself.
  • S.3474 adds to the existing checklists.  People have been talking about how S.3474 will end the days of checklists and auditors.  No, it doesn’t work that way, nor is the bill written to reduce the audits and checklists.  When you introduce new legislation that adds to existing legislation, it means that you have added more items to the existing checklists.  In particular, the provisions pertaining to the CISO’s responsibilities are audit nightmares–for instance, “How do you maintain a network disconnect capability as required by FISMA 2008” opens up a whole Pandora’s Box worth of “audit requirements” which are exactly what’s wrong with the way FISMA 2002 has been implemented.
  • S.3474 puts too much of the responsibility on the CISO.  It’s backwards thinking, people.  The true responsibility for security inside of an agency falls upon the political appointee who is the agency head.  Those are the people who make the decisions to do “unsafe acts”.
  • S.3474 does not solve any problems that need a solution.  Plain and simple, it just enumerates the perceived failings of FISMA 2002.  It’s more like a post-divorce transition lover who is everything that your ex-spouse is not.  Let’s see… technical controls?  Already got them.  Requirements for network monitoring?  Already got them.  2nd-party audits?  Already got them.  Requirements for contractors?  Already got them.  Food for thought: these already exist in the form of guidance; does the security community as a whole feel that we need to turn them into law that takes years to change in order to keep up with the pace of technology?  There is some kind of segue there into Ranum talking about how one day we will all work for the lawyers.

Of course, this is all my opinion and you can feel free to disagree.  In fact, please do, I want to hear your opinion.  But first and foremost, go read the bill.

i haz a veto pen photo by silas216




Posted in FISMA, Rants, The Guerilla CISO, What Doesn't Work | 3 Comments »

Introducing the Government’s Great InfoSec Equities Issue

Posted December 9th, 2008 by

Government and information security–it really means two different things, and I’m going to break it down for you “Big Bird Stylie” as something I call the InfoSec Equities Issue.

If you’re like me, you have to be wondering the same things over and over again:

  • Why is it that DHS has perpetually scored low on their FISMA report card and yet they are supposed to be leading the way for cybersecurity for the nation as a whole? (FYI, they got a B+ for FY 2007)
  • How is it that the Government as a whole can have these ginormous data breaches a la the Veterans Administration and yet still claim to know how to help us secure our systems?
  • Does the FTC really expect me to keep a straight face when I read OnGuardOnline?

Well, fear not, dear readers, for this is the secret to understanding these conundrums:  they’re actually different issues with different funding trails.  This budget difference, although a topic we security people shun whenever we can, is insanely critical.

For securing their own internal systems, the Government faces exactly the same problems that most companies have, only amplified because of scale: security is a cost center, and cost centers get reduced wherever possible.  Fiduciary responsibility to the taxpayers requires that the agency CISO’s staff do more with less, and that’s not a happy thought if you end up on the wrong side of the security budget equation.

Minimal Security photo by °Florian.

When it comes to security of external systems (and some national-level internal programs), the Government runs these as programs offered as services to the nation.  Some typical programs include the following:

It’s one of Washington’s best-kept secrets: being a program manager in the Government means that you get a mission and a bag of money, and your job is to decide where to spend it all.  This is the sweetest job and the one that you want, whether it’s in security or any other discipline that you could imagine as a Government service: health care, law enforcement, or even the infamous “Gub’mint cheese”.

However, all is not peachy for programs.  They can get cancelled based on political will and trends, so if your program ends up portrayed non-favorably in the Washington Post, you might end up with your bag of money pilfered for other programs.

Heightened Security photo by robmcm.

This concept of divergent funding is all nice and neat except, dear readers, when the issues are not separate, i.e., when an internal IT system protected by the internal budget supports a particular program.  For example, consider the following scenarios:

  • Security of vulnerability data at US-CERT (external) that resides on a Government IT system (internal).
  • A financial system (internal) that tracks distributions to welfare recipients (external).
  • A government website (internal) that supports awareness and training on security issues affecting individual citizens such as identity theft (external).

Now this is the concept behind the way Government is supposed to be running security programs:  the internal funds pay for the centralized security and the funded programs pay for any level of security for IT systems that they sponsor.

But several catches:

  • The system owner has to understand how to budget for security, or at least ensure that security for their program is budgeted for.  Somewhere in there is an understanding of security risk.
  • The system owner (who in theory has better funding and therefore better security) is dependent upon the centrally-managed security (which in theory has less funding and therefore worse security).
  • Program-specific security comes out of the program, which means that higher security costs means that the program manager can’t spend money on the services they provide, which is where they really want to be spending it.
  • A ton of negotiation is required to figure out responsibilities between the program manager and the CIO/CISO.
  • If the agency takes too much money out of the program budget for security, we run into the same fiduciary responsibility problems in that we’re not managing our money properly.



Posted in FISMA, What Doesn't Work, What Works | 7 Comments »

Tangling with the Clearance Monsters

Posted December 2nd, 2008 by

Another pair of client agencies, another pair of clearance forms to fill out….

Want to talk about fraud, waste, and abuse?  I’m in my mid-30’s (not ~85 like Alex and Mortman think I should be) and I’ve gone through the clearance process about 3 times a year since 2002 (and once in 1992 and once in 1996), mostly because each agency insists on having their own clearance requirements.

So let’s look at the economics of managing clearances at the agency level.  I figure I’m a pretty average case when it comes to this:

  • ~2 days of filling out SF-86 and other clearance forms: 16 hours x $50 = $800
  • ~1 day for fingerprinting and corrections: 8 hours x $50 = $400
  • Salaries for cleared personnel = +$15K over “market value” (yes, dear readers, that has become the market value)
  • 3 clearance runs/year for contractors: $1,200 in billable hours x 3 runs/year = $3,600/year
  • All this times a bazillion contractors supporting the Government
  • ~2 months before somebody can actually be given any information that they can actually use to do the job.
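
The back-of-the-envelope math above can be totaled in a quick sketch. The figures are the rough ballpark numbers from the list, not authoritative data:

```python
# Rough per-contractor clearance cost sketch; all figures are the
# ballpark numbers from the list above, not authoritative data.

HOURLY_RATE = 50        # billable rate implied by the per-task totals
FORMS_HOURS = 16        # ~2 days of SF-86 and other clearance forms
FINGERPRINT_HOURS = 8   # ~1 day of fingerprinting and corrections
RUNS_PER_YEAR = 3       # roughly one clearance run per client agency

cost_per_run = (FORMS_HOURS + FINGERPRINT_HOURS) * HOURLY_RATE
cost_per_year = cost_per_run * RUNS_PER_YEAR

print(cost_per_run)   # 1200
print(cost_per_year)  # 3600
```

Multiply that $3,600/year by every cleared contractor supporting the Government and the cost of the duplication becomes obvious.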

The “Who Moved my Personnel Security Cheese?” Problem

This is the real crux of the problem: every agency thinks that they are special–that Commerce has a different level of need for trustworthy people than Health and Human Services.  We have a phrase for how we’re managing clearances right now: Not Invented Here.

News flash: trustworthy people are trustworthy people and dirtballs are dirtballs.  Honestly, what can the civilian agencies require that a Department of Defense Top Secret clearance doesn’t already cover?  What we need is an Esperanto for clearances.  My opinion is that DoD should trump all, but I’m obviously biased.  =)

Oh, but here’s the keystone to this argument:  all of the clearance processing (forms, background checks, investigations, and fingerprints) is done by the exact same people: the Office of Personnel Management (OPM).

Clearance 12 Feet 4 Inches photo by Beige Alert.

Don’t get me wrong, life is not all gloom and doom.  OPM has this wonderful website now with the clearance forms called Electronic Questionnaires for Investigations Processing (e-QIP).  The best part: it remembers your details so you don’t have to fill them out every time.  Clearance paperwork has now become as simple as updating your contact information and job details on a social networking site.  And it does validation of your filing information so that you don’t have a different way of doing things from agency to agency.

Benefits of Centrally-Managed Universal Clearances

Why am I arguing for managing clearances centrally?  Well, I’m both a taxpayer and a contractor.  This is my line of thought:

  • Cheaper because of reduced redundancy (object lesson on the Federal Enterprise Architecture)
  • Reduces “switch costs” for throwing out one contractor in favor of another. (heh, bring me in instead)
  • Quicker onboarding for both govies and contractors
  • More career options for cleared personnel
  • Unified standard of acceptance
  • Helps us get to one unified Government ID card (ack, HSPD-12)
  • It’s just plain smarter Government!

Deus Ex Barry O?

Oh yeah, it’s Presidential transition time.  This means that everybody with an opinion comes out of the woodwork with their expert analysis on what the Government should do.  While we’re throwing ideas around, I would like to throw my hat in the ring with just a couple:

  1. Appoint an executive-branch CIO and CISO. (already covered that)
  2. Fix the clearance process so that there’s one authoritative set of clearances that apply everywhere.

Problem as I see it is that left to their own devices, the agencies have to “roll their own” because as downstream consumers of OPM, they don’t have a unifying standard.  As much as I hate getting mandates from OMB, this might be one that I’m willing to support.  And yes, I probably crossed some sort of political threshold somewhere along the line….




Posted in Rants, What Doesn't Work | 6 Comments »

Keeping The Lights On: Cybersecurity Law for the Electric Grid

Posted September 23rd, 2008 by

Ever wondered if your electricity supply was safe from computer attack? Congress wondered that too. So they asked the Federal Energy Regulatory Commission (FERC) to find out. The answers they received in October of 2007 were not encouraging.

After 9/11 there was concern about the safety of the Bulk Power Supply (BPS). The President’s Commission on Critical Infrastructure Protection released a report which was explicit about the dangers faced. A frightening example of these dangers was demonstrated by the Aurora vulnerability, essentially a software hack that made a generator crash hard. When faced with this example, industry moved to mitigate the problem with some prodding from the Department of Homeland Security (DHS), the Nuclear Regulatory Commission (NRC) and FERC. The Nuclear Sector, which is regulated by NRC, issued a requirement to address the problem. The Electric Sector was issued a recommendation to address the problem by the Electric Sector Information Sharing and Analysis Center (ES-ISAC). Guess which industry has moved forward with successful mitigation efforts and which has not. FERC reported back on these findings in October of 2007.

Fast forward to now. On September 11th the Bulk Power System Protection Act (BPSPA) of 2008 (PDF link) was put forward by Rep. Rick Boucher (D-VA), chairman of the House Subcommittee on Energy and Air Quality. In addition to the September 11th hearing on the BPSPA, a closed-door hearing was expected to be conducted the following week. The goal of this legislation is to expand the emergency power of FERC to regulate cybersecurity for the BPS. The act itself does not appear to be strongly opposed by the energy industry but, as always, the devil is in the details.

Diablo Canyon Nuclear Power Plant photo by emdot.

The draft legislation is disputed on three major points: whether to include national security threats, the disclosure of threat information, and a sunset provision.

FERC recommends wording that would make explicit the requirement to address national security threats. This seems an implicit and reasonable expectation that the people of the United States would have of the agency regulating the BPS, but the Energy Sector considers this too expansive a role. They argue that it might cause expensive requirements to be issued, such as stockpiling fuel.

The disclosure of threat information is a sore point. Here you can understand the pain of the industry in dealing with government intelligence agencies, who would like to keep details of a threat sparse to preserve the source of that information. Unfortunately the government must preserve their sources while still providing enough information for the industry to react.

Both FERC and the Energy Sector agree on the idea of a sunset provision. The sunset provision in this case stipulates that so long as an order is implemented as a standard it should terminate one year after issuance unless renewed by the President or the Secretary of Energy. The issue is whether this sunset will include the orders to address existing problems (such as the Aurora vulnerability) in addition to orders issued for future vulnerabilities. FERC recommends that only future orders should be sunsetted while the Energy Sector recommends both current and future orders should be sunsetted.
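
The sunset mechanics described above reduce to simple date logic. The sketch below illustrates the provision as summarized here, with hypothetical function and parameter names, not the bill’s actual text:

```python
from datetime import date, timedelta

# Hypothetical sketch of the sunset provision as summarized above:
# an order terminates one year after issuance unless renewed by the
# President or the Secretary of Energy.

def order_in_force(issued: date, today: date, renewed: bool) -> bool:
    return renewed or today < issued + timedelta(days=365)

print(order_in_force(date(2008, 9, 11), date(2009, 6, 1), renewed=False))   # True
print(order_in_force(date(2008, 9, 11), date(2009, 10, 1), renewed=False))  # False
```

The dispute, then, is simply over which orders this termination clock applies to: only future orders (FERC’s position) or current ones as well (the Energy Sector’s position).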

One element which is not adequately addressed in this legislation is how FERC will build the capability to assess and manage cybersecurity issues for the BPS. What should be in place is a bipartite separation of duties between FERC and NIST similar to what is in place with the dual OMB/NIST FISMA roles. FERC would oversee the security while NIST would provide technical guidance on what security should be put in place. FERC does not have the experience in security frameworks or in depth expertise in SCADA security which is required for a cybersecurity initiative of this magnitude.

It is worth noting that the Energy Policy Act of 2005 (PDF link) established a process through which the North American Electric Reliability Corporation (NERC) was authorized to enforce cybersecurity in the Energy Sector. NERC had gone so far as to create Critical Infrastructure Protection (CIP) standards to include with their Reliability Standards and had presented them to FERC for approval by late 2007.

A review of the NERC CIP standards (CIP-001 through CIP-009) does not inspire confidence in NERC’s cybersecurity capabilities. I will discuss the shortcomings of this guidance in a subsequent post.




Posted in What Doesn't Work | 3 Comments »


