Rebuilding C&A

Posted June 13th, 2007

After commenting on Mike Rothman’s Security Incite and Alex Hutton’s riskanalysis.is, I’m about ready to explain how C&A works and doesn’t work.

Let’s roleplay, shall we?

You’re the US government. You have an IT budget of $65 ba-ba-ba-ba-billion (stuttering added for additional effect) every year (2007 budget). If you wanted to, you might be able to make an offer to buy Microsoft based on one year’s worth of budget.

So how do you manage security risks associated with such a huge amount of cash? Same way you would manage those IT systems in the non-security world:

  • Break it all down into bite-sized pieces
  • Have some sort of methodology to manage the pieces effectively
  • Delegate responsibility for each piece to somebody
  • Use metrics to track where you are going
  • Focus on risks to the business and the financial investment
  • Provide oversight on all of the pieces that you delegated
  • Evaluate each piece to see how well it is doing

Hmm, that sounds exactly like what the government has done so far: an agency’s investment (system) inventory/portfolio, the OMB budget process, and the GAO metrics.

Now how would you manage each bite-sized piece? This is roughly the way a systems engineer would do it:

  • Define needs
  • Define requirements
  • Build a tentative design
  • Design review
  • Build the system
  • Test that the requirements are met
  • Flip the switch to take it live
  • Support anything that breaks

Hmm, that’s suspiciously like a system development life-cycle, isn’t it? There’s a reason we use project management and SDLC–in order to get from here to there, you need to have a plan or methodology to follow, and SDLC makes sense.

So then let’s do the same exercise and add in the security pieces of the puzzle.

  • Define needs: Determine how much the system and the information are worth–categorization (FIPS-199 and NIST SP 800-60)
  • Define requirements (FIPS-200 and NIST SP 800-53, along with a ton of tailoring)
  • Build a tentative design (first security plan draft)
  • Design review (security plan approval)
  • Build the system
  • Test that the needs and requirements are met (security test and evaluation)
  • Flip the switch to take it live (accreditation decision)
  • Support anything that breaks (continuous monitoring)

Guess what? That’s C&A in a nutshell. All this other junk is just that–junk. If you’re not managing security risk throughout the SDLC, what are you doing except posturing for the other security people and arguing about trivialities?
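
If the categorization step sounds abstract, here is a minimal sketch of the FIPS-199 “high water mark” idea: roll up the impact ratings for confidentiality, integrity, and availability across your information types, and the highest one drives the baseline you start from. The information types and ratings below are made up for illustration; this is not an official tool.

```python
# Illustrative sketch of FIPS-199 categorization using the high-water-mark rule.
# The information types and impact ratings below are invented for the example.

IMPACT_ORDER = {"LOW": 0, "MODERATE": 1, "HIGH": 2}

def high_water_mark(ratings):
    """Return the highest impact level in a collection of ratings."""
    return max(ratings, key=lambda r: IMPACT_ORDER[r])

# Per-information-type ratings for confidentiality, integrity, availability.
info_types = {
    "public web content":   {"C": "LOW",      "I": "MODERATE", "A": "LOW"},
    "citizen case records": {"C": "MODERATE", "I": "MODERATE", "A": "LOW"},
}

# Roll up each security objective across all information types...
objective_levels = {
    obj: high_water_mark(ratings[obj] for ratings in info_types.values())
    for obj in ("C", "I", "A")
}

# ...then take the high water mark across objectives to get the overall
# system impact level, which drives the SP 800-53 baseline you start from.
system_level = high_water_mark(objective_levels.values())

print(objective_levels)   # {'C': 'MODERATE', 'I': 'MODERATE', 'A': 'LOW'}
print(system_level)       # MODERATE
```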

This picture (blatantly stolen from NIST SP 800-64, Security Considerations in the Information System Development Life Cycle) shows you how the core components of C&A fit in with the rest of the SDLC:

Security in the SDLC

My theory is that the majority of systems have already been built and are in the O&M phase of their SDLC, which means we are trying to do C&A for these systems too late to really change anything. It also means that, for the most part, we will be doing C&A on systems that have already been built; so, just as people confused war communism with pure communism, we confuse the emergency state of after-the-fact C&A with the pure state of C&A.

Now let’s look at where C&A typically falls apart:

Keys to success at this game follow roughly what ISM-Community has proposed as an ISM Top 10. Those ISM guys, they’re pretty smart. =)




Posted in FISMA, ISM-Community, NIST, Risk Management, What Doesn't Work, What Works | 2 Comments »

GAO on Government-wide Security Weaknesses

Posted June 12th, 2007

Interesting testimony by Mr. Gregory C. Wilshusen, Director, Information Security Issues, Government Accountability Office. For those of you who don’t habla governmento, these are the overall government-wide audit results in lay language.

Some of the control areas where he highlights weaknesses:

  • Access controls, which ensure that only authorized individuals can read, alter, or delete data;
  • Configuration management controls, which provide assurance that only authorized software programs are implemented;
  • Segregation of duties, which reduces the risk that one individual can independently perform inappropriate actions without detection;
  • Continuity of operations planning, which provides for the prevention of significant disruptions of computer-dependent operations; and
  • An agencywide information security program, which provides the framework for ensuring that risks are understood and that effective controls are selected and properly implemented.

At this point, those of you who read celebrity magazines have to be thinking “The Feds, they’re just like us. They need access controls and segregation of duties.”
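
Take segregation of duties as an example. Here is a minimal sketch of the kind of check you could automate; the conflicting duty pairs and user assignments are entirely made up and not taken from the GAO testimony.

```python
# Illustrative segregation-of-duties check; the conflicting role pairs and
# user-role assignments are hypothetical.

# Pairs of duties that should never be held by the same person.
CONFLICTING_DUTIES = [
    ("create_vendor", "approve_payment"),
    ("develop_code", "deploy_to_production"),
]

user_roles = {
    "alice": {"create_vendor", "approve_payment"},   # violation
    "bob":   {"develop_code"},
    "carol": {"deploy_to_production", "approve_payment"},
}

def sod_violations(assignments, conflicts):
    """Return (user, duty_a, duty_b) tuples where one user holds both duties."""
    return [
        (user, a, b)
        for user, roles in assignments.items()
        for a, b in conflicts
        if a in roles and b in roles
    ]

for violation in sod_violations(user_roles, CONFLICTING_DUTIES):
    print("SoD violation:", violation)   # flags alice
```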




Posted in FISMA, NIST | 6 Comments »

Do You Know What FISMA Is?

Posted June 11th, 2007

This is all over the blogosphere by now. SecureInfo conducted a survey that said 65% of government workers did not know what FISMA is.

I even started to comment on this in various places, but posts about this survey pop up faster than I can debunk them. All I can say is that SecureInfo needs to pay their publicist a huge bonus for the mileage they got out of the press release.

When it comes to the topic of government workers knowing about FISMA, I’ve already said my piece: unless you’re working in security, senior management, or IT, you don’t need to know what FISMA is. And then there’s SecureInfo, which sells, among other things… wait for it… security awareness and training solutions.

However, I also have a corollary for you: most of the security practitioners inside the government do not know what FISMA really is. We have books and websites that use phrases like “FISMA compliance” and “FISMA C&A”. It’s one of those things that make you say “huh?”

Once again, for the record:

  • FISMA is a law. The core components are in this slide (ack for the .ppt, sorry if it offends you. =) ) This isn’t my original work; it’s part of a deck that my friends and I use when we teach for Potomac Forum. It also doesn’t mention the tasks assigned to specific agencies like NIST. Whatever you do, don’t use this slide in a presentation you are going to give to me later; I’ll walk out of the meeting. =)
  • FISMA compliance is easy. It’s very easy to meet the core requirements of FISMA. The question is more one of quality.
  • Contractors cannot be FISMA-compliant no matter how hard they try. They do not report to OMB or GAO, and they do not have an IG. They do, however, support government agencies that do.
  • The primary goal of FISMA is to tie security into the mission budget and to make the “business owners” (mission owners?) responsible for security instead of the CISO.
  • In discussing the details of FISMA, it is very easy to confuse the implementation details/guidance with the actual law.

Makes you want to go read the law, doesn’t it? Here’s the text on the NIST CSRC website.

When you look around at the FISMA critics and compare what they say to the law itself, you come to some interesting conclusions:

  • The overwhelming majority of contractors selling solutions around FISMA do not understand what FISMA really is.
  • We are teaching each other the wrong way to approach security by thinking that FISMA compliance means “write a bunch of documents”, “make a scorecard”, or even “do C&A”.
  • Some people have a conflict of interest with understanding FISMA because they are selling their own “competing” methodology. (one NPO in particular, rhymes with “CANS”)
  • There are many charlatans getting rich off everyone else’s ignorance selling both software and services. It is in their best interest to keep you ignorant of what the law is because it helps them sow the seeds of Fear, Uncertainty, and Doubt. I think the only thing saving humanity is the fact that CISOs are skeptical by nature. =)
  • Poor little FISMA has been abused by everybody, even those who think they are doing the right thing and quoting the magic phrase of “won’t somebody think of the taxpayers?”

Where do we go from here? As I’ve said a bazillion times, the DC security community needs more heretic prophets to show it the way out of the wilderness through a campaign of public awareness and education.




Posted in FISMA, NIST, Rants | 10 Comments »

Some Random Thoughts on C&A SOPs

Posted June 7th, 2007

A friend forwarded me a C&A SOP today from a small government agency. Aside from taking NIST guidance and repackaging it in some weird, morphed way that didn’t make any sense (they added a pre-certification phase), they missed the obvious piece: C&A is just a way to get security requirements and risk assessment into the SDLC. About 80% of the people playing the C&A game for the government think the process goes something like this:

  • Build the system
  • Write a security plan
  • Notify CISO that document is ready to be tested
  • An auditor audits the document and writes a “you been bad” report that nitpicks the grammar for being in passive voice or complains that you’re not using the “approved” template
  • System is given a certification statement
  • Somebody signs off on the accreditation
  • We forget about it all until it’s time to update the security plan

OK, if you do it this way, then maybe you do need an SOP.

Then again, maybe you need a new job.

I get “wigged out” when I see SOPs for C&A. A big part of why the government is failing at security and C&A is that it has divorced the two activities from the rest of how it does business. You shouldn’t need an SOP for C&A any more than you need an SOP for breathing–you should have an SDLC SOP or an engineering SOP of which security is a small but important piece.

Mike’s version of how to do C&A:

  • We realize we have a need for a system
  • We categorize the data and come up with a SWAG on how much it’s worth to protect
  • We haggle over what security controls we should build based on our SWAG
  • We start writing a security plan that lists the controls we agreed on
  • We build the thing
  • We do user acceptance testing and security testing concurrently or in series
  • We fix problems and do regression testing
  • We certify that we have implemented the controls we determined we needed
  • Somebody gives the security team a vote of confidence in the form of accreditation
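
And if you want to see why the middle steps (categorize, haggle over controls, write them into the plan) are more data than paperwork, here is a minimal sketch. The control sets below are abbreviated, made-up stand-ins, not the real SP 800-53 baselines.

```python
# Illustrative only: abbreviated, made-up stand-ins for the SP 800-53 baselines,
# just to show the categorize -> select -> tailor -> document flow.

BASELINES = {
    "LOW":      {"AC-2", "IA-2", "CP-9"},
    "MODERATE": {"AC-2", "IA-2", "CP-9", "RA-5", "SI-4"},
    "HIGH":     {"AC-2", "IA-2", "CP-9", "RA-5", "SI-4", "AU-10"},
}

def select_controls(impact_level, add=(), remove=()):
    """Start from the baseline for the SWAGged impact level, then tailor."""
    controls = set(BASELINES[impact_level])
    controls |= set(add)        # compensating or mission-specific controls
    controls -= set(remove)     # controls argued out during the haggling step
    return sorted(controls)

# The "haggle" step: moderate baseline, minus one control we justified
# dropping, plus one we decided we needed anyway.
security_plan_controls = select_controls("MODERATE",
                                         add={"AU-10"},
                                         remove={"SI-4"})
print(security_plan_controls)   # this is what the security plan should list
```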



Posted in FISMA, NIST, What Doesn't Work | 3 Comments »

The Vendors are Already Jumping on the 07-11 Bandwagon

Posted May 21st, 2007

Two months ago, OMB released Memorandum 07-11, which established the authority for government-wide hardening standards for Windows products. It’s a very good thing, in my opinion.

However, I’m beginning to see the side effects. Vendors are already beating down my door trying to sell me compliance solutions that will help me meet this “oh-so-very-important standard”. I think they missed the other things I’ve had to say about compliance. The worry I have is that people will hit their systems with whatever technical policy compliance tool and think they don’t have to do anything else. Really, that’s the one big problem I have with this entire class of products–they present themselves as the cure-all for every security problem an organization could have.

Knowing the people at NIST, this is the classic problem they have: they issue guidance, and people blindly follow it even when it’s contradictory and not smart security. The best part is when people offer “NIST-compliant” solutions (I take that out of our marketing material whenever I find it and then take the time to educate people on why it’s wrong), which are, at best, “our interpretation of the guidelines with numerous assumptions”, and think that this is all an organization should do security-wise. The catch is that NIST, compliance frameworks, and vendors can’t anticipate every situation, so at most what they’re offering is a 75% solution. If you go back to both NIST and OMB, they will tell you to make a decision based on a cost-benefit-risk comparison.
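
For the record, the simplest form of that cost-benefit-risk comparison is back-of-the-envelope arithmetic: compare the annualized loss expectancy with and without the control against what the control costs you per year. Every number in this sketch is invented.

```python
# Back-of-the-envelope cost-benefit-risk comparison; every number is invented.

def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Annualized loss expectancy = expected loss per incident * incidents/year."""
    return single_loss_expectancy * annual_rate_of_occurrence

ale_before = ale(single_loss_expectancy=250_000, annual_rate_of_occurrence=0.30)
ale_after  = ale(single_loss_expectancy=250_000, annual_rate_of_occurrence=0.05)

annual_control_cost = 40_000             # licenses, people, process changes
risk_reduction = ale_before - ale_after  # expected annual benefit of the control

print(f"ALE before: {ale_before:,.0f}")                       # 75,000
print(f"ALE after:  {ale_after:,.0f}")                        # 12,500
print(f"Benefit:    {risk_reduction:,.0f}")                   # 62,500
print(f"Worth it?   {risk_reduction > annual_control_cost}")  # True
```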

My friend Art Chantker from The Potomac Forum is hosting an executive breakfast on the 24th with a good slate of speakers–OMB, NIST, Microsoft, and the US Air Force. I’ll be there, if only so I can refute the claims the next time somebody offers me yet another compliance solution. =)

This whole unified-standard business was started by the US Air Force, which very simply decreed that you wouldn’t connect a Windows system to the network until it met the technical standards. Hmmm, wonder where they got the idea for a technical standard? This isn’t new; DoD has been doing it for years. I guess the clueful people finally got together and decided to make the migration to Vista a chance to get STIGs implemented in the civilian agencies.




Posted in FISMA, NIST, Rants, Technical | 3 Comments »

Core Belief #4 — Compliance is a Dead-End

Posted April 12th, 2007

Compliance is a Dead-End

Compliance is aimed at one thing: limiting risks to the organization that writes or enforces the standard.  How’s that for “Bottom Line up Front” writing?

I’ve been a critic of approaching FISMA with an eye toward compliance, and I just recently started to look at PCI. I’ve started to come around to a different way of thinking. It all makes perfect sense for the people who write or enforce the standard–they’re cutting their losses and making the non-compliant organization take the blame. It’s risk management done in a very effective Machiavellian style.

For an organization looking to improve its security posture, taking a compliance-based approach will eventually implode on itself. Why? Because compliance is binary–you are or you’re not. Risk management is not binary; it’s OK to say, “well, we don’t meet the standard here, but we don’t really need to.”
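
You can see the difference if you write it down. This is a toy sketch, not any real framework’s logic: the compliance view collapses everything to one bit, while the risk view lets you document a gap and accept it.

```python
# Toy contrast between a compliance mindset and a risk-management mindset.
# The requirements and findings are hypothetical.

findings = {
    "encrypt backups":        {"met": True},
    "90-day password expiry": {"met": False,
                               "residual_risk": "low",
                               "rationale": "passphrases + MFA in place"},
}

# Compliance is binary: one unmet requirement and you simply "are not".
compliant = all(f["met"] for f in findings.values())

# Risk management is not: a gap can be documented and accepted on its merits.
accepted_gaps = {
    name: f["rationale"]
    for name, f in findings.items()
    if not f["met"] and f.get("residual_risk") == "low"
}

print("Compliant?", compliant)          # False
print("Accepted gaps:", accepted_gaps)  # the documented, risk-accepted exceptions
```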

If you base your security on compliance, you are spending too much of your time, people, and money in places where you shouldn’t be, and not enough where you should. In engineering terms, you have had your solution dictated to you by a compliance framework.

The endgame of all compliance is CYA, finger-pointing, or both. Look at how data breaches involving both PCI and the government get spun in the press: “$Foo organization was not compliant with $Bar standard.” As Adam Shostack says, “Data Breaches are Good for You”; the one caveat being “except when you are caught out of compliance and smeared by the enforcers of the compliance framework”.

I remember a post from Mark Curphey to the Policy, Standards, Regulations, and Compliance list back in the neolithic age of last year asking, “Do organizations care about compliance, or do they care about being caught out of compliance?” It makes more sense now that I look at it.

On the other side of the coin, what I believe in is risk management.  Risk management realizes that we cannot be compliant with any framework because frameworks are made for a “one size fits all” world.  Sometimes you have to break the rules to follow the rules, and there isn’t room for that in a compliance world.




Posted in FISMA, NIST, Rants, Risk Management, What Doesn't Work | 1 Comment »
