The Accreditation Decision and the Authorizing Official

Posted February 10th, 2009 by

The accreditation decision is one of the key activities in how the US Government secures its systems. It's also one of the most misunderstood. This slideshow explains the role of the Authorizing Official and gives you some insight into why and how accreditation decisions are made.

I would like to give a big thanks to Joe Faraone and Graydon McKee who helped out.

The presentation is licensed under Creative Commons, so feel free to download it, email it, and use it in your own training.



Posted in FISMA, NIST, Risk Management, Speaking

Could the Titanic have changed course?

Posted January 6th, 2009 by

Rybolov really struck a chord with me (as he usually does) with his blog entry concluding that S.3474 was a bad thing. It reminds me of a conversation I had with a friend recently. Basically, she asked me why bad things happen even after smart people put their heads together and try to deal with the problem before facing a crisis. Intrigued by her question, I asked her what specifically she was asking about. She shared that she had been thinking about the tragedy of the Titanic sinking.

Of course she was referring to the sinking of the passenger ship RMS Titanic on the evening of 14 April 1912. She made two points: first, that the experts had declared the ship "unsinkable" – how could they be so wrong? Second, she wondered how the ship could be so poorly equipped with lifeboats and safety equipment that there was such a great loss of life.

The Titanic’s Disaster photo by bobster1985.

Little did she know that I have had an odd fascination with the Titanic disaster since childhood and have read much of the common public material about the event. So, I replied that no expert had ever declared her unsinkable; that was basically something made up by the press and the dark, spineless things that hang around the press. However, I added that the designers and owners of the ship had made much of her advanced safety features when she was launched. A critical feature was the inclusion of water-tight bulkheads in her design, something of an advanced and novel feature at the time. What it meant was that you could poke a pretty big hole in the ship, and as long as the hole was not spread over several of these water-tight compartments, she would stay afloat. The problem was that the iceberg she hit (the Titanic, not my friend) ignored all of this and tore a big gash along about a third of the length of the ship.

So, my friend pressed again about the lack of safety equipment, especially lifeboats. I told her that the problem here was that the Titanic did indeed meet all of the safety requirements of the time, and that a big part of the problem was that those requirements were drafted in 1894, at a time of rapid change in the size and design of ships of this kind. The regulations indicated that all passenger ships over 10,000 tons required 16 lifeboats, and that's how many the Titanic had. When the regulations were written there were hardly any ships over 10,000 tons in size; by the time the Titanic was launched, she was designed to be over 50,000 tons when fully loaded. The fact was that even fully loaded, those lifeboats could barely hold half of the passengers and crew of a full ship. What is worse, when the ship did sink, not all of the boats were usable because of the speed and angle at which the ship went down.

So, the bottom line was that when the Titanic was reviewed by the safety accountants, they took out their checklist and went over the ship with a fine-tooth comb. When the day was done, the ship fully met all the safety criteria and was certified as safe.

This is where I see the parallels between the root causes of the Titanic disaster and the odd situation we find ourselves in today in terms of IT security. Security by checklist, especially by out-of-date checklist, simply doesn't work. Moreover, the entire mental framework that mixes up accounting practices with security discipline and research is an utter failure. Audits only uncover the most egregious security failures, and they uncover them at a point in time. The result is that audits can be gamed, and even ignored. On the other hand, formal reviews by experienced security professionals are rarely ignored. Sometimes not all of the resources are available to mitigate some of the vulnerabilities pointed out by the professionals, and sometimes there is debate about the validity of specific observations. But they are rarely ignored.

Interestingly enough, because of the mixed IT security record of many government agencies, Congress is proposing – more audits! It seems to me that what they should be considering is strengthening the management of IT security and moving from security audits, often performed by unqualified individuals and teams, toward security assessments conducted by security professionals. And since professionals would be conducting these proposed assessments, they should be required to comment on the seriousness of deficiencies and on possible mitigation actions. They should also be required to report on the adequacy of funding, staffing, and higher management support. I don't really see any point in giving a security program a failing grade if the existing program is well managed but subverted and underfunded by the department's leadership.



Posted in FISMA, NIST, Risk Management, The Guerilla CISO

Workin’ for the ‘Counters: an Analysis of my Love-Hate Relationship with the CPAs

Posted September 30th, 2008 by

No big surprise by now, I work for an accounting firm.  Oh, what’s that?  Oh yes, that’s right, it’s a consulting firm with a high percentage of accountants, including a plethora of CPAs.  “Accounting firm” is so 1950s-ish. =)

It’s my secret theory (well, not so much of a secret now, just between the Internet and me) that the primary problem we have in information security is that as a field we have borrowed heavily from public accounting.  The only problem is that public accounting is different from what we do.

Goals for public accounting run something like this:

  • Eliminate fraud through oversight
  • Protect the company’s money from rogue agents
  • Protect the shareholders of public companies
  • Ensure accountability of actions

Accounting for Mere Mortals Such as Security Folk

Accounting for Non-Accountants photo by happyeclair.

As a result of their goals, accountants have an interesting set of values:

  • Signatures are sacred
  • Separation of duties is sacrosanct
  • Auditing is designed to act as a deterrent to fraud
  • “Professional Skepticism” is a much-valued trait
  • Zero-Defects is a good condition

In other words, accountants live in a panopticon of transparency, the concept being that through oversight and transparency people will not become evildoers, and those who do will be caught.  Pretty simple idea; it makes me think about IDS in an entirely new light.

Words that accountants use that mean something entirely different from the way you or I use them:

  • Fraud, Waste, and Abuse: They’re talking about spending money, I’m usually talking about people doing something ethically wrong.
  • Investigation: They’re looking at the numbers to see how a particular number was created.  Me, I bring the nice people with guns when I do an investigation.
  • Incident: Their version is what I would call an event.  When I call something an incident, we’re headed towards an investigation.
  • Security test and evaluation: To them, it's a compliance audit.  To me, it's determining how often the system will fail and whether we have a way to fix it when it does.  Remember this, it's a critical difference.
  • Control: I think their version has something to do with having oversight and separation of duties.  Me, when I see this word, I think “countermeasure to a specific threat and vulnerability”.
  • Audit: An activity designed to prove that fraud has not happened.  Usually we don’t use the word unless we absolutely have to.
  • Technical: They’re talking about the highly-detailed accounting rules.  I’m talking about if you know how to build your own server and OS using lumps of raw silicon and a soldering iron.
  • Checklist: They’re talking about a sacred list that condenses all the rules into an easily-auditable format.  Me, I’m thinking that a checklist is something that will fail because my threats and their capabilities don’t fit into nice little lists.
  • Forensics: Their version is what I would call “research to find out where the money went to” and involves looking at a bunch of numbers.  My version has something to do with logs, memory dumps, and hard drive images.
  • Risk Management: This has something to do with higher interest rates for high-risk loans.  For me, it’s looking for countermeasures and knowing what things to skimp on even though the catalog of controls says you have to have it.

In short, pretty much anything they could say about our line of work has a different meaning.  This is why I believe it's a problem if we adopt too much of their methodology and management models: they are doing activities similar to what security people do, but for different purposes.

In order to understand the mentality that we're working with, let me give you a couple of scenarios:

After-Work Optional Training Session: The accountants not only make you put your name on the attendance roster, but you have to sign it as well.  Are they worried that you're committing fraud by showing up at training you were not supposed to attend, so they need some sort of signature nonrepudiation to prove that you were there?  No!  They just make you sign it because they believe in the power of the signature, and that's just how they do things, no matter how trivial.

The Role of Security: To an accountant, the role of security in an organization is to reduce fraud through "hack-proof" configurations and monitoring.  This is a problem: since security is economics, we're somehow subordinate to the finance people.

Let’s look at the world of the typical security practitioner:

  • The guidance that security professionals have is often contradictory, missing, or irrelevant.
  • Really, what we do comes down to risk management, which means that sometimes it makes more sense to break the rules (so there is, in effect, a rule that says to break the rules, which should freak your brain out by now if you're an accountant).
  • We have a constantly changing environment that rules cannot keep up with.

Now this whole blog post, although rambling on about accountants, is aimed at getting a message across.  In the US Federal Government, we use a process called certification and accreditation (C&A).  The certification part is pretty easy to understand: it's like compliance, do you have it and does it work?  CPAs will readily understand that as a controls assessment.  That's very much a transferable concept.

But in accreditation, you give the risks to a senior manager/executive and they accept the risks associated with operating the system.  The CPAs' zero-defects world comes through and they lie on the ground doing the cockroach.  Their skills aren't transferable when it comes to risk management, only to compliance with a set of rules.
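
For the CPAs in the audience, maybe a hypothetical sketch helps show the difference: certification produces evidence about controls, while accreditation produces a named executive's time-bound acceptance of residual risk.  The field names and example values below are invented for illustration; they're a sketch of the concept, not anybody's official record format.

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ControlAssessment:
    """The 'certification' half: compliance-like evidence about one control."""
    control_id: str                       # e.g. "AC-2" from a catalog of controls
    implemented: bool
    findings: List[str] = field(default_factory=list)

@dataclass
class AccreditationDecision:
    """The 'accreditation' half: an executive accepting residual risk."""
    system_name: str
    authorizing_official: str             # the person who owns the risk
    residual_risks: List[str]             # what's left after the controls in place
    decision: str                         # e.g. "authorize" or "deny"
    expires: date                         # the authorization is time-bound

# Certification evidence (hypothetical): some controls pass, some don't.
evidence = [
    ControlAssessment("AC-2", True),
    ControlAssessment("CP-9", False, ["no offsite backups yet"]),
]

# The accreditation decision (hypothetical): accepting what remains,
# not proving zero defects.
decision = AccreditationDecision(
    system_name="Example Payroll System",
    authorizing_official="Deputy Administrator (example)",
    residual_risks=["legacy OS on two servers", "no offsite backups yet"],
    decision="authorize",
    expires=date(2011, 9, 30),
)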

Once again, the problem with security in Government is that it’s cultural.

And don’t get me wrong, I like accountants and they do what I do not have neither the skills nor the desire to do.  I just think that there aren’t as many transferable skills between our jobs as there might seem on the surface.



Posted in Odds-n-Sods, Rants

C&A Seminar, October 15th and 16th

Posted September 22nd, 2008 by

The Potomac Forum crew is back at it again with a C&A seminar on the 15th and 16th.  While 2 days isn't long enough to earn your black belt in C&A-Foo, it is enough that if you're a solid program manager or technical lead, you'll walk out at least able to understand the core of the process.

As usual, some of the instructors should be familiar to my blog readers.  =)



Posted in FISMA, Speaking

Draft of SP 800-37 R1 is Out for Public Review

Posted August 19th, 2008 by

Go check it out (caveat: .pdf file) and send your comments to sec-cert@nist.gov

I’ve just given it a glance and here are some things that I’ve noticed:

  • Section on security in SDLC
  • Incorporates some of the concepts in SP 800-39 about enterprise-wide risk
  • Section on common controls
  • The process has remained pretty much the same but now references all the other core documents

Where I see this revision’s weaknesses:

  • Still possible to view the C&A process as happening at the end of the SDLC as a gateway activity.  This is the path to failure–you have to start at the initiation of a project.  In other words, I don’t think the SDLC thing is obvious enough for the constituency.  C&A should be all about security in the SDLC, and I  think we’ve done ourselves a disservice by trying to separate the two.
  • Unity:  Yes, we have all the pieces there, but the document doesn’t flow as a whole yet.  BFD, I’ll get over it soon enough.
  • It all goes back to metrics:  If completed C&A is going to be one of the core metrics that you use (or OMB uses), then it should be THE core document with everything else being a stub off of it.  We have a start, but I don't think it's as fleshed-out as it needs to be.

Side-note for NIST:  C&A is the implementation of the System Security Engineering Process; some of that SSE has a place in 800-37.

The original announcement from NIST is this:

NIST, in cooperation with the Office of the Director of National Intelligence (ODNI), the Department of Defense (DOD), and the Committee on National Security Systems (CNSS), announces the completion of an interagency project to develop a common process to authorize federal information systems for operation. The initial public draft of NIST Special Publication 800-37, Revision 1, Guide for Security Authorization of Federal Information Systems: A Security Lifecycle Approach, is now available for a six-week public comment period. The publication contains the proposed new security authorization process for the federal government (currently commonly referred to as certification and accreditation, or C&A). The new process is consistent with the requirements of the Federal Information Security Management Act (FISMA) and the Office of Management and Budget (OMB) Circular A-130, Appendix III, promotes the concept of near real-time risk management based on continuous monitoring of federal information systems, and more closely couples information security requirements to the Federal Enterprise Architecture (FEA) and System Development Life Cycle (SDLC). The historic nature of the partnership among the Civil, Defense, and Intelligence Communities and the rapid convergence of information security standards and guidelines for the federal government will have a significant impact on the federal government’s ability to protect its information systems and networks. The convergence of security standards and guidelines is forging ahead with the development of a series of new CNSS policies and instructions that closely parallel the NIST security standards and guidelines developed in response to FISMA. The CNSS policies and instructions which address the specific areas of security categorization, security control specification, security control assessment, risk management, and security authorization, coupled with the current NIST publications will provide a more unified information security framework for the federal government and its contracting base. The unified approach to information security is brought together in part by the update to NIST Special Publication 800-37, Revision 1, which provides a common security authorization process and references the NIST and CNSS publications for the national security and non national security communities, respectively. The convergence activities mentioned above along with tighter integration of security requirements into the FEA and SDLC processes will promote more consistent and cost-effective information security and trusted information sharing across the federal government. Comments on the IPD of SP 800-37, Revision 1 should be provided by September 30, 2008 and forwarded to the Computer Security Division, Information Technology Laboratory at NIST or submitted via email to: sec-cert@nist.gov .
 



Posted in Uncategorized

New SP 800-60 is Out, Categorize Yerselves Mo Better

Posted August 18th, 2008 by

While I was slaving away last week, our friends over at NIST published a new version of SP 800-60.  Go check it out at the NIST Pubs Page.

Now for those of you who don't know what 800-60 is, go check out my 3-part special on the Business Reference Model (BRM), a primer on how SP 800-60 aligns FIPS-199 with the BRM, and a post on putting it all together with a catalog of controls.

And oh yeah, the obligatory press reference: Government Computer News.

Data Release Show photo by Discos Konfort.

So deep down inside, you have to be asking one question by now:  “Why do we need SP 800-60?”  Well, 800-60 does the following:

  • Level-sets data criticality across the Government:  Provides a frame of reference for determining criticality, i.e., if my data is more important than this but less important than that, then it's moderate criticality.
  • Counters the tendency to rate system criticality higher than it should be:  Everybody wants to rate their system as high criticality because it’s the safe choice for their career.
  • Protection prioritization:  Helps us point out at a national level the systems that need more protection.
  • Is regulations-based:  The criticality ratings reflect laws and standards.  For example, Privacy Act Data is rated higher for confidentiality.

All things considered, it's a pretty decent system for Government use.
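
Since the bullets above boil down to a simple rule, here's a minimal Python sketch of the FIPS-199 high-water-mark roll-up that SP 800-60's provisional impact levels feed into.  The information types and impact levels below are invented for illustration.

# FIPS-199 "high water mark": each information type gets a provisional impact
# level for confidentiality, integrity, and availability (that's what SP 800-60
# helps you assign), and the system inherits the highest level found anywhere.

IMPACT_ORDER = {"low": 1, "moderate": 2, "high": 3}

def high_water_mark(levels):
    """Return the highest impact level in an iterable of levels."""
    return max(levels, key=lambda lvl: IMPACT_ORDER[lvl])

# Hypothetical information types on a system, with provisional
# (confidentiality, integrity, availability) impact levels.
info_types = {
    "general public web content": ("low", "low", "moderate"),
    "privacy act records":        ("moderate", "moderate", "low"),
    "benefits payment data":      ("moderate", "high", "moderate"),
}

system_category = {
    "confidentiality": high_water_mark(c for c, _, _ in info_types.values()),
    "integrity":       high_water_mark(i for _, i, _ in info_types.values()),
    "availability":    high_water_mark(a for _, _, a in info_types.values()),
}

overall = high_water_mark(system_category.values())
print(system_category)  # {'confidentiality': 'moderate', 'integrity': 'high', 'availability': 'moderate'}
print(overall)          # 'high' -- the level that drives the control baseline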

Now this is where I have a bit of heartburn with GRC tools and data classification in general in the private sector: they classify the wrong things.  How the vendors (not all of them, there is a ton of variation in implementation) want you to categorize your data:

  • HIPAA-regulated
  • PCI-DSS-regulated
  • SOX-regulated
  • All other data types

How your CISO needs to categorize data to keep the business afloat:

  • Data that gets you paid:  If you’re a business, your #1 priority is getting money.  This is your billing/AR/POS data that needs to keep going.
  • Data that keeps you with a product to sell over the next week:  usually ERP data, the stuff that slows down the production line when it's unavailable.
  • Data that people want in order to rip off your customers:  hey, almost all the regulated data (PCI-DSS, HIPAA, etc.) fits in here.
  • Data where people will rip you off:  i.e., your internal financial systems.  Typically this is SOX country.

I guess really it comes down to the differences between compliance and risk, but in this case, one version will keep you from getting fined, the other will keep your business running.
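
To make the two views concrete, here's a rough Python sketch of the same (invented) data inventory bucketed both ways: the vendor's compliance view and the CISO's business-impact view.  The data stores and tags are made up for illustration.

# The same store can carry both a compliance tag (what keeps you from getting
# fined) and a business-criticality tag (what keeps you running); the point is
# that these are different axes, not one list.
data_stores = {
    "billing / point-of-sale": {"compliance": ["PCI-DSS"], "business": "gets you paid"},
    "ERP / production line":   {"compliance": [],          "business": "keeps product moving"},
    "customer health records": {"compliance": ["HIPAA"],   "business": "gets customers ripped off if lost"},
    "internal financials":     {"compliance": ["SOX"],     "business": "gets you ripped off if tampered with"},
}

# The vendor's view: bucket by regulation, everything else is "other".
by_regulation = {}
for name, tags in data_stores.items():
    for reg in tags["compliance"] or ["other"]:
        by_regulation.setdefault(reg, []).append(name)

# The CISO's view: bucket by what the data does for the business.
by_business_impact = {}
for name, tags in data_stores.items():
    by_business_impact.setdefault(tags["business"], []).append(name)

print(by_regulation)
print(by_business_impact)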



Posted in FISMA, NIST
