Ooh, “The Word” is out on S 3474

Posted September 19th, 2008 by

Federal Computer Week: Senate Panel Rejects Weakening S 3474

Gene Schultz: Goodbye FISMA (as We Know It)

Let’s talk through the FCW article first, shall we?   =)

“The measure would amend the original FISMA legislation, which outlined compliance activities for agencies to meet each year. However, many agencies have turned FISMA compliance into a paperwork exercise, Carper said.”

Um, no, I don't get that.  The original FISMA is an information security management law; this bill mostly formalizes the role, responsibility, and authority of the CISO.  They intentionally named it FISMA 2008 to make people think it was amending the original FISMA, but it doesn't do that.

Don’t believe the hype, this will not change the original FISMA, it’s just an addition.

“Carper said CIOs primarily develop and oversee policy, but the CISO handles the daily information security activities. He suggested that a CISO council could have a sunset date of two or three years. If the council demonstrated benefits, it could be extended, Carper said.”

OK, fair enough on the cost and coordination, but what the CISO council objectors don't understand is that CIOs don't know all of the nuts and bolts of security.  That's why this bill makes the CISO a mandatory position: so that the CIO has a subject-matter expert to help them out.  Yes, the profession really is that specialized.

Now for Gene Schultz:

“First and foremost, to comply with this statute involves generating huge amounts of paperwork to document actions (or lack thereof) taken to address the many areas that FISMA describes. A completely ineffective security practice can get high FISMA marks, as has happened numerous times before.”

OK, this is a little lesson on FISMA paperwork:  people are doing 4x what they should be doing for the following reasons:

  • The people doing the writing do not know what they are actually doing
  • The agency’s security program is not mature enough to have shared/common controls
  • In the world of auditors, if it’s not written down, it doesn’t exist
  • CYA purposes–I told you this was a risk

So you think you’re going to do any better with any other framework/law and the same people executing it?

“Two US Senators, Joseph Lieberman of Connecticut and Tom Carper of Delaware, have recently introduced a Senate bill that would render the 2002 version of FISMA obsolete.”

No, to be brutally honest, the old version of FISMA will still be around.  Somebody's been drinking the kool-aid from the lawmakers and the press machine.  If anything, this adds more junk that you can get audited on and an additional layer of paperwork to demonstrate that you have met the provisions of FISMA 2008.

Post No Bills photo by striatic.

Note to our nation's lawmakers: as long as you approach information security from the compliance angle, we as a government are doomed to fail and to turn the whole thing into a checklist activity, because the people who evaluate compliance are auditors, and auditors only know checklists.  It's not a law problem; it's a people and skills problem.

This bill is actually pretty good with the exception of divorcing the mission owners from the security of the systems that support their mission.

However, if you think that you can reduce the compliance trap by adding more things that will end up on a compliance checklist, you have to be kidding yourself or you don’t understand the auditor mentality.

I keep coming back to the same conclusion: the only way the government can win at security is to promote programs that develop people with security skills.  Of course, that isn't as sexy as throwing out a bill that you can claim will make FISMA obsolete.

And finally, for those of you playing along at home, here are the Thomas entry for S 3474, the bill's page on Washington Watch, and the bill's page on GovTrack.



Posted in FISMA | 3 Comments »

Assessment Cases for 800-53A Are Available

Posted August 25th, 2008 by

Ever feel lost and lonely when staring at the business end of an ST&E?  Confounded and confused considering Configuration controls?  Perplexed and Puzzled at Planning procedures?  Anxious or amazed at Audit and Accountability assessments?  Annoyed at aimless alliteration?

NIST has heard your muttered curses and answered them!  (Except the annoying alliteration, which is my fault.)

Now available are the Assessment Cases for Special Publication 800-53A.  The Assessment Cases offer supplemental guidance on assessing security controls found in the recently released SP 800-53A Guide for Assessing the Security Controls in Federal Information Systems (PDF Warning).  These documents are in their Initial Public Draft so be sure to give them a look and provide some feedback.

The Assessment Cases contain consensus recommendations from the Assessment Cases Project on specific actions to perform when assessing security controls.  These specific actions are intended to complement the assessment procedures documented in NIST SP 800-53A.   Yes, you heard that right, Specific Actions.  Less time spent pondering how to “Examine: … other relevant documents or records”.
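For the curious, here's roughly what an assessment case boils down to.  This is just my own sketch of the structure; the field names and the AU-2 example are made up for illustration, not lifted from the actual documents:

```python
from dataclasses import dataclass, field

@dataclass
class AssessmentAction:
    """One specific action an assessor performs (illustrative structure only)."""
    method: str        # "Examine", "Interview", or "Test"
    description: str   # the concrete step the assessor takes

@dataclass
class AssessmentCase:
    """A single assessment case tying a control to concrete assessor actions."""
    control_id: str    # e.g. "AU-2" (Auditable Events)
    objective: str     # the 800-53A assessment objective being satisfied
    actions: list = field(default_factory=list)

# Hypothetical example for an Audit and Accountability control
au2 = AssessmentCase(
    control_id="AU-2",
    objective="Determine if the organization defines the events to be audited.",
    actions=[
        AssessmentAction("Examine", "Review the audit policy for the defined list of auditable events."),
        AssessmentAction("Interview", "Ask the system administrator how auditable events were selected."),
        AssessmentAction("Test", "Generate a defined event on a sample host and confirm it is logged."),
    ],
)
```

The win is that each control ends up with named, concrete steps instead of a shrug and a highlighter.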

The Assessment Cases Project is an inter-agency workgroup headed by DoJ, with members including NIST, DoE, DoT, and ODNI-CIO.  Many thanks to the workgroup's members for their hard work.  You may not be able to hear it, but I am applauding on this side of the keyboard.  And a big thanks to Patrick O'Reilly for pointing me to this wonderful resource.



Posted in FISMA, NIST, What Works | 1 Comment »

Cloud Computing and the Government

Posted August 19th, 2008 by

Post summary: We’re not ready yet culturally.

What spurred this blog post into being is this announcement from ServerVault and Apptis about a Federal Computing Cloud.  I think it’s a pretty ballsy move, and I’ll be watching to see if it works out.

Disclaimer: at one time I managed security for something similar in the managed services world, only it was built account by account, with everything being a one-off.  And yeah, we didn't start our organization the right way, so we had a ton of legacy concepts that we could never shake off, most of them stemming from our commercial background and ways of doing things.

Current Theory on Cloud Computing photo by cote.

The way you make money in the managed services world is on standardization and economy-of-scale.  To us mere mortals, it means the following:

  • Standardized OS builds
  • Shared services where it makes sense
  • Shared services as the option of choice
  • Split your people’s time between clients
  • Up-charge for non-standard configurations
  • Refuse one-off configurations on a case-by-case basis

The last 2 were our downfall.  Always eager to please our clients, our senior leadership would agree to whatever one-offs they felt were necessary for client-relationship purposes, without regard to the increased cost and inefficiency when it came time to implement.
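If you want to see why, here's some back-of-the-napkin math.  The numbers are completely made up; they're just there to show the shape of the problem:

```python
# Toy economy-of-scale math with made-up numbers.
CLIENTS = 20
SHARED_PLATFORM_COST = 200_000   # annual cost of the standardized shared build, spread across everyone
ONE_OFF_COST = 30_000            # extra annual cost to build and maintain one custom, one-off stack

def cost_per_client(num_one_offs: int) -> float:
    """Average annual cost per client when some clients get un-charged-for one-off builds."""
    return (SHARED_PLATFORM_COST + num_one_offs * ONE_OFF_COST) / CLIENTS

for n in (0, 5, 15):
    print(f"{n:>2} one-offs -> ${cost_per_client(n):,.0f} per client per year")
# 0 one-offs  -> $10,000
# 5 one-offs  -> $17,500
# 15 one-offs -> $32,500  (the economy of scale is gone)
```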

Now for those of you out in the non-Government world, let me bring you to the conundrum of the managed services world: shared services only work in limited amounts.  Yes, you can manage your infrastructure better than the Government does, but they'll still not like most of it because, culturally, they expect a custom-built solution that they own.  Yes, it's as simple as managing the client's expectations of ownership versus their cost savings, and I don't think we're over that hurdle yet.

And this is the reason: when it comes to security and cloud computing, the problem is that you’re only as technically literate as your auditors are.  If they don’t understand what the solution is and what the controls are around it, you do not have a viable solution for the public sector.

A “long time ago” (9000 years at least), I created the 2 golden rules for shared infrastructure:

  • One customer cannot see another customer.
  • One customer cannot affect another customer’s level of service.

And the side-rules for shared infrastructure in the public sector:

  • We have a huge set of common controls that you get the documentation for.  It will have my name on it, but you don't have to spend the money to get it done.  (There's a quick sketch of how this inheritance works after this list.)
  • It’s to my benefit to provide you with transparency in how my cloud operates because otherwise, my solution is invalidated by the auditors.
  • Come to us to design a solution; it's cheaper for you that way.  I know how to do it effectively and more cheaply because it's my business to know the economics of my cloud.
  • You have to give up control in some ways in order to get cost savings.
  • There is a line beyond which you cannot change or even see things, because of the 2 golden rules.  The only exception is that I'll tell you how it's made, but you can't see any of the data that goes into my infrastructure.
  • If I let you audit my infrastructure, you’ll want to make changes, which can’t happen because of the 2 golden rules.
  • I’ll be very careful where I put your data because if your mission data spills into my infrastructure, I put myself at extreme risk.
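And since I keep harping on common controls, here's a rough sketch of what that inheritance looks like in practice.  The control IDs are real 800-53 families, but the data structures and the split between provider and tenant are purely illustrative:

```python
# Illustrative only: how a tenant's control set might be assembled from the provider's
# common controls plus the tenant's own system-specific controls.

# Controls the provider implements once and documents for every customer.
provider_common_controls = {
    "PE-3": "Physical access control at the data center",
    "CP-9": "Information system backup",
    "AU-6": "Audit review, analysis, and reporting for shared infrastructure",
}

# Controls the customer still owns for their own application and data.
tenant_specific_controls = {
    "AC-2": "Account management for the application",
    "RA-2": "Security categorization of mission data",
}

def build_control_set(common: dict, specific: dict) -> dict:
    """Merge inherited common controls with system-specific ones; the tenant gets the
    provider's documentation for the former and only has to write the latter."""
    controls = dict(common)      # inherited: documented by the provider
    controls.update(specific)    # system-specific: documented by the tenant
    return controls

ssp_controls = build_control_set(provider_common_controls, tenant_specific_controls)
print(f"{len(provider_common_controls)} controls inherited, "
      f"{len(tenant_specific_controls)} left for the tenant to document.")
```

The point: the tenant documents a handful of controls instead of the whole catalog, and the provider's paperwork has the provider's name on it.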

So, are ServerVault and Apptis able to win in their cloud computing venture?  Honestly, I don’t know.  I do think that when somebody finally figures out how to do cloud computing with the Federal Government, it will pay off nicely.

I think Apptis might be spreading themselves fairly thin at this point; rumor has it they were having some problems last year.  I think ServerVault is in their comfort zone and didn't have to do too much for this service offering.

I can't help but think that there's something missing in all of this, and that something is partnering with a sponsoring agency on a Line of Business.  FEA comes to mind.



Posted in What Doesn't Work, What Works | 1 Comment »

Draft of SP 800-37 R1 is Out for Public Review

Posted August 19th, 2008 by

Go check it out (caveat: .pdf file) and send your comments to sec-cert@nist.gov

I’ve just given it a glance and here are some things that I’ve noticed:

  • Section on security in SDLC
  • Incorporates some of the concepts in SP 800-39 about enterprise-wide risk
  • Section on common controls
  • The process has remained pretty much the same but now references all the other core documents

Where I see this revision’s weaknesses:

  • It's still possible to view the C&A process as happening at the end of the SDLC as a gateway activity.  This is the path to failure; you have to start at the initiation of a project.  In other words, I don't think the SDLC tie-in is obvious enough for the constituency.  C&A should be all about security in the SDLC, and I think we've done ourselves a disservice by trying to separate the two.
  • Unity:  Yes, we have all the pieces there, but the document doesn’t flow as a whole yet.  BFD, I’ll get over it soon enough.
  • It all goes back to metrics:  If completed C&A is going to be one of the core metrics that you use (or OMB uses), then it should be THE core document, with everything else being a stub off of it.  We have a start, but I don't think it's as fleshed-out as it needs to be.

Side-note for NIST:  C&A is the implementation of the System Security Engineering process, so some of that SSE material has a place in 800-37.

The original announcement from NIST is this:

NIST, in cooperation with the Office of the Director of National Intelligence (ODNI), the Department of Defense (DOD), and the Committee on National Security Systems (CNSS), announces the completion of an interagency project to develop a common process to authorize federal information systems for operation. The initial public draft of NIST Special Publication 800-37, Revision 1, Guide for Security Authorization of Federal Information Systems: A Security Lifecycle Approach, is now available for a six-week public comment period. The publication contains the proposed new security authorization process for the federal government (currently commonly referred to as certification and accreditation, or C&A). The new process is consistent with the requirements of the Federal Information Security Management Act (FISMA) and the Office of Management and Budget (OMB) Circular A-130, Appendix III, promotes the concept of near real-time risk management based on continuous monitoring of federal information systems, and more closely couples information security requirements to the Federal Enterprise Architecture (FEA) and System Development Life Cycle (SDLC). The historic nature of the partnership among the Civil, Defense, and Intelligence Communities and the rapid convergence of information security standards and guidelines for the federal government will have a significant impact on the federal government’s ability to protect its information systems and networks. The convergence of security standards and guidelines is forging ahead with the development of a series of new CNSS policies and instructions that closely parallel the NIST security standards and guidelines developed in response to FISMA. The CNSS policies and instructions which address the specific areas of security categorization, security control specification, security control assessment, risk management, and security authorization, coupled with the current NIST publications will provide a more unified information security framework for the federal government and its contracting base. The unified approach to information security is brought together in part by the update to NIST Special Publication 800-37, Revision 1, which provides a common security authorization process and references the NIST and CNSS publications for the national security and non national security communities, respectively. The convergence activities mentioned above along with tighter integration of security requirements into the FEA and SDLC processes will promote more consistent and cost-effective information security and trusted information sharing across the federal government. Comments on the IPD of SP 800-37, Revision 1 should be provided by September 30, 2008 and forwarded to the Computer Security Division, Information Technology Laboratory at NIST or submitted via email to: sec-cert@nist.gov .
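Since "near real-time risk management based on continuous monitoring" is the phrase everybody will latch onto, here's a rough sketch of the basic idea.  The schedule and the data structures are hypothetical; nothing in the draft prescribes them:

```python
import datetime

# Hypothetical example: rotate through controls so each one is reassessed on a cycle,
# instead of waiting three years for the next full C&A.
controls = ["AC-2", "AU-6", "CM-6", "CP-9", "RA-5"]
REASSESS_EVERY_DAYS = 30

def controls_due(last_assessed: dict, today: datetime.date) -> list:
    """Return the controls whose last assessment is older than the monitoring cycle."""
    due = []
    for control in controls:
        last = last_assessed.get(control)
        if last is None or (today - last).days >= REASSESS_EVERY_DAYS:
            due.append(control)
    return due

last_assessed = {"AC-2": datetime.date(2008, 8, 1), "AU-6": datetime.date(2008, 7, 1)}
print(controls_due(last_assessed, datetime.date(2008, 8, 19)))
# ['AU-6', 'CM-6', 'CP-9', 'RA-5'] -- these get looked at this cycle, and the results feed the risk picture
```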
 



Posted in Uncategorized | 1 Comment »

New SP 800-60 is Out, Categorize Yerselves Mo Better

Posted August 18th, 2008 by

While I was slaving away last week, our friends over at NIST published a new version of SP 800-60.  Go check it out at the NIST Pubs Page.

Now for those of you who don't know what 800-60 is, go check out my 3-part special on the Business Reference Model (BRM), a primer on how SP 800-60 aligns FIPS-199 with the BRM, and a post on putting it all together with a catalog of controls.

And oh yeah, the obligatory press reference: Government Computer News.

Data Release Show photo by Discos Konfort.

So deep down inside, you have to be asking one question by now:  “Why do we need SP 800-60?”  Well, 800-60 does the following:

  • Level-sets data criticality across the Government:  Provides a frame of reference for determining criticality, i.e., if my data is more important than this but less important than that, then it's a moderate for criticality.  (There's a quick sketch of this high-water-mark logic after the list.)
  • Counters the tendency to rate system criticality higher than it should be:  Everybody wants to rate their system as high criticality because it’s the safe choice for their career.
  • Protection prioritization:  Helps us point out at a national level the systems that need more protection.
  • Is regulations-based:  The criticality ratings reflect laws and standards.  For example, Privacy Act Data is rated higher for confidentiality.
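Here's that high-water-mark sketch.  The information types and provisional impact levels below are made up for illustration; in real life you pull them from the 800-60 Volume II tables:

```python
LEVELS = {"low": 1, "moderate": 2, "high": 3}

# Hypothetical information types with provisional (confidentiality, integrity, availability)
# impact levels.
info_types = {
    "public web content":    ("low", "low", "low"),
    "privacy act records":   ("moderate", "moderate", "low"),
    "benefits disbursement": ("moderate", "high", "moderate"),
}

def high_water_mark(values):
    """FIPS-199 style aggregation: take the highest impact level present."""
    return max(values, key=lambda v: LEVELS[v])

def categorize(types: dict) -> dict:
    """Per-objective system category is the high water mark across all information types."""
    c, i, a = zip(*types.values())
    category = {
        "confidentiality": high_water_mark(c),
        "integrity": high_water_mark(i),
        "availability": high_water_mark(a),
    }
    # Overall impact level: high water mark across the three objectives.
    category["overall"] = high_water_mark(category.values())
    return category

print(categorize(info_types))
# {'confidentiality': 'moderate', 'integrity': 'high', 'availability': 'moderate', 'overall': 'high'}
```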

All things considered, it's a pretty decent system for Government use.

Now this is where I have a bit of heartburn with GRC tools and data classification in general in the private sector: they classify the wrong things.  Here's how the vendors (not all of them, there is a ton of variation in implementation) want you to categorize your data:

  • HIPAA-regulated
  • PCI-DSS-regulated
  • SOX-regulated
  • All other data types

How your CISO needs to categorize data to keep the business afloat:

  • Data that gets you paid:  If you’re a business, your #1 priority is getting money.  This is your billing/AR/POS data that needs to keep going.
  • Data that keeps you with a product to sell over the next week:  usually ERP data, the stuff that slows down the production line when it's not available.
  • Data that people want to rip off your customers:  hey, almost all the regulated data (PCI-DSS, HIPAA, etc) fits in here.
  • Data where people will rip you off:  i.e., your internal financial systems.  Typically this is SOX country.

I guess really it comes down to the differences between compliance and risk, but in this case, one version will keep you from getting fined, the other will keep your business running.
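If you wanted both views side by side, it might look something like this.  The categories and tiers are just my illustration of the argument above, not any kind of standard:

```python
# Illustrative only: tag each data set with both the compliance driver (keeps you from
# getting fined) and the business tier (keeps the lights on), then prioritize protection
# by business tier first.
data_sets = [
    {"name": "billing / AR / POS",      "business_tier": 1, "compliance": ["PCI-DSS"]},
    {"name": "ERP / production line",   "business_tier": 2, "compliance": []},
    {"name": "customer health records", "business_tier": 3, "compliance": ["HIPAA"]},
    {"name": "internal financials",     "business_tier": 4, "compliance": ["SOX"]},
]

# A GRC tool's default view: group by regulation and ignore everything unregulated.
regulated_only = [d["name"] for d in data_sets if d["compliance"]]

# The CISO's view: protect in the order the business bleeds, regulated or not.
by_business_impact = sorted(data_sets, key=lambda d: d["business_tier"])

print("Compliance view:", regulated_only)
print("Business view:  ", [d["name"] for d in by_business_impact])
```

Notice the ERP data drops off the compliance view entirely, even though losing it for a week hurts more than most fines.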



Posted in FISMA, NIST | No Comments »

C&A Seminar in August, Instructor-to-Coolness Ratio Goes Up!

Posted July 28th, 2008 by

Potomac Forum is having a 2-day C&A seminar on August 6th and 7th.  It will be unusually good this time because I won't be there to drag everybody down; I'll be on the road for some training.  =)  Anyway, check it out and say hi to my fellow instructors for me.



Posted in FISMA, Speaking | 1 Comment »
