GSA Looking for a Few Good Tools

Posted September 2nd, 2008

I like SmartBuy; I’ve talked about it before.  It’s a software bulk-purchase program sponsored by GSA, and the more types of software products it covers, the better for the people who have to depend on this stuff.

So I was doing my usual beginning-of-the-week perusal of upcoming contracts when something interesting caught my eye:  GSA is looking for “Situational Awareness and Incident Response” (SAIR) software to set up a blanket purchase agreement for SmartBuy.

What they mean by SAIR (according to the pre-RFP information) is the following:

  • Baseline Configuration Management
  • Network Mapping
  • Vulnerability Management

Really, think something along the lines of FDCC/SCAP-aware tools to manage IT assets.  I’m not sure how the incident response piece fits in, but OK, I’ll go along with you here.  It makes sense if you stop and think about it: we have an FDCC mandate from OMB, and now we’re looking for the tools to help with it.  I mentioned that FDCC without automation was futile almost 9000 years ago.
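For the non-wonks, here’s roughly what “SCAP-aware” boils down to in practice: the scanners spit out their results in XCCDF, an XML format, and everything downstream just parses it.  A minimal sketch in Python, with the caveats that the file name is a placeholder and the namespace URI varies by XCCDF version (1.1 assumed here), so treat this as illustrative and not any vendor’s API:

    import xml.etree.ElementTree as ET
    from collections import Counter

    # XCCDF namespace is version-dependent; 1.1 is assumed here.
    NS = "{http://checklists.nist.gov/xccdf/1.1}"

    def tally_scan(path):
        """Count the pass/fail/etc. outcomes in an XCCDF results file."""
        counts = Counter()
        for rule_result in ET.parse(path).iter(NS + "rule-result"):
            result = rule_result.find(NS + "result")
            if result is not None:
                counts[result.text] += 1
        return counts

    print(tally_scan("fdcc-scan.xml"))  # "fdcc-scan.xml" is a made-up name

The point being that once the checklist is machine-readable, the tally is trivial; doing FDCC by hand is what’s futile.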

I know I have blog readers who make similar software; drop me a message if you need more details.

And for my daily dose of snarkiness:  it’s good to see that GSA has come such a long way in my lifetime from being just the provider of Skilcraft pens and Simple Green.  =)




Audit Requirements come to LOLCATS

Posted August 28th, 2008

Pet peeve of just about every CISO in existence:  the so-called “audit requirements”.  What the auditors really mean to say is “It’s on the checklist, so it has to be true, just do what I say”.

Without traceability to the actual requirement, items on a checklist are just that: items on a checklist.

Anyway, on to the lulz:

[LOLcat image]




Cloud Computing and the Government

Posted August 19th, 2008

Post summary: We’re not ready yet culturally.

What spurred this blog post into being is this announcement from ServerVault and Apptis about a Federal Computing Cloud.  I think it’s a pretty ballsy move, and I’ll be watching to see if it works out.

The disclaimer being that at one time I managed security for something similar in a managed-services world, only it was built account-by-account, with everything being a one-off.  And yeah, we didn’t start our organization the right way, so we had a ton of legacy concepts that we could never shake off, most of them holdovers from our commercial background and ways of doing things.

[Image: “Current Theory on Cloud Computing,” photo by cote.]

The way you make money in the managed-services world is standardization and economy of scale.  To us mere mortals, that means the following:

  • Standardized OS builds
  • Shared services where it makes sense
  • Shared services as the option of choice
  • Split your people’s time between clients
  • Up-charge for non-standard configurations
  • Refuse one-off configurations on a case-by-case basis

The last 2 were our downfall.  Always eager to please our clients, our senior leadership would agree to whatever one-offs they felt were necessary for client-relationship purposes, but without regard to the increased costs and inefficiency when it came time to implement.

Now for those of you out in the non-Government world, let me bring you to the conundrum of the managed-services world:  shared services only works in limited amounts.  Yes, you can manage your infrastructure better than the Government does, but they still won’t like most of it because, culturally, they expect a custom-built solution that they own.  Yes, it’s as simple as managing the client’s expectations of ownership vs. their cost savings, and I don’t think we’re over that hurdle yet.

And this is the reason: when it comes to security and cloud computing, you’re only as good as your auditors are technically literate.  If they don’t understand what the solution is and what the controls around it are, you do not have a viable solution for the public sector.

A “long time ago” (9000 years at least), I created the 2 golden rules for shared infrastructure (there’s a toy sketch in code right after the list):

  • One customer cannot see another customer.
  • One customer cannot affect another customer’s level of service.
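To make the 2 golden rules concrete, here’s a toy sketch of my own (not ServerVault’s, Apptis’s, or anybody else’s actual design): lookups are scoped to the calling tenant, which covers rule 1, and a per-tenant request quota covers rule 2.

    from collections import defaultdict

    class SharedService:
        """Toy multi-tenant store illustrating the 2 golden rules."""

        def __init__(self, quota=1000):
            self._data = defaultdict(dict)   # tenant_id -> {key: value}
            self._used = defaultdict(int)    # tenant_id -> requests consumed
            self._quota = quota

        def _spend(self, tenant_id):
            # Rule 2: one customer cannot affect another customer's level
            # of service -- each tenant draws down only its own quota.
            if self._used[tenant_id] >= self._quota:
                raise RuntimeError("tenant %r is over its quota" % tenant_id)
            self._used[tenant_id] += 1

        def put(self, tenant_id, key, value):
            self._spend(tenant_id)
            self._data[tenant_id][key] = value

        def get(self, tenant_id, key):
            # Rule 1: one customer cannot see another customer -- every
            # lookup is scoped to the caller's own tenant; there is no
            # cross-tenant query path at all.
            self._spend(tenant_id)
            return self._data[tenant_id].get(key)

In a real managed-services shop the same two rules show up as VLANs, separate virtual machines, and rate limits instead of a dictionary and a counter, but the logic is the same.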

And the side-rules for shared infrastructure in the public sector:

  • We have a huge set of common controls that you get the documentation for.  It will have my name on it, but you don’t have to spend the money to get it done.
  • It’s to my benefit to provide you with transparency in how my cloud operates because otherwise, my solution is invalidated by the auditors.
  • Come to us to design a solution; it’s cheaper for you that way.  I know how to do it more effectively and cheaply because it’s my business to know the economics of my cloud.
  • You have to give up control in some ways in order to get cost savings.
  • There is a line beyond which you cannot change or view anything, because of the 2 golden rules.  The only exception is that I tell you how it’s made, but you can’t see any of the data that goes into my infrastructure.
  • If I let you audit my infrastructure, you’ll want to make changes, which can’t happen because of the 2 golden rules.
  • I’ll be very careful where I put your data because if your mission data spills into my infrastructure, I put myself at extreme risk.

So, are ServerVault and Apptis able to win in their cloud computing venture?  Honestly, I don’t know.  I do think that when somebody finally figures out how to do cloud computing with the Federal Government, it will pay off nicely.

I think Apptis might be spreading themselves fairly thin at this point; rumor has it they were having some problems last year.  I think ServerVault is in their comfort space and didn’t have to do too much for this service offering.

I can’t help but think that there’s something missing in all of this, and that something is partnering with a sponsoring agency on a Line of Business.  FEA comes to mind.




Draft of SP 800-37 R1 is Out for Public Review

Posted August 19th, 2008

Go check it out (caveat: .pdf file) and send your comments to sec-cert@nist.gov.

I’ve just given it a glance and here are some things that I’ve noticed:

  • Section on security in SDLC
  • Incorporates some of the concepts in SP 800-39 about enterprise-wide risk
  • Section on common controls
  • The process has remained pretty much the same but now references all the other core documents

Where I see this revision’s weaknesses:

  • It’s still possible to view the C&A process as happening at the end of the SDLC as a gateway activity.  This is the path to failure; you have to start at the initiation of a project.  In other words, I don’t think the SDLC thing is obvious enough for the constituency.  C&A should be all about security in the SDLC, and I think we’ve done ourselves a disservice by trying to separate the two.
  • Unity:  Yes, we have all the pieces there, but the document doesn’t flow as a whole yet.  BFD, I’ll get over it soon enough.
  • It all goes back to metrics:  If completed C&A is going to be one of the core metrics that you use (or OMB uses), then this should be THE core document, with everything else being a stub off of it.  We have a start, but I don’t think it’s as fleshed out as it needs to be.

Side-note for NIST:  C&A is the implementation of the System Security Engineering process, and some of that SSE has a place in 800-37.

The original announcement from NIST is this:

NIST, in cooperation with the Office of the Director of National Intelligence (ODNI), the Department of Defense (DOD), and the Committee on National Security Systems (CNSS), announces the completion of an interagency project to develop a common process to authorize federal information systems for operation. The initial public draft of NIST Special Publication 800-37, Revision 1, Guide for Security Authorization of Federal Information Systems: A Security Lifecycle Approach, is now available for a six-week public comment period. The publication contains the proposed new security authorization process for the federal government (currently commonly referred to as certification and accreditation, or C&A). The new process is consistent with the requirements of the Federal Information Security Management Act (FISMA) and the Office of Management and Budget (OMB) Circular A-130, Appendix III, promotes the concept of near real-time risk management based on continuous monitoring of federal information systems, and more closely couples information security requirements to the Federal Enterprise Architecture (FEA) and System Development Life Cycle (SDLC).

The historic nature of the partnership among the Civil, Defense, and Intelligence Communities and the rapid convergence of information security standards and guidelines for the federal government will have a significant impact on the federal government’s ability to protect its information systems and networks. The convergence of security standards and guidelines is forging ahead with the development of a series of new CNSS policies and instructions that closely parallel the NIST security standards and guidelines developed in response to FISMA. The CNSS policies and instructions which address the specific areas of security categorization, security control specification, security control assessment, risk management, and security authorization, coupled with the current NIST publications will provide a more unified information security framework for the federal government and its contracting base.

The unified approach to information security is brought together in part by the update to NIST Special Publication 800-37, Revision 1, which provides a common security authorization process and references the NIST and CNSS publications for the national security and non national security communities, respectively. The convergence activities mentioned above along with tighter integration of security requirements into the FEA and SDLC processes will promote more consistent and cost-effective information security and trusted information sharing across the federal government. Comments on the IPD of SP 800-37, Revision 1 should be provided by September 30, 2008 and forwarded to the Computer Security Division, Information Technology Laboratory at NIST or submitted via email to: sec-cert@nist.gov.




New SP 800-60 is Out, Categorize Yerselves Mo Better

Posted August 18th, 2008

While I was slaving away last week, our friends over at NIST published a new version of SP 800-60.  Go check it out at the NIST Pubs Page.

Now for those of you who don’t know what 800-60 is, go check out my 3-part special on the Business Reference Model (BRM), a primer on how SP 800-60 aligns FIPS-199 with the BRM, and a post on putting it all together with a catalog of controls.

And oh yeah, the obligatory press reference: Government Computer News.

[Image: “Data Release Show,” photo by Discos Konfort.]

So deep down inside, you have to be asking one question by now:  “Why do we need SP 800-60?”  Well, 800-60 does the following (there’s a quick sketch of the categorization math after the list):

  • Level-sets data criticality across the Government:  Provides a frame of reference for determining criticality; i.e., if my data is more important than this but less important than that, then it’s a moderate for criticality.
  • Counters the tendency to rate system criticality higher than it should be:  Everybody wants to rate their system as high criticality because it’s the safe choice for their career.
  • Protection prioritization:  Helps us point out at a national level the systems that need more protection.
  • Is regulations-based:  The criticality ratings reflect laws and standards.  For example, Privacy Act Data is rated higher for confidentiality.
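For the curious, the arithmetic that SP 800-60 feeds into is the FIPS-199 high-water mark: for each security objective, the system inherits the highest impact level of any information type it carries.  A quick sketch (the information types and levels below are invented examples; the real provisional levels live in the SP 800-60 appendices):

    RANK = {"low": 0, "moderate": 1, "high": 2}

    def high_water_mark(info_types):
        """FIPS-199: per objective, take the highest impact level found
        across all of the system's information types."""
        overall = {"confidentiality": "low", "integrity": "low",
                   "availability": "low"}
        for levels in info_types.values():
            for objective, level in levels.items():
                if RANK[level] > RANK[overall[objective]]:
                    overall[objective] = level
        return overall

    # Invented example types; real ones come from the SP 800-60 appendices.
    system = {
        "privacy_act_records": {"confidentiality": "moderate",
                                "integrity": "moderate",
                                "availability": "low"},
        "public_web_content": {"confidentiality": "low",
                               "integrity": "moderate",
                               "availability": "low"},
    }
    print(high_water_mark(system))
    # confidentiality: moderate, integrity: moderate, availability: low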

All things considered, it’s a pretty decent system for Government use.

Now this is where I have a bit of heartburn with GRC tools and data classification in general in the private sector: they classify the wrong things.  Here’s how the vendors (not all of them; there is a ton of variation in implementation) want you to categorize your data:

  • HIPAA-regulated
  • PCI-DSS-regulated
  • SOX-regulated
  • All other data types

How your CISO needs to categorize data to keep the business afloat:

  • Data that gets you paid:  If you’re a business, your #1 priority is getting money.  This is your billing/AR/POS data that needs to keep going.
  • Data that keeps you with a product to sell over the next week:  usually ERP data, stuff that slows down the production line.
  • Data that people want to rip off your customers:  hey, almost all the regulated data (PCI-DSS, HIPAA, etc.) fits in here.
  • Data where people will rip you off:  i.e., your internal financial systems.  Typically this is SOX country.

I guess it really comes down to the differences between compliance and risk, but in this case, one version will keep you from getting fined and the other will keep your business running.




Legacy Systems: Where the Catalog Falls Apart and LOLCATS Roam

Posted July 31st, 2008

Let’s face it, compliance in IT security is a myth.  Compliance in IT security with legacy systems is like a chupacabra riding a white unicorn chasing a leprechaun while waving Excalibur.  And the auditors just shake their heads and wonder why you can’t just comply.

Anyway, on to the LOLCATZ (note that I’m getting all creative-stylie with haikus this week; must be something in the beer last night):

[LOLcat image]



