Lolcats take on Laws, Sausage, Cyberwhatzits, and PCI

Posted April 23rd, 2009 by

I just love how the security twit community this month is turning into a bunch of public policy wonks before my very eyes.  First we have S.773, the Cybersecurity Act of 2009 and then the House hearings on the effectiveness of PCI.

“To retain respect for sausages and laws, one must not watch them in the making.” –Otto von Bismarck (ref: Wikipedia)


[lolcat image]



Posted in IKANHAZFIZMA, Public Policy | 2 Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 1

Posted April 14th, 2009 by

Rybolov Note: this is such a long blog post that I’m breaking it down into parts.  Go read the bill here. Go read part two here. Go read part three here. Go read part four here. Go read part 5 here. =)

So the Library of Congress finally got S.773 up on http://thomas.loc.gov/.  For those of you who have been hiding under a rock, this is the Cybersecurity Act of 2009 and is a bill introduced by Senators Rockefeller and Snowe and, depending on your political slant, will allow us to “sock it to the hackers and send them to a federal pound-you-in-the-***-prison” or “vastly erode our civil liberties”.

A little bit of pre-reading is in order:

Timing: Now let’s talk about the timing of this bill.  There is the 60-day Cybersecurity Review that is supposed to be coming out Real Soon Now (TM).  This bill is an attempt by Congress to head it off at the pass.

Rumor mill says not only that the Cybersecurity Review will be unveiled at RSA (possible, but strange) but also that it won’t bring anything new to the debate (more possible, but then again, nothing’s really new; we’ve known about this stuff for at least a decade).

Overall Comments:

This bill is big.  It really is an omnibus Cybersecurity Act and has just about everything you could want and more.  There’s a fun way of doing things in the Government, and it goes something like this: ask for 300% of what you need so that you will end up with 80%.  And I see this bill is taking this approach to heart.

Pennsylvania Ave – Old Post Office to the Capitol at Night photo by wyntuition.

And now for the good, bad, and ugly:

SEC. 2. FINDINGS. This section is primarily a summary of testimony that has been delivered over the past couple of years.  It really serves as justification for the rest of the bill.  It is a little bit on the FUD side of things (as in “omigod, they put ‘Cyber-Katrina’ in a piece of legislation”), but overall it’s pretty balanced and what you would expect for a bill.  Bottom line here is that we depend on our data and the networks that carry it.  Even if you don’t believe in Cyberwar (I don’t really believe in Cyberwar unless it’s just one facet of combined arms warfare), you can probably agree that the costs of insecurity on a macroeconomic scale need to be looked at and defended against, and our dependency on the data and networks is only going to increase.

No self-respecting security practitioner will like this section, but politicians will eat it up.  Relax, guys, you’re not the intended audience.

Verdict: Might as well keep this in there, it’s plot development without any requirements.

SEC. 3. CYBERSECURITY ADVISORY PANEL. This section creates a Cybersecurity Advisory Panel made up of Federal Government, private sector, academia, and state and local government.  This is pretty typical so far.  The interesting thing to me is “(7) whether societal and civil liberty concerns are adequately addressed”… in other words, are we balancing security with citizens’, corporations’, and states’ rights?  More to come on this further down in the bill.

Verdict: Will bring a minimal cost in Government terms.  I’m very hesitant to create new committees.  But yeah, this can stay.

SEC. 4. REAL-TIME CYBERSECURITY DASHBOARD. This section is very interesting to me.  On one hand, it’s what we do at the enterprise level for most companies.  On the other hand, this is specific to the Commerce Department –“Federal Government information systems and networks managed by the Department of Commerce.”  The first reading of this is the internal networks that are internal to Commerce, but then why is this not handed down to all agencies?  I puzzled on this and did some research until I remembered that Commerce, through NTIA, runs DNS, and Section 8 contains a review of the DNS contracts.

Verdict: I think this section needs a little bit of rewording so that the scope is clearer, but sure, a dashboard is pretty benign; it’s the implied tasks to make a dashboard function (i.e., proper management of IT resources and IT security) that are going to be the hard parts.  Rescope the dashboard and explicitly say what kind of information it needs to address and who should receive it.

SEC. 5. STATE AND REGIONAL CYBERSECURITY ENHANCEMENT PROGRAM. This section calls for Regional Cybersecurity Centers, something along the lines of what we call “Centers of Excellence” in the private sector.  This section is interesting to me, mostly because of how vague it seemed the first time I read it, but the more times I look at it, I go “yeah, that’s actually a good idea”.  What this section tries to do is to bridge the gap between the standards world that is NIST and the people outside of the beltway–the “end-users” of the security frameworks, standards, tools, methodologies, what-the-heck-ever-you-want-to-call-them.  Another interesting thing about this is that while the proponent department is Commerce, NIST is part of Commerce, so it’s not as left-field as you might think.

Verdict: While I think this section is going to take a long time to come to fruition (5+ years before any impact is seen), I see that Regional Cybersecurity Centers, if properly funded and executed, can have a very significant impact on the rest of the country.  It needs to happen, only I don’t know what the cost is going to be, and that’s the part that scares me.

SEC. 6. NIST STANDARDS DEVELOPMENT AND COMPLIANCE. This is good.  Basically this section provides a mandate for NIST to develop a series of standards.  Some of these have been sitting around for some time in various incarnations; I doubt that anyone would disagree that these need to be done.

  1. CYBERSECURITY METRICS RESEARCH:  Good stuff.  Yes, this needs help.  NIST are the people to do this kind of research.
  2. SECURITY CONTROLS:  Already existing in SP 800-53.  Depending on interpretation, this either extends the scope and language of the catalog of controls to non-Federal IT systems or possibly creates a fork of the controls catalog.
  3. SOFTWARE SECURITY:  I guess if it’s in a law, it has come of age.  This is one of the things that NIST has wanted to do for some time but they haven’t had the manpower to get involved in this space.
  4. SOFTWARE CONFIGURATION SPECIFICATION LANGUAGE: Part of SCAP.  The standard is there, it just needs to be extended to various pieces of software.
  5. STANDARD SOFTWARE CONFIGURATION:  This is the NIST configuration checklist program à la SP 800-70.  I think NIST ran short on manpower for this also and resorted back to pointing at the DISA STIGs and FDCC.  This so needs further development into a uniform set of standards and then, here’s the key, to be rolled back upstream to the software vendors so they ship their products pre-configured.
  6. VULNERABILITY SPECIFICATION LANGUAGE: Sounds like SCAP.
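Items 4 through 6 above all boil down to the same SCAP idea: express configuration and vulnerability checks as machine-readable data rather than prose, so any tool can evaluate them uniformly. A toy sketch of that idea, with rule names and values invented for illustration (not taken from any actual SCAP or FDCC benchmark):

```python
# Toy illustration of a machine-readable configuration specification:
# benchmark rules expressed as data, evaluated uniformly against a
# system's state. Rule names/values are invented for illustration.
RULES = [
    {"key": "PasswordMinLength", "op": "ge", "value": 12},
    {"key": "TelnetEnabled", "op": "eq", "value": False},
]

def evaluate(rules, system_state):
    """Return the list of rule keys the system fails."""
    ops = {"ge": lambda a, b: a >= b, "eq": lambda a, b: a == b}
    failures = []
    for rule in rules:
        actual = system_state.get(rule["key"])
        # A missing setting counts as a failure, like an unscanned item.
        if actual is None or not ops[rule["op"]](actual, rule["value"]):
            failures.append(rule["key"])
    return failures

state = {"PasswordMinLength": 8, "TelnetEnabled": False}
print(evaluate(RULES, state))  # → ['PasswordMinLength']
```

The payoff is exactly what the bill is after: the benchmark is data, so the same rules can be shipped upstream to vendors, audited against, or swapped out without rewriting the tooling.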

Now for the “gotchas”:

(d) COMPLIANCE ENFORCEMENT- The Director shall–

(1) enforce compliance with the standards developed by the Institute under this section by software manufacturers, distributors, and vendors; and

(2) shall require each Federal agency, and each operator of an information system or network designated by the President as a critical infrastructure information system or network, periodically to demonstrate compliance with the standards established under this section.

This section basically does 2 things:

  • Mandates compliance for vendors and distributors with the NIST standards listed above.  Surprised this hasn’t been talked about elsewhere.  This clause suffers from scope problems because if you interpret it BSOFH-style, you can take it to mean that anybody who sells a product, regardless of who’s buying, has to sell a securely-configured version.  I.e., I can’t sell XP to blue-haired grandmothers unless I have something like an FDCC variant installed on it.  I mostly agree with this in the security sense but it’s a serious culture shift in the practical sense.
  • Mandates an auditing scheme for Federal agencies and critical infrastructure.  Everybody’s talked about this, saying that since designation of critical infrastructure is not defined, this is left at the discretion of the Executive Branch.  This isn’t as wild-west as the bill’s opponents want it to seem; there is a ton of groundwork laid out in HSPD-7.  But yeah, HSPD-7 is an executive directive and can be changed “at the whim” of the President.  And yes, this is auditing by Commerce, which has some issues in that Commerce is not equipped to deal with IT security auditing.  More on this in a later post.

Verdict: The standards part is already happening today; this section just codifies it and justifies NIST’s research.  Don’t task Commerce with enforcement of NIST standards; it leads down all sorts of inappropriate roads.



Posted in Public Policy, What Doesn't Work, What Works | 7 Comments »

NIST Framework for FISMA Dates Announced

Posted April 10th, 2009 by

Some of my friends (and maybe myself) will be teaching the NIST Framework for FISMA in May and June with Potomac Forum.   This really is an awesome program.  Some highlights:

  • Attendance is limited to Government employees only so that you can talk openly with your peers.
  • Be part of a cohort that trains together over the course of a month.
  • The course is 5 Fridays so that you can learn something then take it back to work the next week.
  • We have a Government speaker every week, from the NIST FISMA guys to agency CISOs and CIOs.
  • No pitching, no marketing, no product placement (OK, maybe we’ll go through DoJ’s CSAM but only as an example of what kinds of tools are out there), no BS.

See you all there!



Posted in NIST, Speaking | 1 Comment »

FIPS and the Linux Kernel

Posted March 5th, 2009 by

Recently I was building a new kernel for my firewall and noticed an interesting new option in the Cryptographic API: “FIPS 200 compliance”.

You can imagine how very interesting and somewhat confusing this is to a stalwart FISMA practitioner. Reading through FIPS 200 it’s hard to find mention of cryptography, much less a technical specification that could be implemented in the Linux kernel. FIPS 140, FIPS 197, FIPS 186, FIPS 46 and FIPS 180 standards would be natural fits in the Cryptographic API but FIPS 200? The kernel help description didn’t clear things up:

CONFIG_CRYPTO_FIPS:

This options enables the fips boot option which is
required if you want to system to operate in a FIPS 200
certification. You should say no unless you know what
this is.

Symbol: CRYPTO_FIPS [=n]
Prompt: FIPS 200 compliance
Defined at crypto/Kconfig:24
Depends on: CRYPTO
Location:
-> Cryptographic API (CRYPTO [=y])
Selected by: CRYPTO_ANSI_CPRNG && CRYPTO

Given that examining the kernel code was a little beyond my ken and I couldn’t test to discover what it did, I turned to the third of the 800-53A assessment methods: interview. A little digging on kernel.org turned up the man behind this kernel magic, Neil Horman. He was able to shed some light on what is called the fips_enabled flag.

As it turns out, the FIPS 200 compliance function wasn’t as exciting as I’d hoped, but it does point to interesting future possibilities.

So what does it do? In the words of Neil Horman, it is a “flag for determining if we need to be operating in some fips_compliant mode (without regard to the specific criteria)”. This means it is sort of a placeholder for future developments so the kernel can operate in a mode that uses a FIPS 140-2 cryptographic module.
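If you want to poke at this yourself, both the build-time option and the runtime flag are visible from userspace. A minimal sketch, assuming the usual Linux locations (/proc/config.gz is only present if the kernel was built with CONFIG_IKCONFIG_PROC; on other systems the config lives under /boot):

```python
# Sketch: check whether the running kernel was built with
# CONFIG_CRYPTO_FIPS and whether the fips_enabled flag is set
# (i.e., the kernel was booted with fips=1).
import gzip
import os
import platform

def kernel_config_has_fips():
    """Look for CONFIG_CRYPTO_FIPS=y in the kernel config, if exposed."""
    candidates = [
        "/proc/config.gz",
        "/boot/config-%s" % platform.release(),
    ]
    for path in candidates:
        if not os.path.exists(path):
            continue
        opener = gzip.open if path.endswith(".gz") else open
        with opener(path, "rt") as f:
            for line in f:
                if line.strip() == "CONFIG_CRYPTO_FIPS=y":
                    return True
        return False
    return None  # config not exposed on this system

def fips_mode_enabled():
    """The runtime flag toggled by the 'fips=1' boot option."""
    try:
        with open("/proc/sys/crypto/fips_enabled") as f:
            return f.read().strip() == "1"
    except OSError:
        return False
```

On a stock kernel without fips=1 on the command line, fips_mode_enabled() comes back False even when the option was compiled in, which matches Neil’s description of it as a flag rather than a behavior.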

Did you notice the word that wasn’t included in the last paragraph? Validated. Yes, there are no validated cryptographic modules in the Linux upstream kernel. If you look at the kernel’s Cryptographic API you will find listed the “AES cipher algorithms” and “DES and Triple DES EDE cipher algorithms”. These may be compliant with FIPS standards but they are not validated.

This raises the question: why have a FIPS 200 compliance flag if you can’t meet the FIPS 140-2 requirement? This is the interesting part. Let’s say a distro decides it wants to become very FISMA friendly and get its kernel’s FIPS 140-2 cryptographic module validated. Well, if the validation of the OpenSSL VCM is an apt example, the distro’s Linux kernel will need to operate in a FIPS compliant mode to verifiably load the cryptographic module. So the inclusion of the fips_enabled flag enables future compliance.
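To make “verifiably load” concrete: part of what a FIPS 140-2 module does at load time is prove its algorithms still produce known-correct answers before it will enter FIPS mode. A stripped-down sketch of such a power-on known-answer test (KAT), using the published SHA-256 test vector for "abc" (this is an illustration of the concept, not the kernel's actual self-test code):

```python
# Illustrative power-on known-answer test, the kind a FIPS 140-2
# module runs before allowing itself to be used. The expected digest
# is the published SHA-256 test vector for the message "abc".
import hashlib

KNOWN_ANSWER = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

def sha256_self_test():
    digest = hashlib.sha256(b"abc").hexdigest()
    if digest != KNOWN_ANSWER:
        # A real module would refuse to enter FIPS mode here.
        raise RuntimeError("SHA-256 known-answer test failed")
    return True
```

A real module runs KATs like this for every approved algorithm it offers, plus an integrity check over its own binary, which is why the kernel has to cooperate in how the module gets loaded.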

Sadly, any single Linux distro getting its cryptographic module validated is unlikely to translate into the upstream kernel having a validated cryptographic module. If you look at the catalog of FIPS 140-2 VCMs, the modules are only validated for particular code versions and operating modes. As the upstream kernel code won’t likely see the revisions made by the downstream distro to achieve the VCM until after the VCM is issued, it doesn’t inherit the validation.

Polyester Resin Kernel photo by Marshall Astor – Food Pornographer.

Two possible scenarios were discussed with Neil to allow for upstream Linux kernel incorporation of a VCM.

The first scenario would be that the upstream kernel gets all the revisions made by the downstream distro to gain the VCM designation. It then goes through the process to gain the VCM itself. Unfortunately, as the code is under constant revision and can’t be locked, the VCM would be invalidated as soon as a revision was committed to the code base. Only a particular build of the Linux kernel could claim to be validated.

The second scenario would be a revision to the Linux kernel that allowed for the downstream’s Linux distro’s VCM to be loaded instead of the standard Linux Cryptographic API. When asked about this scenario Neil had this to say:

“That said, theres no reason the crypto api couldn’t be ripped out and replaced with a different implementation, one that is maintained independently and its certification kept up. Of course, anyone so doing would need to keep up with the pace of kernel development, and that in turn brings the need for recertification, so its rather a lost effort in my opinion. I certainly wouldn’t recommend doing so, its just too much work.”

So the solution would be either short-lived and costly or long-lived and insecure.

Sadly this means that there is no easy way to include a FIPS 140-2 VCM in the upstream Linux kernel. But each distro can modify its Cryptographic API and validate a cryptographic module to allow for FIPS 200 compliance. With the FIPS 200 compliance flag now in the Linux kernel it is possible for this to be verified. And that’s a happy thought for Federal Linux users.

My many thanks to Neil Horman, without whom I’d have nothing to write.



Posted in FISMA, Technical | No Comments »

The Authorizing Official and the Lolcat

Posted February 12th, 2009 by

Hot on the heels of our DAA presentation, the Guerilla CISO is proud to present our lolcat Authorizing Official.

Yes, it’s all about the budget. If you do not have a budget, you do not have the ability to change things. We have a word for people who have all the authority of an Authorizing Official but none of the budget: scapegoat.

[lolcat image]

And since I’m in Toronto for an extended stay thanks to the weather, today is a 2-fer:
[lolcat image]



Posted in IKANHAZFIZMA | No Comments »

The Accreditation Decision and the Authorizing Official

Posted February 10th, 2009 by

The accreditation decision is one of the key activities in how the US Government secures its systems. It’s also one of the most misunderstood. This slideshow aims to explain the role of the Authorizing Official and to give you some insight into why and how accreditation decisions are made.

I would like to give a big thanks to Joe Faraone and Graydon McKee who helped out.

The presentation is licensed under Creative Commons, so feel free to download it, email it, and use it in your own training.



Posted in FISMA, NIST, Risk Management, Speaking | 5 Comments »
