Some Thoughts on POA&M Abuse

Posted June 8th, 2009

Ack, Plans of Action and Milestones.  I love them and I hate them.

For those of you who “don’t habla Federali”, a POA&M is basically an IOU from the system owner to the accreditor that yes, we will fix something but for some reason we can’t do it right now.  Usually these are findings from Security Test and Evaluation (ST&E) or Certification and Accreditation (C&A).  In fact, some places I’ve worked, they won’t make new POA&Ms unless they’re traceable back to ST&E results.

Functions that a POA&M fulfills (see the sketch after this list):

  • Tracks issues through to resolution
  • Serves as a “risk register”
  • Justifies budget requests
  • Generates mitigation metrics
  • Supports data-mining to find common vulnerabilities across systems
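
To make this concrete, here’s a minimal sketch in Python of the kind of record that has to exist for one POA&M to do all five of those jobs at once.  The field names are mine, not pulled from any NIST or OMB template:

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class POAM:
        """One weakness, tracked from finding to closure."""
        weakness: str                  # what the ST&E/C&A finding actually said
        risk_level: str                # "high", "moderate", or "low"
        due_date: date                 # negotiated with the accreditor
        closure_evidence: str          # proof of closure, agreed on at creation
        milestones: list = field(default_factory=list)
        closed: bool = False

    def mitigation_metrics(poams):
        """Roll open and overdue counts up to the enterprise level."""
        open_items = [p for p in poams if not p.closed]
        overdue = [p for p in open_items if p.due_date < date.today()]
        return {"open": len(open_items), "overdue": len(overdue)}

Notice that one record feeds issue tracking, the risk register, budget justification, and the metrics all at once; that overloading is exactly where the abuse starts.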

But today, we’re going to talk about POA&M abuse.  I’ve seen my fair share of this.

Conflicting Goals: The basic problem is that we want POA&Ms to satisfy too many conflicting functions.  For example, if we use the number of open POA&Ms as a metric to judge whether our system owners are doing their job and closing out issues, but we also report those same numbers at the department level or to OMB, then there’s a built-in conflict of interest: close everything as fast as possible, even if that means losing the ability to track things at the system level or to spend time on things that solve long-term security problems.  Our vulnerability/weakness/risk management process forces us into creating small, easy-to-satisfy POA&Ms instead of long-term projects.

Near-Term vs. Long-Term:  If we set up POA&Ms with due dates of 30, 60, or 90 days (for high, moderate, and low risks respectively), we don’t really have time to turn these POA&Ms into budget support.  If the budget is managed up to 3 years in advance and even a low-risk finding gets only 90 days, then POA&Ms will have exactly zero input into the budget unless we can delay the bugger for 2 years or so, much too long for it to actually be fixable.
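
Here’s that arithmetic as a back-of-the-envelope sketch; the 2-year budget lead time is my illustrative assumption, not anybody’s policy:

    # POA&M remediation windows vs. budget lead time.
    DUE_DAYS = {"high": 30, "moderate": 60, "low": 90}
    BUDGET_LEAD_DAYS = 2 * 365  # money requested today arrives ~2 years out

    for level, days in DUE_DAYS.items():
        fundable = days >= BUDGET_LEAD_DAYS
        print(f"{level}: due in {days} days, fundable via the budget cycle: {fundable}")
    # Prints False across the board: no POA&M lives long enough to justify new money.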

Bad POA&Ms:  Let’s face it, sometimes the one-for-one mapping of ST&E, C&A, and risk assessment findings to POA&Ms means that you get POA&Ms that are “bad”, by which I mean they can’t be satisfied or they’re not really something that you need to fix.

Some of the bad POA&Ms I’ve seen (paraphrased from the originals):

  • The solution uses {Microsoft|Sun|Oracle} products, which have a history of vulnerabilities
  • The project team needs to tell the vendor to put IPv6 into their product roadmap
  • The project team needs to implement X, which is a common control provided at the enterprise level
  • The System Owner and DAA have accepted this risk, but we’re still turning it into a POA&M
  • This is a common control that we really should handle at the enterprise level, but we’re putting it on your POA&M list for a simple web application

Plan of Action for Refresh Philly photo by jonny goldstein.

Keys to POA&M Nirvana:  So over the years, I’ve observed some techniques for success in working with POA&Ms:

  • Agree on the evidence/proof of POA&M closure when the POA&M is created
  • Fix it before it becomes a POA&M
  • Have a waiver or exception process that requires a cost-benefit-risk analysis
  • Start with “high-level” POA&Ms and work down to more detailed POA&Ms as your security program matures
  • POA&Ms are between the System Owner and the DAA, but the System Owner can turn around and negotiate a POA&M as a contract deliverable with an outsourced IT provider

And then the keys to Building Good POA&Ms (a linter sketch follows below):

  • Actionable–i.e., they have something that you need to do
  • Achievable–they can be accomplished
  • Demonstrable–you can demonstrate that the POA&M has been satisfied
  • Properly-Scoped–absorbed at the agency level, the common control level, or the system level
  • They are SMART: Specific, Manageable, Attainable, Relevant, and within a specified Timeframe
  • They are DUMB: Doable, Understandable, Manageable, and Beneficial

Yes, I stole the last 2 bullets from the picture above, but they make really good sense, in the same way that “know thyself” is awesome advice from the Oracle at Delphi.
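
If you track POA&Ms as structured records (like the sketch earlier in this post), the Actionable and Demonstrable checks are cheap to automate; achievability and scoping still take a human.  A minimal, made-up linter:

    def poam_quality_problems(p):
        """Lint a POAM record (from the earlier sketch) against the lists above."""
        problems = []
        if not p.milestones:
            problems.append("not actionable: no milestones, nothing to actually do")
        if not p.closure_evidence:
            problems.append("not demonstrable: no agreed-on proof of closure")
        if p.risk_level not in ("high", "moderate", "low"):
            problems.append("no valid risk level: can't set a defensible due date")
        return problems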



Posted in BSOFH, FISMA

When Standards Aren’t Good Enough

Posted May 22nd, 2009

One of the best things about being almost older than dirt is that I’ve seen several cycles within the security community.  Just like fashion and ladies’ hemlines, if you pay attention long enough, you’ll see history repeat itself, or something that closely resembles history.  Time for a short trip “down memory lane…”

In the early days of computer security, all eyes were fixed on Linthicum and the security labs associated with the NSA.  In the late 80’s and early 90’s, the NSA evaluation program was notoriously slow – glacial would be a word one could use…  Bottom line, the process just wasn’t responsive enough to keep up with the changes and improvements in technology.  Products would be in evaluation for years, coming out of the process with their enabling technology nearly obsolete.  It didn’t matter; it was the only game in town until NIST and the Common Criteria labs came onto the scene.  This has worked well; however, the reality is that it’s not much better at vetting and moving technology from vendors to users.  The problem is that the evaluation process takes time, and time means money, but it also means that the code submitted for evaluation will most likely be several revisions old by the time it emerges from evaluation.  Granted, it may only be 6 months, or it might take a year – regardless, this is far better than before.

So…  practically speaking, if the base version of FooOS submitted for evaluation is, say, Version 5.0.1, several revisions — each solving operational problems affecting the organization — may have been released since.  We may find that we need to run Version 5.6.10r3 in order to pass encrypted traffic via the network.  Because we encrypt traffic, we must use FIPS 140-2 (Level 2) certified code – but in the example above, the validated version of FooOS will not work in our network…  What does the CISO do?  We’ll return to this in a moment; it gets better!

In order to reach levels of FIPS-140 goodness, one vendor in particular has instituted “FIPS Mode.”  What this does is require administration of the box from a position directly in front of the equipment, or at the length of your longest console cable…  Clearly, this is not suitable for organizations with equipment deployed worldwide to locations that do not have qualified administrators or network engineers.  Further, having to fly a technician to Burundi to clear sessions on a box every time it becomes catatonic is ridiculous at worst; at best, it’s not in accordance with the network concept of operations.  How does the CISO propose a workable, secure solution?


Standard Hill photo by timparkinson.

Now to my point (about time, Vlad!).  How does the CISO approach this situation?  Allow me to tell you the approach I’ve taken…

1. Accept the fact that once FooOS has achieved a level of FIPS-140 goodness, the modules of code within the OS that implement cryptographic functionality have most likely not been changed in follow-on versions.  This also means you have to assume the vendor has done a good job of documenting the changes to their baseline in their release notes, and that they HAVE modular code…
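
Here’s that step-1 judgment call as a sketch.  Everything in it, including the release-notes data and component names, is hypothetical; this is a reasoning aid, not a CMVP tool:

    # Did any release between the validated version and the deployed version
    # touch cryptographic code? Assumes the vendor has modular code and honest
    # release notes, per step 1.
    CRYPTO_COMPONENTS = {"openssl", "ike", "ssh", "rng"}

    # Hypothetical: releases between validated 5.0.1 and deployed 5.6.10r3,
    # mapped to the components their release notes say were touched.
    intervening_releases = {
        "5.6.9": {"routing", "snmp"},
        "5.6.10r3": {"cli", "webui"},
    }

    def fips_rationale_holds(releases):
        """True if no intervening release modified a cryptographic component."""
        touched = set()
        for components in releases.values():
            touched |= components
        return not (touched & CRYPTO_COMPONENTS)

    print(fips_rationale_holds(intervening_releases))  # True for this fake data

If this ever comes back False, the step-1 assumption is dead and you’re back to vendor FIPS Mode or a waiver.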

2. Delve into vendor documentation and FIPS-140 to find out exactly what “FIPS Mode” is, along with its benefits and requirements.  Much of the written documentation in the standard deals with physical security of the cryptographic module itself (e.g., tamper-evident seals) – but most helpful is Table 1.

Summary of Security Requirements from FIPS 140-2, Table 1:

  • Cryptographic Module Specification (all levels): Specification of cryptographic module, cryptographic boundary, Approved algorithms, and Approved modes of operation.  Description of cryptographic module, including all hardware, software, and firmware components.  Statement of module security policy.
  • Cryptographic Module Ports and Interfaces (Levels 1-2): Required and optional interfaces.  Specification of all interfaces and of all input and output data paths.  (Levels 3-4): Data ports for unprotected critical security parameters logically or physically separated from other data ports.
  • Roles, Services, and Authentication (Level 1): Logical separation of required and optional roles and services.  (Level 2): Role-based or identity-based operator authentication.  (Levels 3-4): Identity-based operator authentication.
  • Finite State Model (all levels): Specification of finite state model.  Required and optional states.  State transition diagram and specification of state transitions.
  • Physical Security (Level 1): Production grade equipment.  (Level 2): Locks or tamper evidence.  (Level 3): Tamper detection and response for covers and doors.  (Level 4): Tamper detection and response envelope.  EFP or EFT.
  • Operational Environment (Level 1): Single operator.  Executable code.  Approved integrity technique.  (Level 2): Referenced PPs evaluated at EAL2 with specified discretionary access control mechanisms and auditing.  (Level 3): Referenced PPs plus trusted path evaluated at EAL3 plus security policy modeling.  (Level 4): Referenced PPs plus trusted path evaluated at EAL4.
  • Cryptographic Key Management (all levels): Key management mechanisms: random number and key generation, key establishment, key distribution, key entry/output, key storage, and key zeroization.  (Levels 1-2): Secret and private keys established using manual methods may be entered or output in plaintext form.  (Levels 3-4): Secret and private keys established using manual methods shall be entered or output encrypted or with split-knowledge procedures.
  • EMI/EMC (Levels 1-2): 47 CFR FCC Part 15, Subpart B, Class A (business use).  Applicable FCC requirements (for radio).  (Levels 3-4): 47 CFR FCC Part 15, Subpart B, Class B (home use).
  • Self-Tests (all levels): Power-up tests: cryptographic algorithm tests, software/firmware integrity tests, critical functions tests.  Conditional tests.
  • Design Assurance (Level 1): Configuration management (CM).  Secure installation and generation.  Design and policy correspondence.  Guidance documents.  (Level 2): CM system.  Secure distribution.  Functional specification.  (Level 3): High-level language implementation.  (Level 4): Formal model.  Detailed explanations (informal proofs).  Preconditions and postconditions.
  • Mitigation of Other Attacks (all levels): Specification of mitigation of attacks for which no testable requirements are currently available.

Bottom line — some “features” are indeed useful, but this particular vendor’s one-size-fits-all implementation tends to preclude using the feature at all in some operational scenarios (most notably, the one your humble author is dealing with).  BTW, changing vendors is not an option.

3. Upon analyzing the FIPS requirements against operational needs and (importantly) the environment the equipment is operating in, one has to draw the line between “operating in vendor FIPS Mode” and using FIPS 140-2 encryption.

4. Document the decision and the rationale.

Once again, security professionals have to help managers strike a healthy balance between “enough” security and operational requirements.  You would think that using approved equipment, operating systems, and vendors vetted through the CC evaluation process would be enough.  Reading the standard, we see the official acknowledgement of “Your Mileage May Indeed Vary”™:

“While the security requirements specified in this standard are intended to maintain the security provided by a cryptographic module, conformance to this standard is not sufficient to ensure that a particular module is secure. The operator of a cryptographic module is responsible for ensuring that the security provided by a module is sufficient and acceptable to the owner of the information that is being protected and that any residual risk is acknowledged and accepted.”  (FIPS 140-2, Section 15, Qualifications)

The next paragraph constitutes validation of the approach I’ve embraced:

“Similarly, the use of a validated cryptographic module in a computer or telecommunications system does not guarantee the security of the overall system. The responsible authority in each agency shall ensure that the security of the system is sufficient and acceptable.”  (Emphasis added.)

One could say, “it depends,” but you wouldn’t think so at first glance – it’s a Standard for Pete’s sake!

Then again, nobody said this job would be easy!

Vlad



Posted in Rants, Risk Management, Technical

Sir Bruce Mentions FDCC, World Goes Nuts

Posted May 7th, 2009

Check out this blog post.  Wow, all sorts of crazies descend out of the woodwork when Bruce talks about something that’s been around for years, and suddenly everyone’s redesigning the desktop from the ground up.

Quick recap on comments:

  • 60-day password changes suck
  • You can do this at home, the GPOs are available from NIST
  • My blue-haired sheepdog can’t use the FDCC image, it’s broken for commercial use!
  • You wouldn’t have to do this in Linux
  • Linux is teh suxx0rz
  • My computer started beeping and smoke came out of it, is this FDCC?

Proving once again that you can’t talk about Windows desktop security without it devolving into a flamewar.  Might as well pull out “vi vs. emacs” while you’re at it, Bruce.  =)

Computer Setup photo by karindalziel.  Yes, one of them is a Linux box; I used this picture for that very same reason.  =)

But there is one point that people need to understand.  The magic of FDCC is not in the fact that the Government used its IT-buying muscle to get Microsoft to cooperate.  Oh no, that’s to be expected–the guys at MS are used to working on requests like this with a lot of customers by now.

The true magic of FDCC is getting the application vendors to play along.  To wit:

  • The FDCC GPOs are freely available from NIST
  • You can download images from NIST with a preconfigured FDCC setup
  • Application vendors can test their product against FDCC in their own lab
  • There is no external audit burden (yet, it might be coming) for software vendors because it’s a self-certification
  • FDCC-compatible software doesn’t require administrative privileges

In other words, if your software works with FDCC, it’s probably built to run on a security-correct operating system in the first place.  This is a good thing: in this case, the Government is using its IT budget to bring the application vendors up to some sort of minimal security baseline for the rest of the world.
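
A vendor can smoke-test the no-admin-privileges claim without any FDCC tooling at all.  A rough sketch: IsUserAnAdmin() is a real Win32 call, but the test harness around it is an assumption of mine:

    import ctypes, os, sys

    def running_as_admin():
        """True if this process has administrative rights."""
        try:
            return bool(ctypes.windll.shell32.IsUserAnAdmin())  # Windows
        except AttributeError:
            return os.geteuid() == 0  # not on Windows; check for root instead

    if running_as_admin():
        sys.exit("Re-run this test as a normal user, not an administrator.")
    # ...install and exercise the product here, as that normal user...
    print("Product ran without admin rights: one FDCC hurdle cleared.")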

This statement is from the FDCC FAQ; comments in parentheses are mine:

“How are vendors required to prove FDCC compliance?
There is no formal compliance process; vendors of information technology products must self-assert FDCC compliance. They are expected to ensure that their products function correctly with computers configured with the FDCC settings. The product installation process must make no changes to the FDCC settings. Applications must work with users who do not have administrative privileges, the only acceptable exception being information technology management tools. Vendors must test their products on systems configured with the FDCC settings, they must use SCAP validated tools with FDCC Scanner capability to certify their products operate correctly with FDCC configurations and do not alter FDCC settings. The OMB provided suggested language in this memo: http://www.whitehouse.gov/omb/memoranda/fy2007/m07-18.pdf, vendors are likely to encounter similar language when negotiating with agencies.”

So really what you get out of self-certification is something like this:



Posted in Technical

Lolcats take on Laws, Sausage, Cyberwhatzits, and PCI

Posted April 23rd, 2009

I just love how the security twit community this month is turning into a bunch of public policy wonks before my very eyes.  First we have S.773, the Cybersecurity Act of 2009 and then the House hearings on the effectiveness of PCI.

“To retain respect for sausages and laws, one must not watch them in the making.” –Otto von Bismarck (ref: Wikipedia)





Posted in IKANHAZFIZMA, Public Policy

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 2

Posted April 16th, 2009

Rybolov Note: this is part 2 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part 3 here.  Go read part four here.  Go read part 5 here.  =)

SEC. 7. LICENSING AND CERTIFICATION OF CYBERSECURITY PROFESSIONALS. This section has received quite a bit of airtime around the blagosphere.  Everybody thinks that they’ll need some kind of license from the Federalies to run Nessus.  Hey, maybe this is how it will all end up, but I think this provision will end up stillborn.

I know the NIST folks have been working on licensing and certification for some time, but they usually run into the same problems:

  • Do we certify individuals as cybersecurity professionals?
  • Do we certify organizations as cybersecurity service providers?
  • What can the Government do above and beyond what the industry provides? (ISC2, SANS, 27001, etc)
  • NIST does not want to be in the business of being a licensure board.

Well, this is my answer (I don’t claim that it’s my opinion):

  • Compulsory: the Government can require certifications/licensure for certain job requirements.  Right now this is managed by HR departments.
  • Existing Precedent: We’ve been doing this for a couple of years with DoDI 8570.01M, which is mandatory for DoD contracts.  As much as I think industry certification is a pyramid scheme, I think this makes sense in contracting for the Government.  If the Government won’t pay for contractor training (and they shouldn’t) and the contractor won’t pay for employees to get training because their turnover rate is 50% in a year, it’s the only way to ensure some kind of training and professionalization of the staff.  Does this scale to the rest of the country?  I’m not sure.
  • Governance and Oversight: The security industry has too many different factions.  A Government-ran certification and license scheme would provide some measure of uniformity.

Honestly, this section of the bill might make sense (it opens up a bigger debate) except for one thing: we haven’t defined what “Cybersecurity Services” are.  Let’s face it, most of what we think are “security” services are really basic IT management services… why should you need a certification to be the goon on the change control board?  However, this does solve the “problem” of hackers who turn into “researchers” once they’re caught doing something illegal.  I just don’t see this as that big of a problem.

Verdict: Strange that this isn’t left up to industry to handle.  It smells like lobbying by somebody in ISC2 or SANS to generate a higher demand for certs.  Unless this section is properly scoped and extensively defined, it needs to die on the cutting room floor–it’s too costly for almost no value above what industry can provide.  If you want the same effect with almost no cost to the taxpayers, consider something along the lines of the 8570.01 approach, in which industry runs the certifications and specific certifications are required for certain job titles.

SEC. 8. REVIEW OF NTIA DOMAIN NAME CONTRACTS. Yes, there is a bunch of drama-llama-ing going on between NTIA, ICANN, Verisign, and a cast of thousands.  This section calls for a review of DNS contracts by the Cybersecurity Advisory Panel (remember them from Section 3?) before they are approved.  Think managing the politics of DNS is hard now?  It just got harder–you ever try to get a handful of security people to agree on anything?  And yet, I’m convinced that either this needs to happen or NTIA needs to get some clueful security staffers who know how to manage contracts.

Verdict: DNSSEC is trendy thanks to Mr. Kaminsky.  I hate it when proposed legislation is trendy.  I think this provision could be axed from the bill if NTIA had the authority to review the security of its own contracts.  Maybe this could be a job for the Cybersecurity Advisor instead of the Advisory Panel?

SEC. 9. SECURE DOMAIN NAME ADDRESSING SYSTEM. OK, the Federal Government has officially endorsed DNSSEC thanks to some OMB mandates.  Now the rest of the country can play along.  Seriously, though, this bill has some scope problems, but basically what we’re saying is that Federal agencies and critical infrastructure will be required to implement DNSSEC.

Once again, though, we’re putting Commerce in charge of the DNSSEC strategy.  Commerce should only be on the hook for the standards (NIST) and the changes to the root servers (NTIA).  For the Federal agencies, this should be OMB in charge.  For “critical infrastructure”, I believe the most appropriate proponent agency is DHS because of their critical infrastructure mission.

And as for the rest of you, well, if you want to play with the Government or critical infrastructure (like the big telephone and network providers), it would behoove you to get with the DNSSEC program because you’re going to be dragged kicking and screaming into this one.  Isn’t the Great InfoSec Trickle-Down Effect awesome?
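
If you want to see where you stand today, here’s a quick probe using the third-party dnspython library: ask for a signed name (example.com is signed) with the DO bit set, and check whether the resolver under test hands back the AD (authenticated data) flag.  The resolver address is a placeholder:

    import dns.flags
    import dns.message
    import dns.query

    RESOLVER = "8.8.8.8"  # substitute the resolver you want to test

    query = dns.message.make_query("example.com", "A", want_dnssec=True)
    response = dns.query.udp(query, RESOLVER, timeout=5)

    if response.flags & dns.flags.AD:
        print("Resolver validated the answer (AD flag set).")
    else:
        print("No validation: the answer came back without the AD flag.")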

Verdict: If we want DNSSEC to happen, it will take an act of Congress because the industry by itself can’t get it done–too many competing interests.  Add more tasks to the agencies outside of Commerce here, and it might work.

Awesome Capitol photo by BlankBlankBlank.

SEC. 10. PROMOTING CYBERSECURITY AWARENESS. Interesting in that this is tasked to Commerce, meaning that the focus is on end-users and businesses.

In a highly unscientific, informal poll with a limited sample of security twits, I confirmed that nobody has ever heard of Dewie the Webwise Turtle.  Come on, guys, “Safe at any speed”, how could you forget that?  At any rate, this already exists in some form; it just has to be dusted off and given a cash infusion.

Verdict: Already exists, but so far efforts have been aimed at end-users.  The following populations need awareness: small and medium-sized businesses (SMBs), end-users, owners of critical infrastructure, technology companies, and software developers.  Half of these are populations DHS is already dealing with, and this provision completely ignores DHS’s role.

SEC. 11. FEDERAL CYBERSECURITY RESEARCH AND DEVELOPMENT. This section is awesome to read: it adds to the types of research that NSF can fund and extends funding for the existing types.  It’s pretty hard to poke holes in, and based on back-of-the-envelope analysis, there isn’t much missing by way of topics that need to be added to the research priorities.  What I would personally like to see is a better audit system, not one designed around the accounting profession’s way of doing things.  =)

Verdict: Keep this section intact.  If we don’t fund this, we will run into problems 10+ years out–some would say we’re already running into the limitations of our current technology.

SEC. 12. FEDERAL CYBER SCHOLARSHIP-FOR-SERVICE PROGRAM. This is an existing program, and it’s pretty good.  Basically you get a scholarship with a Government service commitment after graduation.  Think of it as ROTC-light scholarships without bullets and trips to SW Asia.

Verdict: This is already there.  This section of the bill most likely is in to get the program funded out to 2014.



Posted in NIST, Public Policy, What Doesn't Work, What Works

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 1

Posted April 14th, 2009

Rybolov Note: this is such a long blog post that I’m breaking it down into parts.  Go read the bill here.  Go read part two here.  Go read part three here.  Go read part four here.  Go read part 5 here.  =)

So the Library of Congress finally got S.773 up on http://thomas.loc.gov/.  For those of you who have been hiding under a rock, this is the Cybersecurity Act of 2009 and is a bill introduced by Senators Rockefeller and Snowe and, depending on your political slant, will allow us to “sock it to the hackers and send them to a federal pound-you-in-the-***-prison” or “vastly erode our civil liberties”.

A little bit of pre-reading is in order:

Timing: Now let’s talk about the timing of this bill.  There is the 60-day Cybersecurity Review that is supposed to be coming out Real Soon Now (TM).  This bill is an attempt by Congress to head it off at the pass.

Rumor mill says not only that the Cybersecurity Review will be unveiled at RSA (possible, but strange) but also that it won’t bring anything new to the debate (more possible, but then again, nothing’s really new; we’ve known about this stuff for at least a decade).

Overall Comments:

This bill is big.  It really is an omnibus Cybersecurity Act and has just about everything you could want and more.  There’s a fun way of doing things in the Government, and it goes something like this: ask for 300% of what you need so that you will end up with 80%.  And I see this bill is taking this approach to heart.

Pennsylvania Ave – Old Post Office to the Capitol at Night photo by wyntuition.

And now for the good, bad, and ugly:

SEC. 2. FINDINGS. This section is primarily a summary of testimony that has been delivered over the past couple of years.  It really serves as justification for the rest of the bill.  It is a little bit on the FUD side of things (as in “omigod, they put ‘Cyber-Katrina‘ in a piece of legislation”), but overall it’s pretty balanced and what you would expect for a bill.  Bottom line here is that we depend on our data and the networks that carry it.  Even if you don’t believe in Cyberwar (I don’t really believe in Cyberwar unless it’s just one facet of combined arms warfare), you can probably agree that the costs of insecurity on a macroeconomic scale need to be looked at and defended against, and our dependency on the data and networks is only going to increase.

No self-respecting security practitioner will like this section, but politicians will eat it up.  Relax, guys, you’re not the intended audience.

Verdict: Might as well keep this in there, it’s plot development without any requirements.

SEC. 3. CYBERSECURITY ADVISORY PANEL. This section creates a Cybersecurity Advisory Panel made up of Federal Government, private sector, academia, and state and local government.  This is pretty typical so far.  The interesting thing to me is “(7) whether societal and civil liberty concerns are adequately addressed”… in other words, are we balancing security with citizens’, corporations’, and states’ rights?  More to come on this further down in the bill.

Verdict: Will bring a minimal cost in Government terms.  I’m very hesitant to create new committees.  But yeah, this can stay.

SEC. 4. REAL-TIME CYBERSECURITY DASHBOARD. This section is very interesting to me.  On one hand, it’s what we do at the enterprise level for most companies.  On the other hand, this is specific to the Commerce Department–“Federal Government information systems and networks managed by the Department of Commerce.”  The first reading of this is that it covers the networks internal to Commerce, but then why is this not handed down to all agencies?  I puzzled on this and did some research until I remembered that Commerce, through NTIA, runs DNS, and Section 8 contains a review of the DNS contracts.

Verdict: I think this section needs a little bit of rewording so that the scope is clearer, but sure, a dashboard is pretty benign; it’s the implied tasks to make a dashboard function (i.e., proper management of IT resources and IT security) that are going to be the hard parts.  Rescope the dashboard and explicitly say what kind of information it needs to address and who should receive it.

SEC. 5. STATE AND REGIONAL CYBERSECURITY ENHANCEMENT PROGRAM. This section calls for Regional Cybersecurity Centers, something along the lines of what we call “Centers of Excellence” in the private sector.  This section is interesting to me, mostly because of how vague it seemed the first time I read it, but the more times I look at it, I go “yeah, that’s actually a good idea”.  What this section tries to do is to bridge the gap between the standards world that is NIST and the people outside of the beltway–the “end-users” of the security frameworks, standards, tools, methodologies, what-the-heck-ever-you-want-to-call-them.  Another interesting thing about this is that while the proponent department is Commerce, NIST is part of Commerce, so it’s not as left-field as you might think.

Verdict: While I think this section is going to take a long time to come to fruition (5+ years before any impact is seen), I see that Regional Cybersecurity Centers, if properly funded and executed, can have a very significant impact on the rest of the country.  It needs to happen, only I don’t know what the cost is going to be, and that’s the part that scares me.

SEC. 6. NIST STANDARDS DEVELOPMENT AND COMPLIANCE. This is good.  Basically this section provides a mandate for NIST to develop a series of standards.  Some of these have been sitting around for some time in various incarnations, I doubt that anyone would disagree that these need to be done.

  1. CYBERSECURITY METRICS RESEARCH: Good stuff.  Yes, this needs help.  NIST are the people to do this kind of research.
  2. SECURITY CONTROLS: Already exists in SP 800-53.  Depending on interpretation, this changes the scope and language of the catalog of controls to non-Federal IT systems, or possibly forks the controls catalog.
  3. SOFTWARE SECURITY: I guess if it’s in a law, it has come of age.  This is one of the things that NIST has wanted to do for some time, but they haven’t had the manpower to get involved in this space.
  4. SOFTWARE CONFIGURATION SPECIFICATION LANGUAGE: Part of SCAP.  The standard is there; it just needs to be extended to various pieces of software.
  5. STANDARD SOFTWARE CONFIGURATION: This is the NIST configuration checklist program a la SP 800-70.  I think NIST ran short on manpower for this too and resorted to pointing at the DISA STIGs and FDCC.  This so needs further development into a uniform set of standards and then, here’s the key, rolled back upstream to the software vendors so they ship their product pre-configured.
  6. VULNERABILITY SPECIFICATION LANGUAGE: Sounds like SCAP (see the sketch below).
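
For a taste of what SCAP output looks like in practice, here’s a sketch that tallies the rule results in an XCCDF result file using nothing but the Python standard library.  The namespace URI matches XCCDF 1.1; treat it (and the file name) as assumptions about your input:

    import xml.etree.ElementTree as ET
    from collections import Counter

    NS = {"x": "http://checklists.nist.gov/xccdf/1.1"}

    def tally_results(path):
        """Count rule results ("pass", "fail", "notapplicable", ...) in an XCCDF file."""
        tally = Counter()
        for result in ET.parse(path).iterfind(".//x:rule-result/x:result", NS):
            tally[result.text] += 1
        return tally

    print(tally_results("scan-results.xml"))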

Now for the “gotchas”:

(d) COMPLIANCE ENFORCEMENT- The Director shall–

(1) enforce compliance with the standards developed by the Institute under this section by software manufacturers, distributors, and vendors; and

(2) shall require each Federal agency, and each operator of an information system or network designated by the President as a critical infrastructure information system or network, periodically to demonstrate compliance with the standards established under this section.

This section basically does 2 things:

  • Mandates compliance for vendors and distributors with the NIST standards listed above.  Surprised this hasn’t been talked about elsewhere.  This clause suffers from scope problems because if you interpret it BSOFH-stylie, you can take it to mean that anybody who sells a product, regardless of who’s buying, has to sell a securely-configured version.  I.e., I can’t sell XP to blue-haired grandmothers unless I have something like an FDCC variant installed on it.  I mostly agree with this in the security sense, but it’s a serious culture shift in the practical sense.
  • Mandates an auditing scheme for Federal agencies and critical infrastructure.  Everybody’s talked about this, saying that since designation of critical infrastructure is not defined, it is left to the discretion of the Executive Branch.  This isn’t as wild-west as the bill’s opponents want it to seem; there is a ton of groundwork laid out in HSPD-7.  But yeah, HSPD-7 is an executive directive and can be changed “at the whim” of the President.  And yes, this is auditing by Commerce, which has some issues in that Commerce is not equipped to deal with IT security auditing.  More on this in a later post.

Verdict: The standards part is already happening today; this section just codifies it and justifies NIST’s research.  Don’t task Commerce with enforcement of NIST standards; it leads down all sorts of inappropriate roads.



Posted in Public Policy, What Doesn't Work, What Works
