Security Automation Developers Conference Slides

Posted July 2nd, 2009 by

Eh? What’s that mean?  Developer Days is a weeklong conference where they get down into the weeds about the various SCAP schemas and how they fit into the overall program of security automation. 

Highlights and new ideas:

Remedial Markup Language: Fledgling schema to describe how to remediate a vulnerability.  A fully automated security system would scan and then use the RML content to automagically fix the finding… say, changing a configuration setting or installing a patch.  This would be much awesome if combined with CVE/CWE, so you would have a vulnerability scanner that scans and fixes the problem.  It also needs to be kept in a bottle, because the operations guys will have a heart attack if we do this without any human intervention.
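Just to make that concrete, here is a purely hypothetical sketch of what RML-style content might look like – the element names and namespace are my own invention for illustration, not the actual draft schema:

<!-- Hypothetical RML-style remediation; all names invented for illustration -->
<remediation xmlns="http://example.org/rml-sketch" id="rem-disable-autorun">
  <!-- the finding this remediation applies to -->
  <applies-to check="xccdf-rule-disable-autorun"/>
  <!-- the fix: a single registry change -->
  <action type="registry-set">
    <key>HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer</key>
    <value name="NoDriveTypeAutoRun" type="REG_DWORD">255</value>
  </action>
  <!-- the knob the operations guys will insist on before anything runs automagically -->
  <approval required="true" role="change-control-board"/>
</remediation>

That last <approval> element is exactly the kind of "keep it in a bottle" control that would keep the operations guys breathing normally.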

Computer Network Defense: There is a pretty good scenario slide deck on using SCAP to automate hardening, auditing, monitoring, and defense.  The key from this deck is how the information flows using automation.

Common Control Identifier:  This schema is basically a catalog of controls (800-53, 8500.2, PCI, SOX, etc.) in XML.  The awesomeness here is that one control can contain a reference implementation for each technology and the checklist to validate it in XCCDF.  At this point, I get all misty…
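Here is a hypothetical sketch of what a CCI-style catalog entry might look like – the structure and names are mine, invented for illustration:

<!-- Hypothetical CCI-style entry; structure invented for illustration -->
<control id="cci-0042" xmlns="http://example.org/cci-sketch">
  <title>Enforce minimum password length</title>
  <!-- one control, mapped across multiple catalogs -->
  <reference source="NIST-SP-800-53" item="IA-5"/>
  <reference source="DoDI-8500.2" item="IAIA-1"/>
  <reference source="PCI-DSS" item="8.5.10"/>
  <!-- a reference implementation per technology -->
  <implementation platform="cpe:/o:microsoft:windows_xp">
    Set "Minimum password length" via Group Policy.
  </implementation>
  <!-- and the XCCDF checklist that validates it -->
  <check system="http://checklists.nist.gov/xccdf/1.1"
         href="winxp-checklist.xml" rule="rule-min-password-length"/>
</control>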

Open Checklist Interactive Language: This schema captures questionnaires.  Think managerial controls, operational controls, policy, and procedure captured in electronic format and fed into the regular mitigation and workflow tools you already use, so that you can view “security of the enterprise at a glance” across technical and non-technical security.
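A hypothetical, heavily simplified sketch of an OCIL-style questionnaire item (the real schema is more involved; these names are for illustration only):

<!-- Hypothetical OCIL-style question; simplified and invented for illustration -->
<questionnaire id="q-ir-policy" xmlns="http://example.org/ocil-sketch">
  <title>Incident response policy (a managerial/operational control)</title>
  <boolean_question id="q1">
    <question_text>Is there a documented incident response policy that has
      been reviewed within the last year?</question_text>
    <answer value="true" result="PASS"/>
    <answer value="false" result="FAIL"/>
  </boolean_question>
</questionnaire>

Feed the answers into the same workflow tools that consume your technical scan results and you get that “at a glance” view.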

Network Event Content Automation Protocol:  This is just a concept floating around right now: using XML to describe and automate responses to attacks.  If you’re familiar with ArcSight’s Common Event Format, this would be something similar but on steroids, with workflow and a pony!
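Since NECAP is just a concept, anything concrete is pure speculation on my part, but the idea might look something like this (everything below is invented):

<!-- Pure speculation: a NECAP-ish event/response pairing; all names invented -->
<event-response xmlns="http://example.org/necap-sketch" id="resp-ssh-bruteforce">
  <trigger>
    <signature ref="ssh-auth-failure"/>
    <threshold count="50" window="60s" group-by="source-ip"/>
  </trigger>
  <response>
    <action type="block-source" duration="3600s"/>
    <action type="open-ticket" queue="soc-tier1"/>  <!-- the workflow (pony not included) -->
  </response>
</event-response>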

Attendance at Developer Days is limited, but thanks to the “Powar of teh Intarwebs,” you can go here and read the slides!




Posted in NIST, Technical | 3 Comments »

When Standards Aren’t Good Enough

Posted May 22nd, 2009 by

One of the best things about being almost older than dirt is that I’ve seen several cycles within the security community.  Just like fashion and ladies’ hemlines, if you pay attention long enough, you’ll see history repeat itself, or something that closely resembles history.  Time for a short trip “down memory lane…”

In the early days of computer security, all eyes were fixed on Linthicum and the security labs associated with the NSA.  In the late ’80s and early ’90s, the NSA evaluation program was notoriously slow – glacial would be a word one could use…  Bottom line, the process just wasn’t responsive enough to keep up with the changes and improvements in technology.  Products would be in evaluation for years before coming out of the process with their enabling technology nearly obsolete.  It didn’t matter; it was the only game in town until NIST and the Common Criteria labs came onto the scene.  This has worked well, but the reality is that it’s not much better at vetting and moving technology from vendors to users.  The problem is that the evaluation process takes time, and time means money – but it also means that the code submitted for evaluation will most likely be several revisions old by the time it emerges from evaluation.  Granted, it may only be 6 months, or it might take a year; regardless, this is far better than before.

So…  practically speaking: if the base version of FooOS submitted for evaluation is, say, Version 5.0.1, several revisions – each solving operational problems affecting the organization – may have been released since then.  We may find that we need to run Version 5.6.10r3 in order to pass encrypted traffic on the network.  Because we encrypt traffic we must use FIPS 140-2 Level 2 validated code – but in the example above, the validated version of FooOS will not work in our network…  What does the CISO do?  We’ll return to this in a moment; it gets better!

In order to reach levels of FIPS-140 goodness, one vendor in particular has instituted “FIPS Mode.”  What this does is require administration of the box from a position directly in front of the equipment, or at the length of your longest console cable…  Clearly, this is not suitable for organizations with equipment deployed worldwide to locations that do not have qualified administrators or network engineers.  Further, having to fly a technician to Burundi to clear sessions on a box every time it becomes catatonic is at best ridiculous and at worst not in accordance with the network concept of operations.  How does the CISO propose a workable, secure solution?


Standard Hill photo by timparkinson.

Now to my point.  (About time, Vlad.)  How does the CISO approach this situation?  Allow me to tell you the approach I’ve taken…

1. Accept that once FooOS has achieved a level of FIPS-140 goodness, the modules of code within the OS implementing cryptographic functionality most likely have not changed in follow-on versions.  This also means you have to assume the vendor has done a good job of documenting changes to their baseline in their release notes, and that they HAVE modular code…

2. Delve into the vendor documentation and FIPS-140 itself to find out exactly what “FIPS Mode” is, along with its benefits and requirements.  Much of the written documentation in the standard deals with physical security of the cryptographic module itself (e.g., tamper-evident seals) – but most helpful is Table 1.

Cryptographic Module Specification (all levels): Specification of cryptographic module, cryptographic boundary, Approved algorithms, and Approved modes of operation.  Description of cryptographic module, including all hardware, software, and firmware components.  Statement of module security policy.

Cryptographic Module Ports and Interfaces – Levels 1-2: Required and optional interfaces; specification of all interfaces and of all input and output data paths.  Levels 3-4: Data ports for unprotected critical security parameters logically or physically separated from other data ports.

Roles, Services, and Authentication – Level 1: Logical separation of required and optional roles and services.  Level 2: Role-based or identity-based operator authentication.  Levels 3-4: Identity-based operator authentication.

Finite State Model (all levels): Specification of finite state model; required and optional states; state transition diagram and specification of state transitions.

Physical Security – Level 1: Production-grade equipment.  Level 2: Locks or tamper evidence.  Level 3: Tamper detection and response for covers and doors.  Level 4: Tamper detection and response envelope; EFP or EFT.

Operational Environment – Level 1: Single operator; executable code; Approved integrity technique.  Level 2: Referenced PPs evaluated at EAL2 with specified discretionary access control mechanisms and auditing.  Level 3: Referenced PPs plus trusted path, evaluated at EAL3, plus security policy modeling.  Level 4: Referenced PPs plus trusted path, evaluated at EAL4.

Cryptographic Key Management (all levels): Key management mechanisms: random number and key generation, key establishment, key distribution, key entry/output, key storage, and key zeroization.  Levels 1-2: Secret and private keys established using manual methods may be entered or output in plaintext form.  Levels 3-4: Secret and private keys established using manual methods shall be entered or output encrypted or with split-knowledge procedures.

EMI/EMC – Levels 1-2: 47 CFR FCC Part 15, Subpart B, Class A (business use); applicable FCC requirements (for radio).  Levels 3-4: 47 CFR FCC Part 15, Subpart B, Class B (home use).

Self-Tests (all levels): Power-up tests (cryptographic algorithm tests, software/firmware integrity tests, critical functions tests) and conditional tests.

Design Assurance – Level 1: Configuration management (CM); secure installation and generation; design and policy correspondence; guidance documents.  Level 2: CM system; secure distribution; functional specification.  Level 3: High-level language implementation.  Level 4: Formal model; detailed explanations (informal proofs); preconditions and postconditions.

Mitigation of Other Attacks (all levels): Specification of mitigation of attacks for which no testable requirements are currently available.

Summary of Security Requirements from FIPS 140-2 (Table 1)

Bottom line – some “features” are indeed useful, but this particular vendor’s one-size-fits-all implementation prevents using the feature at all in some operational scenarios (most notably, the one your humble author is dealing with).  BTW, changing vendors is not an option.

3. Upon analyzing the FIPS requirements against operational needs and (importantly) the environment the equipment operates in, one has to draw the line between operating in the vendor’s “FIPS Mode” and using FIPS 140-2 validated encryption.

4. Document the decision and the rationale.

Once again, security professionals have to help managers strike a healthy balance between “enough” security and operational requirements.  You would think that using approved equipment and operating systems from vendors vetted through the CC evaluation process would be enough.  Reading the standard, though, we see the official acknowledgement that Your Mileage May Indeed Vary™:

“While the security requirements specified in this standard are intended to maintain the security provided by a cryptographic module, conformance to this standard is not sufficient to ensure that a particular module is secure. The operator of a cryptographic module is responsible for ensuring that the security provided by a module is sufficient and acceptable to the owner of the information that is being protected and that any residual risk is acknowledged and accepted.”  – FIPS 140-2, Sec. 15, Qualifications

The next paragraph constitutes validation of the approach I’ve embraced:

“Similarly, the use of a validated cryptographic module in a computer or telecommunications system does not guarantee the security of the overall system. The responsible authority in each agency shall ensure that the security of the system is sufficient and acceptable.”  (Emphasis added.)

One could say, “it depends,” but you wouldn’t think so at first glance – it’s a Standard for Pete’s sake!

Then again, nobody said this job would be easy!

Vlad




Posted in Rants, Risk Management, Technical | 4 Comments »

Wanted: Some SCAP Wranglers

Posted May 18th, 2009 by

So I was doing my usual “Beltway Bandit Perusal of Opportunities for Filthy Lucre,” also known as diving into FedBizOpps, and I found this gem.  Basically what this means is that sometime this summer, NIST is going to put out an RFP for contractors to further develop SCAP using ARRA funds.

Keep in mind that this isn’t the official list of what NIST wants done under this contract, but it’s interesting to look at from the angle of where SCAP will go over the next couple of years:

  1. Evolution of the SCAP protocol and specifications thereof
  2. Feasibility studies, development, documenting, prototyping, and road-mapping of SCAP expansions (e.g., remediation capability) and analog protocols (e.g., the Network Event Content Automation Protocol)
  3. Implementation and maintenance support for the Security Automation Content Validation Program
  4. Maintenance support for the SCAP Product Validation Program
  5. Pilot, beta, and production support for SCAP and security automation use-cases
  6. Content development, modification, and testing
  7. Infrastructure and reference implementation development in the Java, C++, and C programming languages
  8. Data trust models and data provenance solutions.

So how do you play?  Well, the first thing is that you respond to the notice with a capabilities statement saying “yes, we have experience in doing what you want” – there is a list of specifics in the original notice.  Then sign up for FedBizOpps and follow the announcement so you can get changes and the RFP when it comes out.




Posted in NIST, Outsourcing | 5 Comments »

Sir Bruce Mentions FDCC, World Goes Nuts

Posted May 7th, 2009 by

Check out this blog post.  Wow, all sorts of crazies descend out of the woodwork when Bruce talks about something that’s been around for years, and suddenly everyone’s redesigning the desktop from the ground up.

Quick recap on comments:

  • 60-day password changes suck
  • You can do this at home, the GPOs are available from NIST
  • My blue-haired sheepdog can’t use the FDCC image, it’s broken for commercial use!
  • You wouldn’t have to do this in Linux
  • Linux is teh suxx0rz
  • My computer started beeping and smoke came out of it, is this FDCC?

Proving once again that you can’t talk about Windows desktop security without it devolving into a flamewar.  Might as well pull out “vi vs. emacs” while you’re at it, Bruce.  =)

Computer Setup photo by karindalziel.  Yes, one of them is a Linux box; I used this picture for that very same reason.  =)

But there is one point that people need to understand.  The magic of FDCC is not the fact that the Government used its IT-buying muscle to get Microsoft to cooperate.  Oh no, that’s to be expected – the guys at MS are used to working with a lot of people on requests by now.

The true magic of FDCC is getting the application vendors to play along.  To wit:

  • The FDCC GPOs are freely available from NIST
  • You can download images from NIST with a preconfigured FDCC setup
  • Application vendors can test their product against FDCC in their own lab
  • There is no external audit burden (yet, it might be coming) for software vendors because it’s a self-certification
  • FDCC-compatible software doesn’t require administrative privileges

In other words, if your software works with FDCC, it’s probably built to run on a security-correct operating system in the first place.  This is a good thing, and in this case the Government is using its IT budget to bring the application vendors up to some sort of minimal security baseline for the rest of the world.
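For the curious, FDCC settings are distributed as SCAP content, so a single setting ends up expressed roughly like the XCCDF rule below.  This is a simplified sketch – the ids and file names are made up, though the overall shape follows XCCDF conventions:

<!-- Simplified sketch of an FDCC setting as an XCCDF rule; ids and
     file names are made up for illustration -->
<Rule id="maximum-password-age" severity="medium"
      xmlns="http://checklists.nist.gov/xccdf/1.1">
  <title>Maximum Password Age</title>
  <description>Passwords must be changed at least every 60 days
    (yes, the setting everyone loves to hate).</description>
  <!-- the actual test logic lives in a companion OVAL file -->
  <check system="http://oval.mitre.org/XMLSchema/oval-definitions-5">
    <check-content-ref href="fdcc-winxp-oval.xml"
                       name="oval:gov.example.fdcc:def:42"/>
  </check>
</Rule>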

This statement is from the FDCC FAQ; comments in parentheses are mine:

“How are vendors required to prove FDCC compliance?
There is no formal compliance process; vendors of information technology products must self-assert FDCC compliance. They are expected to ensure that their products function correctly with computers configured with the FDCC settings. The product installation process must make no changes to the FDCC settings. Applications must work with users who do not have administrative privileges, the only acceptable exception being information technology management tools. Vendors must test their products on systems configured with the FDCC settings, they must use SCAP validated tools with FDCC Scanner capability to certify their products operate correctly with FDCC configurations and do not alter FDCC settings. The OMB provided suggested language in this memo: http://www.whitehouse.gov/omb/memoranda/fy2007/m07-18.pdf, vendors are likely to encounter similar language when negotiating with agencies.”

So really what you get out of self-certification is something like this:




Posted in Technical | 4 Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 5

Posted May 4th, 2009 by

Rybolov Note: this is part 5 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part three here.  Go read part four here.

Themes: I’ve read this thing back and forth, and one theme emerges overall: we’ve talked for the better part of a decade about what it’s going to take to “solve” this problem that is IT security – from an internal Federal Government standpoint, from a military-industrial complex standpoint, from a state and local government standpoint, from a private-sector standpoint, and from an end-user standpoint.  This bill takes some of the best thought on the issue, wraps it all up, and presents it as “if you want to get the job done, this is the way to do it.”

Missing: the role of DHS.  Commerce is heavily represented – over-represented, to my mind.  Looking at the pieces of who owns what:

Commerce security organizations:

NTIA–Technically not a security organization, but they manage the DNS root and set telecom policy.

NIST–They write the standards for security.

FTC–They regulate trade and have oversight over business fraud.

DHS security organizations:

NPPD–They are responsible for critical infrastructure and national risk management.

NCSD–They do the security operations side of our national cybersecurity strategy and run US-CERT. (BTW, hi guys!)

Secret Service–They have primary responsibility for protecting U.S. currency, which also extends to computer crimes against the financial infrastructure.

Science and Technology Directorate–They are responsible for research and development, including IT security.

DOJ security organizations:

FBI–Surprise, they do investigations.

So you see, some of the things that are tasked to Commerce are done today by DHS and DOJ.  This is probably due to the nature of the bill: it was introduced in the Commerce Committee, so it’s understandable that it would be Commerce-centric.

Cost: One thing kept nagging me in the back of my head while going through this bill: the cost to do everything.  We’re asking to do a lot in this bill; now what’s the total cost?  Typically, when a bill makes it out of committee, the Congressional Budget Office attaches a price to the legislation – the total cost, and then the breakdown for the average American household.  That data isn’t published yet on the bill’s page, so we’ll see in the next iteration.

In-Your-Face Politics: Really, this bill is showing us how to do the full security piece.  It includes everything.  It’s challenging people to come up with alternatives.  It’s challenging people to delete the sections that don’t make sense.  It’s challenging people to fix the scope issues.  Like it or hate it, it definitely stirs up debate.

Final Thoughts: S.773 is a pretty decent bill.  It has some warts that need to be fixed, but overall it’s a pretty positive step.

Capitol photo by bigmikesndtech.




Posted in Public Policy | No Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 3

Posted April 30th, 2009 by

Rybolov Note: this is part 3 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part four here.  Go read part 5 here.  =)

SEC. 13. CYBERSECURITY COMPETITION AND CHALLENGE. This section of the bill creates a series of competitions for a range of ages and skills… with cash prizes!  Mostly it’s just the administration of competitions–cash prizes, no illegal activities, etc.

This goes back to the age-old discussions about glorifying illegal activities and giving tools to people who are too young to know how to stay out of jail.

But then again, I know why this section of the bill is in there.  If we want to grow enough security professionals to even remotely keep up with demand, we need to do a much better job of recruiting younger techies to the “security dark side”.  Competitions are a start; the next step is to get them into formal education and apprenticeships to learn from the gray-hairs who have been in the industry for a while.

Once again, the same verbiage about tasking Commerce with leading this effort… I’m not sure they’re the ones to do this.

Verdict: Already happening, although in ad-hoc fashion.  I’m not sold on teaching high school kids to hack, but yeah, we need to do this.

SEC. 14. PUBLIC-PRIVATE CLEARINGHOUSE. Although the title of this sounds really cool, like super-FOIA stuff, it’s really just information-sharing with critical infrastructure owners and operators.

One interesting provision is this:

“The Secretary of Commerce–

(1) shall have access to all relevant data concerning such networks without regard to any provision of law, regulation, rule, or policy restricting such access”

In other words, all your critical infrastructure information belong to Feds.  This is interesting because it can run the range from the Feds asking power grid operators for information and getting what they get, or it can be stretched into justification for auditing of privately-owned critical infrastructure.  I’m pretty sure that they mean the former, but I can see the latter being used at a later stage in the game.

One thing I thought was interesting is that this section only refers to information sharing with critical infrastructure.  There is a big gap here in sharing information with state and local government, local (i.e., non-Federal) law enforcement, and private industry.  I think other sections – most notably Section 5 – deal with this somewhat, but information dissemination has always been a problem: how do you get classified data down to the people who need it to do their jobs but don’t have any level of clearance or trustability other than that they won an election to be sheriff in Lemhi County, Idaho (population 5,000)?  Also reference the Homeland Security Information Network to see how we’re doing this today.

Verdict: Really, I think this section is a way for the Feds to gather information from the critical infrastructure owners and I don’t see much information flow the other way, since the means for the flow to critical infrastructure owners already exists in HSIN.

Capitol photo by rpongsaj.

SEC. 15. CYBERSECURITY RISK MANAGEMENT REPORT. This small section calls for investigation of something that has been bouncing around the security community for some time now: tying security risks into financial statements, cyberinsurance, company liability, etc.

Verdict: Seems pretty benign; I hope it’s not just another case where we report on something and nothing actually happens.  This has the potential to be the big fix for security, because it deals with the business factors instead of the symptoms.

SEC. 16. LEGAL FRAMEWORK REVIEW AND REPORT. This section requires a review of the laws, national-level policies, and, basically, our national-level governance for IT security.  As weird as this sounds, it’s something that needs to be done: once we have a national strategy that aligns with our laws and policies and is then translated into funding and tasks for specific agencies, we might have a chance at fixing things.  The one caveat is that if we don’t act on the report, it will become yet another National Strategy to Secure Cyberspace, where we had lots of ideas but they were never fulfilled.

Verdict: Some of this should have been done in the 60-day Cybersecurity Review.  This is more of the same, and is a perfect task for the Cybersecurity Advisor when the position is eventually staffed.

SEC. 17. AUTHENTICATION AND CIVIL LIBERTIES REPORT. This section is really short, but read it verbatim below; you need to, because this one sentence could change the game considerably.

“Within 1 year after the date of enactment of this Act, the President, or the President’s designee, shall review, and report to Congress, on the feasibility of an identity management and authentication program, with the appropriate civil liberties and privacy protections, for government and critical infrastructure information systems and networks.”

So my take on it is that this is something like REAL-ID and/or HSPD-12, but for critical infrastructure.

My personal belief is that centralized identity management runs contrary to civil liberties and privacy protections: the power of identification lies with the group that issues the identification.  Hence the “rejection” of REAL-ID.

If I operated critical infrastructure, I would definitely protest this section because it gives the Government the decision-making authority on who can access my gear.  Identity and access management is so pivotal to how we do security that there is no way I would give it up.

On the bright side, this section just calls for a feasibility report.

Verdict: Oh man, identification and authentication nation-wide for critical infrastructure?  We can’t even do it in the semi-hierarchical, top-down world of Government agencies, much less across privately-owned critical infrastructure.




Posted in Public Policy | 1 Comment »
