Comments on SCAP 2008

Posted September 24th, 2008 by

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website.

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards:

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  It's asset information and fairly humdrum, but it makes the rest of SCAP possible–think target enumeration and you're pretty close.
  • Common Vulnerabilities and Exposures (CVE): A dictionary of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like standardization of what is a high, medium, and low criticality vulnerability.  (A worked example of the scoring arithmetic follows this list.)
  • Open Vulnerability and Assessment Language (OVAL):  Actually, three schemas that describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A format for describing checklists of vulnerability, benchmark, and misconfiguration checks.  Sounds like the updates to your favorite vulnerability scanning tool because it is.
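
Since CVSS is the part of SCAP you can compute on a napkin, here is the v2 base-score arithmetic as a small Python sketch.  The equations and metric weights are the published CVSS v2 ones; the function name and example vector are just for illustration.

```python
# CVSS v2 base score, per the published v2 equations.
def cvss2_base(av, ac, au, c, i, a):
    """Arguments are the spec's numeric weights for each base metric."""
    impact = 10.41 * (1 - (1 - c) * (1 - i) * (1 - a))
    exploitability = 20 * av * ac * au
    f_impact = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

# AV:N/AC:L/Au:N/C:C/I:C/A:C, the classic remotely-ownable worst case:
# Network=1.0, Low complexity=0.71, No auth=0.704, Complete C/I/A=0.66 each.
print(cvss2_base(av=1.0, ac=0.71, au=0.704, c=0.66, i=0.66, a=0.66))  # 10.0
```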

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops–how do you manage all of the technical information on those without having tools that can import and export from each other?
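
To make the import/export idea concrete, here is a minimal sketch of the receiving end: tallying the definition results in an OVAL results export with the Python standard library.  The file name is hypothetical and the namespace assumes OVAL 5.x results, so check what your scanner actually emits.

```python
# Tally pass/fail results from an OVAL results export (hypothetical file).
import collections
import xml.etree.ElementTree as ET

OVAL_RES = "{http://oval.mitre.org/XMLSchema/oval-results-5}"  # assumed 5.x schema

root = ET.parse("oval-results.xml").getroot()

# Each evaluated definition carries a result attribute (true/false/error/...).
tally = collections.Counter(
    d.get("result", "unknown") for d in root.iter(OVAL_RES + "definition")
)
for result, count in tally.most_common():
    print(f"{result}: {count}")
```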

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =) ), there are roughly 3.5 million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems–there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers of how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by making your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now because everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  Key word is “yet” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products and they had no idea that anybody was working on it–new content gets added practically daily because of the open-source nature of SCAP.
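
For a feel of that conversion legwork, here is a hedged sketch that wraps a single check from a hardening guide in a bare-bones XCCDF Rule.  The IDs and check text are made up, and a real benchmark also needs groups, profiles, and machine-checkable check content; this is just the skeleton.

```python
# Emit one bare-bones XCCDF Rule (illustrative IDs and text).
import xml.etree.ElementTree as ET

NS = "http://checklists.nist.gov/xccdf/1.1"  # assumed XCCDF version
ET.register_namespace("", NS)

rule = ET.Element(f"{{{NS}}}Rule", id="rule-disable-guest-account", severity="medium")
ET.SubElement(rule, f"{{{NS}}}title").text = "The Guest account must be disabled"
ET.SubElement(rule, f"{{{NS}}}description").text = (
    "Check text lifted straight from the prose of a STIG or CIS Benchmark."
)
print(ET.tostring(rule, encoding="unicode"))
```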

Changing Government role: This is going to be controversial.  With NVD/CVE, the government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to be a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not the operations side of things.  Look for DHS to pick up the role of being a definitions provider.  Government has to be careful here because they could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX.  To be honest here, though, most of the vendors in this space cut their teeth on these standards; it's just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic that is the Government flexing its IT budget dollars!

OS and Applications vendors: these guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the applications vendors still haven't figured out that their software is about to be made obsolete in the Government market because they don't have the knowledge base to self-certify with FDCC and later OS standards.  With a 3-year lead time required for some of the desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I'm talking to you and LiveCycle ES–if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride–they just have to deal with FDCC.  Some of them have cool solutions built on the capabilities of SCAP, but for the most part I haven't seen much movement except from people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I'll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputations by having the best vulnerability signatures.  With standardized SCAP feeds providing true compatibility across products, the pure-play security tools have fewer ways to differentiate themselves and fall into a commodity market centered on the accuracy of their checks: fewer false positives and false negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems–CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP, I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 on the NIST campus) all have ear tags.  It brings up all sorts of ideas for scientific studies.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students who are probably reading this right now because I pimped out my blog probably too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.




Posted in DISA, FISMA, NIST, Technical, What Works | 2 Comments »

Backtrack 3 USB Slides

Posted September 12th, 2008 by

Just a quick slideshow I threw together for a class on making a Backtrack 3 bootable USB drive.  I sanitized it for public use, but I figure there’s some reusable content that somebody will thank me for later.

Backtrack 3 USB




Posted in Odds-n-Sods, Technical, What Works | 1 Comment »

Oh Hey, Secure DNS now Mandatory

Posted August 27th, 2008 by

OMB sneaked this one in on me:  OMB Memo 08-23 requires secure DNS (standard .pdf caveat).  Agencies need to submit a plan by September 5th on how they will accomplish this, and the whole switchover should occur by December 2009.
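
For the curious, here is a minimal sketch, assuming the third-party dnspython package, of checking whether a zone publishes a DNSKEY record set.  A DNSKEY answer is a rough signal that the zone is signed, not an actual validation of the chain of trust.

```python
# Rough DNSSEC signal check: does the zone publish a DNSKEY RRset?
import dns.exception
import dns.resolver  # third-party: dnspython 2.x

def has_dnskey(zone: str) -> bool:
    try:
        dns.resolver.resolve(zone, "DNSKEY")  # dnspython 1.x spelled this .query()
        return True
    except dns.exception.DNSException:  # NXDOMAIN, no answer, timeout, ...
        return False

print(has_dnskey("nist.gov"))  # example zone; pick your own agency
```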

The interesting thing to me is that OMB is getting bolder in specifying technical solutions.  Part of me wants to scream because public policy people have no business dictating technical solutions–that’s what we have standards boards and RFCs for.

From what I hear, some of this is because OMB is starting to be a really bad lame duck.  Think about it: what are the odds that anybody at OMB is going to be around in December 2009?  The completely unofficial word on the street is that OMB is pushing last-minute initiatives because of the politicals–they're trying to accomplish things in time for the elections.

Also, I think that OMB is getting tired of NIST's nonspecificity in its guidance.  NIST's generic approach is necessary because of its role as a research organization and a producer of methodologies.

The solution to all this?  Well, the way it happens in the rational world is organic standards boards.  Yes, they have their problems (*cough* WAFs anyone? *cough*) but overall, they fill a place.  Inside Government, we don’t have much of that happening–we have the CIO council and the Enterprise Architecture folks, but nothing security-specific.

Lock Up Your Data photo by psd.

The description of the picture is great and needs to be repeated:

The road passes the temptations of Flash and AIR. Those who succumb or who are unfortunate enough to be lured by Silverlight’s Siren find themselves sold down the river of Rich User Experiences and hurled towards lock-in weir. The TiddlyWiki steps may rescue some, who can then join those who stuck to the main path of Javascript and AJAX for their interactions.

The URI scheme is based on DNS, a registry which has weaknesses, meanwhile the ICANN Fracture results from their greedily adding spurious new Top Level Domains such as .mobi, .jobs, .xxx and even .tel, which whilst generating more revenue (for them) causes mass confusion and threatens to break the opacity of URIs.




Posted in Technical | 2 Comments »

DAM Solutions, Will They Let Me Pick Up Chicks?

Posted April 17th, 2008 by

Sadly, kids, the answer is no.

In fact, it’s worse than that. I think that the DAM integrators the world over will be dateless for decades because the product class is the meta-definition of geekiness.

Let’s look at the skills you need to be the DAM God:

  • You have to know security better than the security people.
  • You have to know databases as well as the SOX auditors. (feel free to chortle here, mkay?)
  • You have to know networks as well as the NIDS guys.
  • You have to know servers and OSs as well as the HIDS people.

Talk about a complex system and a fringe sport filled with fur-toothed geeks such as myself….

Anyway, I’ve been in training all week and I keep thinking “How do I staff a DAM integration project with some of the junior staff?”  Answer is, you don’t–you need some fairly senior people with a wide variety of experience to make DAM products work.




Posted in Technical, What Doesn't Work | No Comments »

Selling Water to People in the Desert

Posted April 15th, 2008 by

Some things should absolutely sell themselves. In the Mojave desert, the guy to be is the one driving the ice cream truck because everybody is happy to see you.

When it comes to the Government, there is one thing that is its lifeblood: it makes and trades secrets. And since 2001, every building in DC has become its own semi-autonomous nation-state with X-ray machines and armed guards.

So why is it so hard to sell Data Leakage Prevention (DLP) and Database Activity Monitoring (DAM) solutions to them? I’ve talked to vendors in both solution spaces, and they’ve found that it’s a hard sell to get product in the door.

If anybody needs DAM and DLP, it’s the Gub’mint. I try not to play this game, but if you look at the PII incidents that meet the Washington Post front page threshold, you’ll see that all of them are preventable with either DAM or DLP or both.

DAM and Leakage Prevention

Photo by Dru

My thoughts on what’s up:

  • Government purchasing lags behind the private sector. Government CPIC works on a 2-year cycle. Keeping in mind that the average life expectancy for a CISO is 2 years, this doesn’t bode well. This is also why it’s so hard to get strategic projects (*cough* redundant data center *cough*) completed.
  • If it’s not in the control catalog, it’s hard to justify buying it. It’s the double-edged sword of compliance. Unless I have all the controls in the catalog implemented, I can’t really justify anything not in the catalog, and once I have all of the catalog done, they yank my budget for somebody who doesn’t have the catalog implemented.
  • It takes approximately 2 years to get a particular technology into the catalog of controls. Even though the catalog (SP 800-53) is revised every year, if NIST thinks that my technology/concept is a good idea, I still have to wait for the next revision.
  • So if you introduce a new technology today, the earliest I could expect to have it implemented is in 4 years, 3 if you’re lucky.
  • Selling to the Government is long and slow (can we say “heavy on bizdev investment”) but has a big payoff: remember that the overall IT budget is just shy of $80Bazillionz.

The winning strategies:

  1. Partnering up with the larger integrators who can bundle your product with an existing outsourcing contract.
  2. Matching up your product description with the catalog of controls. Make it easy for the Government to select your product.
  3. Let NIST and Mitre evaluate your product. Seriously. If you’ve got game, flaunt it.
  4. Invest in BizDev expecting 4 years before you get a return.



Posted in FISMA, Technical, What Doesn't Work, What Works | No Comments »

Government-Wide Monitoring? ‘Bout Time.

Posted April 8th, 2008 by

Good, I’m glad we’re finally doing this.

For those of you watching the other initiatives, this does have something to do with the Trusted Internet Connections initiative–if you can choke traffic down into 50 “sets of tubes,” then it's easier to watch them.

Expect to see more over the next year; the pieces are starting to fall into place.




Posted in Technical, What Works | 1 Comment »
