Database Activity Monitoring for the Government

Posted November 11th, 2008 by

I’ve always wondered why I have yet to meet anyone in the Government using Database Activity Monitoring (DAM) solutions, even though the Government has some of the largest, most sensitive databases around.  I’m going to try to lay out why I think it’s a great idea for the Government to court the DAM vendors.

Volume of PII: The Government owns huge databases that are usually authoritative sources.  While the private sector laments the leaks of Social Security Numbers, let’s stop and think for a minute.  There is A database inside the Social Security Administration that holds everybody’s number and is THE database where SSNs are assigned.  DAM can help here by flagging queries that retrieve large sets of data.

Targeted Privacy Information:  Remember the news reports about people looking at the presidential candidates’ passport information?  Because of the depth of PII that the Government holds about any one individual, it presents a phenomenal opportunity for invasion of someone’s privacy.  DAM can help here by flagging VIPs and sending an alert anytime one of them is searched for. (DHS guys, there’s an opportunity for you to host the list under LoB)

Sensitive Information: Some Government databases come from classified sources.  If you were to look at all that information in aggregate, you could determine the classified version of events.  And then there are the classified databases themselves.  Think about Robert Hanssen attacking the Automated Case Support system at the FBI; a proper DAM implementation would have noticed the activity.  One interesting DAM rule here:  queries where the user is also the subject of the query.
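To make this concrete, here is a minimal sketch of the three rules described above, assuming a DAM product can hand us audit events containing the querying user, the subject of the query, and the result-set size.  All of the field names, thresholds, and watchlist entries are hypothetical, not taken from any particular product:

    # Hypothetical sketch of the three DAM rules above; field names,
    # thresholds, and watchlist entries are illustrative only.
    VIP_WATCHLIST = {"subject-0042"}   # hypothetical VIP identifiers
    BULK_THRESHOLD = 10_000            # rows returned before we call it "bulk"

    def evaluate_query_event(event: dict) -> list[str]:
        """Return the alert names triggered by one audited query event."""
        alerts = []
        # Rule 1: queries that retrieve large sets of data (bulk PII pulls).
        if event["row_count"] > BULK_THRESHOLD:
            alerts.append("bulk-retrieval")
        # Rule 2: anyone searching for a VIP on the watchlist.
        if event["subject_id"] in VIP_WATCHLIST:
            alerts.append("vip-lookup")
        # Rule 3: the user is also the subject of the query (the Hanssen case).
        if event["user_id"] == event["subject_id"]:
            alerts.append("self-query")
        return alerts

    # Example: an employee pulling their own record trips rule 3.
    print(evaluate_query_event(
        {"user_id": "jdoe", "subject_id": "jdoe", "row_count": 1}))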

Financial Data:  The Government moves huge amounts of money, well into the trillions of dollars.  We’re not just talking about internal purchasing controls; it’s usually programs where the Government buys something or… I dunno… “loans” $700B to the financial industry to stay solvent.  All that data is stored in databases.

HR Data:  Being one of the largest employers in the world, the Government is sitting on one of the largest repositories of employee data anywhere.  That’s all in databases, and DAM can help.


Guys, DAM in the Government just makes sense.


Problems with the Government adopting/using DAM solutions:

DAM not in catalog of controls: I’ve mentioned this before; it’s the double-edged nature of a catalog of controls that it’s hard to justify any kind of security that isn’t explicitly stated in the catalog.

Newness of DAM:  If it’s new, I can’t justify it to my management and my auditors.  This will get fixed in time; let the hype cycle run itself out.

Historical DAM Customer Base:  It’s the “Look, I’m not a friggin’ bank” problem again.  DAM vendors don’t actively pursue or understand Government clients; they’re usually looking for customers needing help with SOX and PCI-DSS controls.


London is in Our Database photo by Roger Lancefield.




Posted in Rants, Risk Management, Technical, What Works | 2 Comments »

In Which Our Protagonist Discovers We Need More Good Public Policy People Who Understand Security

Posted November 4th, 2008 by

Note the emphasis on good.  Note the emphasis on public policy.

Yes, folks, we need good policy people.  Think about the state of security and public policy today:

  • We have FISMA, which is a law.  It’s everybody’s whipping boy, but it’s exactly where it needs to be to enable risk-based management of IT security.
  • We have a framework for implementing FISMA.  It’s a pretty good set of process, policy, and standards that has spilled over into the private sector.
  • You need a crowbar to get good/smart security people to deal with politics; it takes a death ray to get them to deal with public policy.
  • We don’t have high-level policy-makers who understand risk management, so they are co-opting the model of compliance instead.
  • Public policy is the upstream neighbor of information security, and what public policy people do influences what we do.
  • If we want to succeed in security at the operational and tactical level, we need the right decisions made at the strategic level, and that includes public policy.
  • I’m not just talking about security and the Government; this also extends to things like breach laws, compliance frameworks (PCI, HIPAA), and how unpatched and zombified desktops hurt everybody else.

So in true Guerilla CISO style, I’m doing something about it.  Armed with my favorite govie (who is actually the lead on this, I’m just a straphanger), The New School of Information Security (Hi Adam and Andrew), some government policy directives, and the National Strategy to Secure Cyberspace, I am teaching an Information Security Management and Public Policy class for Carnegie Mellon’s Heinz School.

The more I work with the Masters of Science in Public Policy Management program, the more I’m sold on it.  Basically the students do a year on-campus in Pittsburgh, then they have the option of staying there or coming to DC.  The students who come to DC work a 32-hour week (some do more), take 2 night classes, and have class for most of Friday.  Our information security class fits in as a sector-specific deep-dive, the other one being healthcare (which needs smart public policy people, too).

Which is where we need some help.  It’s a little behind the game, but we’re constantly looking for Government agencies, NGOs/NPOs, and contractors who are interested in taking on interns.  Even better if you have jobs that don’t have a US citizenship requirement.  If you want to be linked up, just drop me a line.

And oh yeah, my blogging has slowed down because I’m working on 2 new projects, traveling to Tennessee, and teaching Thursday nights, and my life just got way busy.  =)


Alexander Hamilton Statue photo by dbking.




Posted in The Guerilla CISO, What Works | No Comments »

NIST and SCAP; SCAP @ Large Part 2

Posted October 2nd, 2008 by

There is another challenge that SCAP addresses without it being officially on the SCAP program’s agenda.  With the advent of SCAP, we now have common reporting criteria by which we can judge SCAP-certified products.  If you have ever used an automated vulnerability scanner as part of a penetration test or security audit, you know that not all vulnerability scanners are created equal.  Some have much lower false positive alert and reporting rates than others.  Likewise, false negative alert and reporting rates are known to vary.  And, because of the various technical approaches taken by the scanners, some provide much more consistent results.  The challenge has been that without common criteria to test against, it is difficult for a small or even fairly large security organization to find the resources to test these products effectively in a fair, apples-to-apples comparison.
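To make the comparison concrete, here is a minimal sketch of the scoring arithmetic: given a ground-truth list of vulnerabilities known to exist in a test environment and the findings a scanner reports, the false positive and false negative rates fall out of simple set operations.  The CVE IDs and numbers below are made up for illustration.

    def scanner_accuracy(ground_truth: set[str], reported: set[str]) -> dict:
        """Score one scanner's findings against the known vulnerabilities."""
        false_positives = reported - ground_truth    # alerted, but not really there
        false_negatives = ground_truth - reported    # really there, but missed
        return {
            "detection_rate": 1 - len(false_negatives) / len(ground_truth),
            "false_positive_rate": len(false_positives) / max(len(reported), 1),
            "missed": sorted(false_negatives),
        }

    # Made-up example: a test target with three known vulnerabilities.
    known = {"CVE-2008-0001", "CVE-2008-0002", "CVE-2008-0003"}
    scanner_a = {"CVE-2008-0001", "CVE-2008-0002", "CVE-2008-9999"}  # one FP, one miss
    print(scanner_accuracy(known, scanner_a))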

This is where NIST has a real opportunity on its hands.  With the release of the SCAP protocol, we have the criteria by which performance comparisons can be made.  What we are lacking is a common test environment.

Benchmark photo by bzo.

Let me veer off-topic for a moment to provide some background.  In the last few years the Linux community has created various “live distributions” for specialized requirements.  A live distribution is a CD-, DVD-, or flash-media-based operating system that boots and runs directly from that media.  So, by using a Linux live distribution, you can run Linux on your home Windows-based laptop without ever installing Linux to your hard disk.  This has opened up a world of specialized possibilities for this community.  One of them is the standardized training environment.  For example, security testers have created DVL (Damn Vulnerable Linux, http://www.damnvulnerablelinux.org/), a live distribution with well-documented security vulnerabilities that is used as a training aid for teaching vulnerability assessment and mitigation.  There are other similar efforts created with the same intent, such as the excellent De-ICE training targets (http://de-ice.net/hackerpedia/index.php/De-ICE.net_PenTest_Disks).

NIST could follow up on the release of the SCAP protocol by also building and releasing a common test environment, perhaps based on live-distribution technology.  Such an environment, with well-documented vulnerabilities, would allow objective benchmarks to be created to rate the accuracy, reproducibility, and completeness of the results of SCAP-certified vulnerability testing and reporting products.  This would aid government agencies, businesses, and even individuals in their purchasing decisions.  It would also provide vendors with an objective and common test environment in which to test and improve their products.  I admit this would be a significant undertaking for NIST.  However, such a test environment could be designed so that it is built and released as a series of interoperable modules based on live-distribution technology.  The initial release might offer only a relatively modest set of tests, but with each module building on the results of previous releases, a highly demanding and sophisticated test environment could soon be realized.  Because of the importance and utility of such a project, industry and outside security experts might want to participate in and contribute to the endeavor.





Posted in NIST, Technical, What Works | No Comments »

NIST and SCAP; Busting a cap on intruders Part 1

Posted October 1st, 2008 by

I was attending a conference at NIST (the National Institute of Standards and Technology) concerning the SCAP program (Security Content Automation Protocol; pronounced ESS-cap).  SCAP is focused on providing the Federal government with automated, common, interoperable security solutions.  Specifically, the SCAP program has developed a common set of standards for reporting security vulnerabilities for use in automated security scanners, security appliances, and reporting systems.

Well, why do we need SCAP?  If we use the Godfather of all vulnerability management tools, the Nessus vulnerability scanner, as an example, we have seen that industry has produced a number of similar products.  Each has its own strengths and rich feature set.  However, none of them use the same “language” for detecting, describing, or reporting a potential vulnerability.  This means not only that these various products interoperate with each other with some measure of difficulty, but also that trying to aggregate and manage the reports from these systems can be tedious.
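As a toy illustration of the aggregation problem, imagine two scanners that each describe the same flaw in their own vocabulary.  Without a common identifier (here, the CVE ID), merging their reports means maintaining mapping tables by hand; the plugin names below are invented for the example.

    # Two scanners reporting the same flaw under different, vendor-specific
    # names; the plugin IDs are invented, the CVE is the common language.
    SCANNER_A_TO_CVE = {"win-smb-overflow-17": "CVE-2008-4250"}
    SCANNER_B_TO_CVE = {"MS-NETAPI-RCE": "CVE-2008-4250"}

    def normalize(findings: list, mapping: dict) -> set:
        """Convert (host, vendor_specific_id) findings into (host, CVE) pairs."""
        return {(host, mapping[vid]) for host, vid in findings if vid in mapping}

    report_a = [("10.0.0.5", "win-smb-overflow-17")]
    report_b = [("10.0.0.5", "MS-NETAPI-RCE")]

    merged = normalize(report_a, SCANNER_A_TO_CVE) | normalize(report_b, SCANNER_B_TO_CVE)
    print(merged)  # one finding, not two, once everything is keyed on the CVE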

“Tim Bray at XML 2005” photo by Roland.

As a result of the efforts and vision of the dedicated employees at NIST, industry is already scrambling to get their related products SCAP certified.  And Federal agencies are also specifying in contracts that products must be SCAP certified in order to be qualified for purchase.  This is real progress and great news for the taxpayer, who will get better value for their tax dollar.  But it is not a revolution yet.  Where I see the revolution emerging is in the six-month-to-a-year time frame, when industry takes note of the SCAP program and we begin to see SCAP-certified and SCAP-interoperable products being ordered.  It will not be long after that before we see the SCAP protocol used in even consumer-level products like personal firewalls.  This ability to give us all a common language will significantly reduce the cost of building and supporting vulnerability scanners and vulnerability reporting tools.  This cost reduction will free up resources to address prevention and mitigation concerns in a more meaningful manner.

For example, industry has tools that enable network and security support professionals to detect a mis-configuration in a desktop machine on their network and correct it.  But only the largest and best-funded IT security departments have such tools.  With the advent of SCAP, these kinds of services will be much more affordable and supportable, and thus more common.  In fact, because much of this can be automated, I can even envision McAfee, Symantec, and others who are well placed in the vulnerability scanning market offering support services over the wire to smaller businesses and to consumers.  Moreover, as this technology improves and becomes commoditized, I can see ISPs offering security scanning and remediation as a service to their customers.




Posted in NIST, Technical, The Guerilla CISO, What Works | No Comments »

Comments on SCAP 2008

Posted September 24th, 2008 by

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website.

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards (a small parsing sketch follows the list):

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  Asset information, it’s fairly humdrum, but it makes the rest of SCAP possible; think target enumeration and you’re pretty close.
  • Common Vulnerabilities and Exposures (CVE): A definition of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like a standardization of what counts as a high-, medium-, or low-criticality vulnerability.
  • Open Vulnerability and Assessment Language (OVAL):  Actually, 3 schemas to describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A data set that describes checks for vulnerabilities, benchmarks, or misconfigurations.  Sounds like the updates to your favorite vulnerability scanning tool, because it is.
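Two of these formats are simple enough to pick apart with bare string handling, which makes a nice demonstration of why machine-readable standards matter.  The sketch below parses a CPE 2.2-style URI and a CVSSv2 base vector; it does no validation and deliberately skips the official XML schemas, so treat it as illustrative only.

    # CPE 2.2 URIs look like cpe:/{part}:{vendor}:{product}:{version}...,
    # where part is h (hardware), o (operating system), or a (application).
    def parse_cpe(uri: str) -> dict:
        fields = ["part", "vendor", "product", "version", "update", "edition"]
        return dict(zip(fields, uri.removeprefix("cpe:/").split(":")))

    # CVSSv2 base vectors are slash-separated metric:value pairs.
    def parse_cvss2_vector(vector: str) -> dict:
        return dict(metric.split(":") for metric in vector.split("/"))

    print(parse_cpe("cpe:/o:microsoft:windows_2003"))
    print(parse_cvss2_vector("AV:N/AC:L/Au:N/C:P/I:P/A:P"))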

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops; how do you manage all of the technical information on those without having tools that can import and export from each other?
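Here is a toy sketch of the tail end of that pipeline: per-host compliance results (imagine OVAL output already reduced to pass/fail per rule) rolled up into the kind of summary a CISO would track.  The result format and rule names are invented shorthand, not actual OVAL.

    from collections import Counter

    # Invented shorthand standing in for what a tool would export in OVAL
    # after checking an XCCDF policy against each desktop.
    results = [
        {"host": "wk-0001", "failed_rules": ["password-min-length", "autorun-enabled"]},
        {"host": "wk-0002", "failed_rules": ["autorun-enabled"]},
        {"host": "wk-0003", "failed_rules": []},
    ]

    failures = Counter(rule for r in results for rule in r["failed_rules"])
    compliant = sum(1 for r in results if not r["failed_rules"])

    print(f"{compliant}/{len(results)} hosts fully compliant")
    for rule, count in failures.most_common():
        print(f"{rule}: failing on {count} hosts")  # the worst-offenders list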

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =)  ), there are roughly 3.5 million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems; there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The amount of uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers on how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by making your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now: everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  The key word is “yet,” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products; they had no idea that anybody was working on it.  New content gets added practically daily because of the open-source nature of SCAP.
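As a hint at why that conversion is fairly trivial, here is a sketch that turns a hardening rule of the kind found in a STIG or CIS Benchmark into a bare-bones XCCDF-style Rule element.  The output is heavily simplified; it omits namespaces and most required attributes, so it is not schema-valid XCCDF, just the shape of the idea.

    import xml.etree.ElementTree as ET

    def rule_to_xccdf(rule_id: str, title: str, description: str) -> str:
        """Emit a bare-bones, XCCDF-flavored Rule element (not schema-valid)."""
        rule = ET.Element("Rule", id=rule_id, severity="medium")
        ET.SubElement(rule, "title").text = title
        ET.SubElement(rule, "description").text = description
        return ET.tostring(rule, encoding="unicode")

    print(rule_to_xccdf(
        "rule-password-min-length",
        "Minimum password length",
        "Passwords must be at least 12 characters long.",
    ))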

Changing Government role: This is going to be controversial.  With NVD/CVE, the Government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to become a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not on the operations side of things.  Look for DHS to pick up the role of definitions provider.  The Government has to be careful here because it could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX.  To be honest here, though, most of the vendors in this space cut their teeth on these standards; it’s just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic of the Government flexing its IT budget dollars!

OS and application vendors: These guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the application vendors still haven’t figured out that their software is about to be made obsolete in the Government market because they don’t have the knowledge base to self-certify against FDCC and later OS standards.  With a 3-year lead time required for some desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I’m talking to you and LiveCycle ES; if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride; they just have to deal with FDCC.  There are some of them that have cool solutions built on the capabilities of SCAP, but for the most part I haven’t seen much movement except from people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I’ll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputations by having the best vulnerability signatures.  With true compatibility across products, standardized SCAP feeds mean that the pure-play security tools have fewer ways to differentiate themselves from all the other tools, and they fall into a commodity market centered on the accuracy of their checks, with reduced false positives and negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems, CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP; I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 deer on the NIST campus) all have ear tags.  It brings up all sorts of ideas for scientific studies.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students, who are probably reading this right now because I pimped out my blog too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.




Posted in DISA, FISMA, NIST, Technical, What Works | 2 Comments »

Next Up in Security Legislation: S3474

Posted September 15th, 2008 by

And here we have it: a bill introduced by Senators Carper and Lieberman to increase security in the Government, known as FISMA 2008.  I’m still waiting for the text to appear on the Thomas entry, but I’ll go through the major provisions from the Congressional Record.

Article from NextGov

Thomas Reference

Congressional Record of the Bill’s Introduction and text (Starts on CR 8388 and goes through CR 8391)

Major provisions:

  • Changes some definitions of “assessment”, “audit” and “evaluation”. OK, I had to do some research on this one.  Thankfully, this is all online.  Sidenote: it’s not Section 3545 as per the bill, it’s Section 3535.  Basically this is just a rewording and rescoping of annual audits, written the way it should have been in the first place.
  • Creates a CISO position at each agency. Hey, I thought this was already created by FISMA.  What we need are not CISOs who work for the CIO; what we need are agency CSOs (I’ll even take an agency Chief Risk Officer) who have authority over all of security, not just IT geek concerns.
  • Creates a CISO Council. Fantastic idea; how come I didn’t think of it?
  • Qualifications for CISOs. Not a bad idea, but the bill doesn’t elaborate much.
  • Responsibilities for CISOs. This is an interesting section.  Much of this is in guidance from NIST/DISA/CNSS already.  I like most of these measures, but I’m not sure they need to be codified into law, except for the pieces that reside outside the agencies, like coordination with US-CERT.  Putting the CISO’s responsibilities into law does give the CISO more teeth if they need it, but you have to wield the law carefully.

The Law

The Law photo by F.S.M.

From the NextGov article and the congressional record:

“Our bill empowers chief information security officers to deny access to the agency network if proper security policies are not being followed. If we are going to hold these hardworking individuals accountable in Congress for information security, then we should give them the authority to do so,” said Carper.

Um, yeah, we’ve given them the authority in this bill, but my problem is that it completely removes the DAA/AO/mission owners from the picture; the CISO is now responsible for the secure operation of IT systems and has disconnect authority.

I think that philosophically this bill is a step backwards.  The more progressive thought is that security is the responsibility of the agency head and the mission owners, and that the CISO just provides support as a subject matter expert.  Under this bill, we’re back to a world where the CISO is the sole decision-maker when it comes to security.  Wow, that’s so… 1990s-ish.

However, we all know that the CISOs are the people getting the security job done from day to day, and this bill makes sense if you assume that the agency heads and DAAs/AOs have zero interest or skill to assist in securing their data.  That might or might not be true; I’ll leave it up to you to decide.

Questions for today are these (and yes, I want to hear what you think):

  • Are we willing to scrap the “business/system owner” concepts that our security management processes are modeled around?
  • Are we willing to admit that the DAA/AO concept is a failure because of a lack of understanding and capability on their part?
  • Are the mission owners willing to take an outage on their supporting IT infrastructure because the CISO took the system offline after they didn’t secure it properly in the first place?
  • Can we rely on a management technique where the stakeholders are removed from decision-making in favor of a trained expert?



Posted in FISMA, What Doesn't Work, What Works | 4 Comments »
