Et Tu, TIC?

Posted October 7th, 2008 by

Let’s talk about TIC today, dear readers, for I smell a conspiracy theory brewing.

For those of you who missed the quick brief, TIC is short for “Trusted Internet Connections” and is an architecture model/mandate/$foo to take all of the Internet connections in the Government (srsly, nobody knows how many of them really exist, but it’s somewhere in the 2,000-10,000 range) and consolidate them into 50.  These connections will then be monitored by DHS’s Einstein program.

No, Not That Kind of TIC photo by m.prinke.

Bringing you all up to date, you’ll need to do some homework:

Now having read all of this, some things become fairly obvious:

  • Count the people needing connections:
    • 24 agencies, plus
    • DoD with 2 points of presence, plus
    • the intelligence agencies with a handful of Internet connections.
  • That means basically everybody gets one Internet connection.  This is not good; it’s all single point-of-DoS.
  • Agencies have been designated as Internet providers for other agencies.  Sounds like LoB in action.
  • Given the amount of traffic going through the TIC access points, it most likely is going to take a significant amount of hardware to monitor all these connections–maybe you saved 50% of the monitoring hardware by reducing the footprint, but it’s still hardware-intensive.
  • TIC is closely tied with the Networx contract.
  • In order to share Internet connections, there needs to be a network core between all of the agencies so that an agency without a TIC access point can route through multiple TIC service provider agencies.

And this is where my conspiracy theory comes in:  TIC is more about making a grand unified Government network than it is about monitoring events–Einstein is just an intermediate goal.  If you think about it, this is where the Government is headed.

We were headed this way back in ought-two with a wonderful name: GovNet.  To be honest, the groundwork wasn’t there; the idea was way ahead of its time and died a horrible death.  But it’s gradually starting to happen, thanks to TIC, FDCC, and Einstein.

More fun links:

If you want to get a reaction out of the OMB folks, mention GovNet and watch them backpedal and cringe–I think the pain factor was very high for them on GovNet.  So I think that we should, as a cadre of information security folks, start calling TIC what it really is:  GovNet 2.0!  =)



Posted in Technical | 2 Comments »

Note to Government: Learn What RSS Means

Posted September 30th, 2008 by

Let’s talk process flow rants here for a second.

Every couple of weeks, OMB issues a new memorandum setting policy for the executive branch.  They release the memo to the agency heads who then respond–a policy-based syn/ack/fin loop if you will.

All the memos are posted on the OMB website at this URL.

Notice something missing?  Where is the RSS feed?  Seriously, if you want somebody other than just the agency heads to be informed of all your newest issues, you need to make it subscribable either via email or RSS feed.  There are tons of people who would subscribe, and it makes your job that much easier.
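Just to show how low the bar is, here’s a rough sketch in Python of what a bare-bones RSS 2.0 feed of the memo list could look like.  The memo titles and links below are made-up placeholders, not the real OMB pages:

# Minimal sketch of an RSS 2.0 feed for new memos.
# The titles and URLs are hypothetical placeholders, not real OMB links.
import xml.etree.ElementTree as ET

memos = [
    {"title": "M-08-22: Guidance on FDCC", "link": "https://example.gov/omb/memoranda/m-08-22"},
    {"title": "M-08-23: DNS Infrastructure", "link": "https://example.gov/omb/memoranda/m-08-23"},
]

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "OMB Memoranda"
ET.SubElement(channel, "link").text = "https://example.gov/omb/memoranda/"
ET.SubElement(channel, "description").text = "New memoranda as they are issued"

for memo in memos:
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = memo["title"]
    ET.SubElement(item, "link").text = memo["link"]

# Write the feed out; any feed reader can then subscribe to the resulting file.
ET.ElementTree(rss).write("omb-memos.xml", encoding="utf-8", xml_declaration=True)

Point a feed reader at the output and you’re done; the hard part is remembering to regenerate it whenever a new memo drops.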

This rant was brought to you by the letters R and S.  =)



Posted in Rants | 3 Comments »

Comments on SCAP 2008

Posted September 24th, 2008 by

I just got back from the SCAP 2008 conference at NIST HQ, and this is a collection of my thoughts in a somewhat random order:

Presentation slides are available at the NVD website.

I blogged about SCAP a year ago, and started pushing it in conversations with security managers that I came across.  Really, if you’re managing security of anything and you don’t know what SCAP is, you need to get smart on it really fast, if for no other reason than that you will be pitched it by vendors sporting new certifications.

Introduction to SCAP:  SCAP is a collection of XML schemas/standards that allow technical security information to be exchanged between tools.  It consists of the following standards:

  • Common Platform Enumeration (CPE): A standard to describe a specific hardware, OS, and software configuration.  It’s asset information and fairly humdrum, but it makes the rest of SCAP possible–think target enumeration and you’re pretty close.
  • Common Vulnerabilities and Exposures (CVE): A definition of publicly-known vulnerabilities and weaknesses.  Should be familiar to most security researchers and patch monkeys.
  • Common Configuration Enumeration (CCE): Basically, like CVE but specific to misconfigurations.
  • Common Vulnerability Scoring System (CVSS): A standard for determining the characteristics and impact of security vulnerabilities.  Hmmm, sounds suspiciously like standardization of what is a high, medium, and low criticality vulnerability.
  • Open Vulnerability and Assessment Language (OVAL):  Actually, 3 schemas to describe the inventory of a computer, the configuration on that computer, and a report of what vulnerabilities were found on that computer.
  • Extensible Configuration Checklist Description Format (XCCDF): A data set that describes checks for vulnerabilities, benchmarks, or misconfigurations.  Sounds like the updates to your favorite vulnerability scanning tool because it is.
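To make the alphabet soup a little more concrete, here’s a quick sketch of how a tool might pull apart a CPE 2.2 URI into its named fields.  The example name is illustrative only; the authoritative spellings live in the official CPE dictionary:

# CPE 2.2 URIs look like cpe:/{part}:{vendor}:{product}:{version}:{update}:{edition}:{language},
# with trailing fields optional.  The example below is illustrative, not an official dictionary entry.
FIELDS = ["part", "vendor", "product", "version", "update", "edition", "language"]

def parse_cpe_uri(cpe):
    if not cpe.startswith("cpe:/"):
        raise ValueError("not a CPE 2.2 URI")
    components = cpe[len("cpe:/"):].split(":")
    components += [""] * (len(FIELDS) - len(components))  # pad out the optional fields
    return dict(zip(FIELDS, components))

print(parse_cpe_uri("cpe:/o:microsoft:windows_server_2003::sp2"))
# {'part': 'o', 'vendor': 'microsoft', 'product': 'windows_server_2003',
#  'version': '', 'update': 'sp2', 'edition': '', 'language': ''}

Every other piece of SCAP content points back at names like these, which is why the humdrum asset piece matters.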

Hall of Standards inside NIST HQ photo by ME!!!

What’s the big deal with SCAP: SCAP allows data exchanges between tools.  So, for example, you can take a technical policy compliance tool, load up the official Government hardening policy in XCCDF for, say, Windows 2003, run a compliance scan, export the data in OVAL, and load the results into a final application that can help your CISO keep track of all the vulnerabilities.  Basically, imagine that you’re DoD and have 1.5 million desktops–how do you manage all of the technical information on those without having tools that can import and export from each other?
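If you’re wondering what the “load the results into a final application” step actually involves, here’s a rough sketch that tallies rule results out of an XCCDF results file.  I’m assuming the XCCDF 1.1 namespace and the rule-result/result layout here; check what your scanner actually exports before you trust the numbers:

import sys
from collections import Counter
import xml.etree.ElementTree as ET

# Assumed namespace for XCCDF 1.1 content; adjust if your tool emits a different version.
NS = {"xccdf": "http://checklists.nist.gov/xccdf/1.1"}

def summarize(results_file):
    # Walk every <rule-result> in the results document and count the outcomes.
    tree = ET.parse(results_file)
    tally = Counter()
    for rule_result in tree.getroot().iter("{%s}rule-result" % NS["xccdf"]):
        result = rule_result.find("xccdf:result", NS)
        if result is not None and result.text:
            tally[result.text] += 1
    return tally

if __name__ == "__main__":
    print(summarize(sys.argv[1]))  # e.g. Counter({'pass': 212, 'fail': 14})

Multiply that little report by a DoD-sized desktop fleet and you can see why the import/export story is the whole point.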

And then there was the Federal Desktop Core Configuration (FDCC): OMB and Karen Evans handed SCAP its first trial-by-fire.  FDCC is a configuration standard that is to be rolled out to every Government desktop.  According to responses received by OMB from the departments in the executive branch (see, Karen, I WAS paying attention =)   ), there are roughly 3.5 Million desktops inside the Government.  The only way to manage these desktops is through automation, and SCAP is providing that.

He sings, he dances, that Tony Sager is a great guy: So he’s presented at Black Hat, now SCAP 2008 (.pdf caveat).  Basically, while the NSA has a great red-team (think pen-test) capability, they had a major change of heart and realized, like the rest of the security world (*cough*Ranum*cough*), that while attacking is fun, it isn’t very productive at defending your systems–there is much more work to be done for the defenders, and we need more clueful people doing that.

Vendors are jumping on the bandwagon with both feet: The amount of uptake from the vulnerability and policy compliance vendors is amazing.  I would give numbers of how many are certified, but I literally get a new announcement in my news reader every week or so.  For vendors, being certified means that you can sell your product to the Government; not being certified means that you get to sit on the bench watching everybody else have all the fun.  The GSA SAIR Smart-Buy Blanket Purchase Agreement sweetens the deal immensely by having your product easily purchasable in massive quantities by the Government.

Where are the rest of the standards: Yes, FDCC is great, but where are the rest of the hardening standards in cute importable XML files, ready to be snarfed into my SCAP-compliant tool?  Truth be told, this is one problem with SCAP right now because everybody has been focusing on FDCC and hasn’t had time yet to look at the other platforms.  Key word is “yet” because it’s happening real soon now, and it’s fairly trivial to convert the already-existing DISA STIGs or CIS Benchmarks into XCCDF.  In fact, Sun was blindsided by somebody who had made some SCAP schemas for their products and they had no idea that anybody was working on it–new content gets added practically daily because of the open-source nature of SCAP.

Changing Government role: This is going to be controversial.  With NVD/CVE, the government became the authoritative source for vulnerabilities.  So far that’s worked pretty well.  With the rest of SCAP, the Government changes roles to be a provider of content and configurations.  If NIST is smart, they’ll stay out of this because they prefer to be in the R&D business and not the operations side of things.  Look for DHS to pick up the role of being a definitions provider.  Government has to be careful here because they could in some instances be competing with companies that sell SCAP-like feed services.  Not a happy spot for either side of the fence.

More information security trickle-down effect: A repeated theme at SCAP 2008 is that the private sector is interested in what Big SCAP can do for them.  The vendors are using SCAP certification as a differentiator for the time being, but expect to see SCAP for security management standards like PCI-DSS, HIPAA, and SOX–to be honest here, though, most of the vendors in this space cut their teeth on these standards, so it’s just a matter of legwork to be able to export in SCAP schemas.  Woot, we all win thanks to the magic that is the Government flexing its IT budget dollars!

OS and Applications vendors: These guys are feeling the squeeze of standardization.  On one hand, the smart vendors (Oracle, Microsoft, Sun, Cisco) have people already working with DISA/NSA to help produce the configuration guides; they just have to sit back and let somebody turn the guides into SCAP content.  Some of the applications vendors still haven’t figured out that their software is about to be made obsolete in the Government market because they don’t have the knowledge base to self-certify with FDCC and later OS standards.  With a 3-year lead time required for some of the desktop applications before a feature request (make my junk work with FDCC) makes it into a product release, there had better be some cluebat work going on in the application vendor community.  Adobe, I’m talking to you and LiveCycle ES–if you need help, just call me.

But how about system integrators: Well, for the time being, system integrators have almost a free ride–they just have to deal with FDCC.  There are some of them that have some cool solutions built on the capabilities of SCAP, but for the most part I haven’t seen much movement except for people who do some R&D.  Unfortunately for system integrators, the Federal Acquisition Regulation now requires that anything you sell to the Government be configured IAW the NIST checklists program.  And just how do you think the NIST checklists program will be implemented?  I’ll take SCAP for $5Bazillion, Alex.  Smart system integrators will at least keep an eye on SCAP before it blindsides them 6 months from now.

Technical compliance tools are destined to be a commodity: For the longest time, the vulnerability assessment vendors made their reputation by having the best vulnerability signatures.  With true compatibility across products and standardized SCAP feeds, the pure-play security tools have fewer ways to differentiate themselves from all the other tools, and they fall into a commodity market centered on the accuracy of their checks with reduced false positives and negatives.  While it may seem like a joyride for the time being (hey, we just got our ticket to sell to the Gubmint by being SCAP-certified), that will soon turn into frustration as the business model changes and the margins get smaller.  Smart vendors will figure out ways to differentiate themselves and will survive; the others will not.

Which leads me to this: Why is it that SCAP only applies to security tools?  I mean, seriously, guys like BigFix and NetIQ have crossover from technical policy compliance to network management systems–CPE in particular.  What we need is a similar effort applied to network and data center tools.  And don’t point me at SNMP, I’m talking rich data.  =)  On a positive note, expect some of the security pure-play tools to be bought up and incorporated into enterprise suites if they aren’t already.

Side notes:

I love how the many deer (well over 9000 deer on the NIST campus) all have ear tags.  It brings up all sorts of ideas for scientific studies.  But apparently the deer are on birth control shots or something….

Former Potomac Forum students:  Whattayaknow, I met some of our former students who are probably reading this right now because I pimped out my blog probably too aggressively.  =)  Hi Shawn, Marc, and Bob!

Old friends:  Wow, I found some of them, too.  Hi Jess, Walid, Chris, and a cast of thousands.

Deer on NIST Gaithersburg Campus photo by Chucka_NC.



Posted in DISA, FISMA, NIST, Technical, What Works | 2 Comments »

Keeping The Lights On: Cybersecurity Law for the Electric Grid

Posted September 23rd, 2008 by

Ever wondered if your electricity supply was safe from computer attack? Congress wondered that too. So they asked the Federal Energy Regulatory Commission (FERC) to find out. The answers they received in October of 2007 were not encouraging.

After 9/11 there was concern about the safety of the Bulk Power System (BPS). The President’s Commission on Critical Infrastructure Protection released a report which was explicit about the dangers faced. A frightening example of these dangers was demonstrated by the Aurora vulnerability, essentially a software hack that made a generator crash hard. When faced with this example, industry moved to mitigate the problem with some prodding from the Department of Homeland Security (DHS), the Nuclear Regulatory Commission (NRC), and FERC. The Nuclear Sector, which is regulated by NRC, issued a requirement to address the problem. The Electric Sector was issued a recommendation to address the problem by the Electric Sector Information Sharing and Analysis Center (ES-ISAC). Guess which industry has moved forward with successful mitigation efforts and which has not. FERC reported back on these findings in October of 2007.

Fast forward to now. On September 11th the Bulk Power System Protection Act (BPSPA) of 2008 (PDF link) was put forward by Rep. Rick Boucher (D-VA), chairman of the House Subcommittee on Energy and Air Quality. In addition to the September 11th hearing on the BPSPA, a closed-door hearing was expected to be conducted the following week. The goal of this legislation is to expand the emergency power of FERC to regulate cybersecurity for the BPS. The act itself does not appear to be strongly opposed by the energy industry but, as always, the devil is in the details.

Diablo Canyon Nuclear Power Plant photo by emdot.

The draft legislation is disputed on three major points: whether to include national security threats, disclosure of threat information, and a sunset provision.

FERC recommends wording that would make explicit the requirement to address national security threats. This seems an implicit and reasonable expectation that the people of the United States would have of the agency regulating the BPS, but the Energy Sector considers this too expansive a role. They argue that it might cause expensive requirements to be issued, such as stockpiling fuel.

The disclosure of threat information is a sore point. Here you can understand the pain of the industry in dealing with government intelligence agencies who would like to keep details of a threat sparse to preserve the source of that information. Unfortunately the government must preserve its sources while still providing enough information for the industry to react.

Both FERC and the Energy Sector agree on the idea of a sunset provision. The sunset provision in this case stipulates that so long as an order is implemented as a standard it should terminate one year after issuance unless renewed by the President or the Secretary of Energy. The issue is whether this sunset will include the orders to address existing problems (such as the Aurora vulnerability) in addition to orders issued for future vulnerabilities. FERC recommends that only future orders should be sunsetted while the Energy Sector recommends both current and future orders should be sunsetted.

One element which is not adequately addressed in this legislation is how FERC will build the capability to assess and manage cybersecurity issues for the BPS. What should be in place is a bipartite separation of duties between FERC and NIST, similar to what is in place with the dual OMB/NIST FISMA roles: FERC would oversee security while NIST would provide technical guidance on what security should be put in place. FERC does not have the experience in security frameworks or the in-depth expertise in SCADA security which is required for a cybersecurity initiative of this magnitude.

It is worth noting that the Energy Policy Act of 2005 (PDF link) established a process through which the North American Electric Reliability Corporation (NERC) was authorized to enforce cybersecurity in the Energy Sector. NERC had gone so far as to create Critical Infrastructure Protection (CIP) standards to include with their Reliability Standards and had presented them to FERC for approval by late 2007.

A review of the NERC CIP standards (CIP-001 through CIP-009) does not inspire confidence in NERC’s cybersecurity capabilities. I will discuss the shortcomings of this guidance in a subsequent post.



Posted in What Doesn't Work | 3 Comments »

Some Words From a FAR

Posted September 9th, 2008 by

FAR: it’s the Federal Acquisition Regulation, and it covers all the buying that the government does.  For contractors, the FAR is a big deal–violate it and you end up blackballed from Government contracts or having to pay back money to your customer, either of which is a very bad thing.

In early August, OMB issued Memo 08-22 (standard .pdf caveat blah blah blah) which gave some of the administrivia about how they want to manage FDCC–how to report it in your FISMA report, what is and isn’t a desktop, and a rough outline on how to validate your level of compliance.

Now I have mixed feelings about FDCC, you all should know that by now, but I think the Government actually did a decent thing here–they added FDCC (and any other NIST secure configuration checklists) to the FAR.

Check this section of 08-22 out:

On February 28, 2008, revised Part 39 of the Federal Acquisition Regulation (FAR) was published which reads:
PART 39-ACQUISITION OF INFORMATION TECHNOLOGY
1. The authority citation for 48 CFR part 39 continues to read as follows: Authority: 40 U.S.C. 121(c); 10 U.S.C. chapter 137; and 42 U.S.C. 2473(c).
2. Amend section 39.101 by revising paragraph (d) to read as follows:
39.101 Policy.
* * * * *

(d) In acquiring information technology, agencies shall include the appropriate IT security policies and requirements, including use of common security configurations available from the NIST’s website at http://checklists.nist.gov. Agency contracting officers should consult with the requiring official to ensure the appropriate standards are incorporated.

Translated into English, what this means is that the NIST configuration checklists are coded into law for Government IT purchases.

This carries a HUGE impact to both the Government and contractors.  For the Government, they just outsourced part of their security to Dell and HP, whether they know it or not.  For the desktop manufacturers, they just signed up to learn how FDCC works if they want some of the Government’s money. 

Remember back in the halcyon days of FDCC when I predicted that one of the critical keys to success for FDCC was to be able to buy OEM desktops with the FDCC images on them?  It’s slowly becoming a reality.

Oh what’s that, you don’t sell desktops?  Well, this applies to all NIST configuration checklists, so as NIST adds to the intellectual property in the checklists program, you get to play too.  Looking at the DISA STIGs as a model, you might end up with a checklist for literally everything.

So as somebody who has no relation to the US Federal Government, you must be asking by now how you can ride the FDCC wave.  Here’s Rybolov’s plan for secure desktop world domination:

  • Wait for the government to attain 60-80% FDCC implementation
  • Wait for desktops to have an FDCC option for installed OS
  • Review your core applications on the FDCC compatibility list
  • Adopt FDCC as your desktop hardening standard
  • Buy your desktop hardware with the image pre-loaded
  • The FDCC configuration rolls uphill to be the default OS that they sell
  • ?????
  • Profit!

And the Government security trickle-down effect keeps rolling on….

Cynically, you could say that the OMB memos as of late (FDCC, DNSSEC) are very well coached and that OMB doesn’t know anything about IT, much less IT security.  You probably would be right, but seriously, OMB doesn’t get paid to know IT, they get paid to manage and budget, and in this case I see some sound public policy by asking the people who do know what they’re talking about.

While we have our cynical hats on, we might as well give a nod to those FISMA naysayers who have been complaining for years that the law wasn’t technical/specific enough.  Now we have very static checklists, and the power to decide what a secure configuration should be has been taken out of the hands of the techies who would know and given to research organizations and bureaucratic organizations who have no vested interest in making your gear work.

Lighthouse From AFAR photo by Kamoteus.



Posted in FISMA, NIST, What Doesn't Work, What Works | 8 Comments »

Government Pre-Election Slowdown has Started

Posted September 9th, 2008 by

Signs of the pre-election slowdown are around us, and I’m definitely starting to feel it.

For those of you outside the beltway, it breaks down like this:  people aren’t willing to make any long-term decisions or start any long-term projects because those decisions will be overruled in a couple of months after the elections and as election platforms meet reality.  Typically this happens once most of the political appointees are in place, and I have a feeling that early 2009 is going to be much fun, no matter who wins the presidency.

Now when the current president took charge of the executive branch, he issued a 5-point plan called the President’s Management Agenda.  You can check out the PMA on the OMB website.  And yes, E-Government is one of the 5.  You can expect something similar under the new administration.

As a parting shot, you know it’s a slowdown when you see contracts that will be awarded in November but the work doesn’t start until April.  =)

 

Lame Ducks Frozen in the Ice photo by digitalART2.



Posted in Odds-n-Sods | 1 Comment »
