Analyzing Fortify’s Plan to “Fix” the Government’s Security Problem

Posted April 1st, 2009 by

So I like reading about what people think about security and the Government.  I know, you’re all surprised, so cue shock and awe amongst my reader population.

Anyway, this week it’s Fortify and a well-placed article in NextGov.  You remember Fortify–they’re the guys with the cool FUD movie about how code scanning is going to save the world.  And oh yeah, there was this gem from SC Magazine: “Fortify’s Rachwald agrees that FISMA isn’t going anywhere, especially with the support of the paper shufflers. ‘It’s been great for people who know how to fill out forms. Why would they want it to go away?'”  OK, so far my opinion has been partially tainted–somehow I think I’m supposed to take something here personally, but I’m not sure exactly what.

Fortify has been trying to step up to the Government feed trough over the past year or so.  In a rare moment of being touchy-feely intuitive, from their marketing I get the feeling that Fortify is a bunch of Silicon Valley technologists who think they know what’s best for DC–digital carpetbagging.  Nothing new, all y’alls been doing this for as long as I’ve been working with the Government.

Now don’t get me wrong, I think Fortify makes some good products.  I think that universal adoption of code scanning, while not as foolproof as advertised, is a good thing.  I also think that software vendors should use scanning tools as part of their testing and QA.

Fortified cité of Carcassonne photo by http2007.

Now for a couple basic points that I want to get across:

  • Security is not a differentiator between competing products unless you’re in the classified world. People buy IT products based on features, not security.
  • The IT industry is a broken market because there is no incentive to sell secure code.
  • In fact, software vendors are often rewarded by the market: arrive first with the largest market penetration and you become the de facto standard.
  • The vendors are abstracted from the problems faced by their customers thanks to the terms of most EULAs–they don’t really have to fix security problems since the software is sold with no guarantees.
  • The Government is dependent upon the private sector to provide it with secure software.
  • It is a conflict of interest for the vendors to accurately represent their flaws unless the Government is going to pay to have them fixed.
  • It’s been proposed numerous times that the Government use its “huge” IT budget to require vendors to sell secure products.
  • How do you determine that a vendor is shipping a secure product?

Or more to the point, how do I as a software vendor reasonably demonstrate that I have provided a secure product to the Government without making the economics infeasible for smaller vendors, creating an industry of certifiers a la PCI-DSS and SOX, or dramatically lengthening my development/procurement schedules?  Think of the problems with Common Criteria, because that’s our previous attempt.

We run into this problem all the time in Government IT security, but it’s mostly at the system integrator level.  It’s highly problematic to make contract requirements that are objective, demonstrable, and testable yet still take into account threats and vulnerabilities that do not exist today.

I’ve spent the past month writing a security requirements document for integrated special-purpose devices sold to the Government.  Part of this exercise was the realization that I can require that the vendor perform vulnerability scanning, but it becomes extremely difficult to build any amount of common sense into the requirements when it comes to deciding what to fix.  “That depends” keeps coming back to bite me in the buttocks time and time again.  At this point, I usually tell my boss how I hate security folks, self included, because of their indecisiveness.
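To make that concrete, here is a minimal sketch (hypothetical numbers and field names, not from any actual requirements document) of what a seemingly objective “fix everything above severity X” rule looks like in code, and why “that depends” creeps right back in:

```python
# Hypothetical triage rule: the kind of objective, testable requirement you can
# write into a contract. Severity score alone decides what must be fixed.
CVSS_FIX_THRESHOLD = 7.0  # arbitrary cut line written into the requirement

def must_fix(finding):
    """Naive contract rule: remediate anything at or above the threshold."""
    return finding["cvss"] >= CVSS_FIX_THRESHOLD

# Where common sense wants back in: the same score means different things
# depending on exposure, compensating controls, and mission impact.
findings = [
    {"id": "F-1", "cvss": 7.5, "internet_facing": False, "mitigated": True},
    {"id": "F-2", "cvss": 6.8, "internet_facing": True,  "mitigated": False},
]

for f in findings:
    # The rule demands a fix for F-1 (walled off and mitigated) and happily
    # accepts F-2 (wide open) -- exactly the "that depends" problem.
    print(f["id"], "fix" if must_fix(f) else "accept")
```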

The end result is that I can specify a process (Common Criteria for software/hardware, Certification and Accreditation for integration projects) and an outcome (certification, product acceptance, “go live” authorization), leave the decision-making authority with the Government, and put it in the hands of contracts officers and subject-matter experts who know how to manage security.  Problems with this technique:

  • I can’t find enough contracts officers who are security experts.
  • As a contractor, how do I account for the costs I’m going to incur since it’s apparently “at the whim of the Government”?
  • I have to apply this “across the board” to all my suppliers due to procurement law.  This might not be possible right now for some kinds of outsourced development.
  • We haven’t really solved the problem of defining what constitutes a secure product.
  • We’ve just deferred the problem from a strategic solution to a tactical process depending on a handful of clueful people.

Honestly, though, I think that’s as good as we’re going to get.  Ours is not a perfect world.

And as for Fortify?  Guys, quit trying to insult the people who will ultimately recommend your product.  It’s bad mojo, especially in a town where the toes you step on today may be attached to the butt you kiss tomorrow.  =)



Posted in Outsourcing, Technical, What Doesn't Work, What Works | 2 Comments »

LOLCATS: Defending our Cyber-Turf

Posted March 26th, 2009 by

Yeah, it’s old but way too good to pass up.  According to Congressional testimony from some industry experts, the US needs to defend its “Cyber-Turf” and so we have today’s IKANHAZFIZMA:

 




Posted in IKANHAZFIZMA | No Comments »

Bringing You Only the Best in Security Network Diagrams

Posted March 24th, 2009 by

“Drawn” by an infosec engineer known simply as “TomBot” and passed down in email for years.  Click the diagram to get a bigger version.

Network Diagram by TomBot.



Posted in BSOFH, Technical, The Guerilla CISO, What Doesn't Work | 5 Comments »

Ed Bellis’s Little SCAP Project

Posted March 19th, 2009 by

So way back in the halcyon days of 2008, Dan Philpott, Chris Burton, Ian Charters, and I went to the NIST SCAP Conference.  Just by a strange coincidence, Ed Bellis threw out a tweet along the lines of “wow, I wish there was a way to import and export all this vulnerability data” and I replied back with “Um, you mean like SCAP?”

Fast forward 6 months.  Ed Bellis has been busy.  He delivered this presentation at SnowFROC 2009 in Denver:

So some ideas I have about what Ed is doing:

#1 This vulnerability correlation and automation should be part of vulnerability assessment (VA) products.  In fact, most VA products include some kind of ticketing and workflow nowadays if you get the “enterprise edition”. That’s nice, but…

#2 The VA industry is a broken market when it comes to workflow compatibility.  Everybody wants to sell you *their* product to be the authoritative manager. That’s cool and all, but what I really need is the connectors to your competitors’ products so that I can have one database of vulnerabilities, one set of charts to show my auditors, and one trouble ticket system. SCAP helps here, but only for static, bulk data transfers–that gets ugly really quickly.
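As a rough illustration of what those connectors and the “one database” would look like, here is a minimal sketch; the export formats and field names below are invented, not any vendor’s real schema:

```python
# Hypothetical connector sketch: map each scanner's export into one common
# record so a single database, chart set, and ticket queue can sit on top.
# Both input layouts below are made up for illustration.

def from_scanner_a(row):
    # Pretend scanner A exports {"cve": ..., "host": ..., "sev": 1-5}
    return {"cve": row.get("cve"), "host": row["host"], "severity": row["sev"] * 2.0}

def from_scanner_b(row):
    # Pretend scanner B exports {"cve_id": ..., "ip": ..., "cvss": 0.0-10.0}
    return {"cve": row.get("cve_id"), "host": row["ip"], "severity": row["cvss"]}

unified = [
    from_scanner_a({"cve": "CVE-2008-4250", "host": "10.1.1.5", "sev": 5}),
    from_scanner_b({"cve_id": "CVE-2008-4250", "ip": "10.1.1.5", "cvss": 10.0}),
]

# The same CVE reported on the same host by two products collapses to one row.
deduped = {(f["cve"], f["host"]): f for f in unified}
print(list(deduped.values()))
```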

#3 Ed’s correlation and automation software is a perfect community project because it’s a conflict of interest for any VA vendor to write it themselves. And to be honest, I wouldn’t be surprised if there are a dozen skunkworks projects that people will admit to creating just in the comments section of this post. I remember 5 years ago trying to hack together some perl to take the output from the DISA SRR Scripts and aggregate it into a .csv.
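In the same spirit as that old perl hack, here is a small Python sketch of the flatten-everything-into-one-spreadsheet idea; the input format is simplified and made up, since real SRR output takes real parsing:

```python
# Sketch of aggregating per-host scan results into a single .csv.
# Assumes a simplified results format of "FINDING_ID|STATUS|DESCRIPTION" per
# line, one file per host -- an invented stand-in for real scanner output.
import csv
import glob

def parse_results(path):
    with open(path) as fh:
        for line in fh:
            finding_id, status, description = line.rstrip("\n").split("|", 2)
            yield {"host": path, "finding": finding_id,
                   "status": status, "description": description}

with open("aggregate.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["host", "finding", "status", "description"])
    writer.writeheader()
    for results_file in glob.glob("results/*.txt"):
        for row in parse_results(results_file):
            writer.writerow(row)
```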

#4 The web application security world needs to adopt SCAP. So far it’s just been the OS and shrinkwrapped application vendors and the whole race to detection and patching. Now the interesting part to me is that the market is built around tying vulnerabilities to specific versions of software and a patch, whereas in the web application world it’s more along the lines of one-off misconfigurations and coding errors. It takes a little bit of a mindshift in the vulnerability world, but that’s OK in my book.
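To show the mindshift, here is a hedged sketch contrasting the two worlds: a traditional finding keyed to a product and version (a CPE-style name) versus a web-app finding that is a one-off coding error with no version and no patch. The record layouts and the application URL are made up for illustration:

```python
# Shrinkwrap world: the vulnerability is identified by product + exact version
# (a CPE 2.2-style URI), and remediation is "apply the vendor's patch."
shrinkwrap_finding = {
    "cpe": "cpe:/a:apache:http_server:2.2.11",   # product and version do the work
    "remediation": "upgrade to the patched release",
}

# Web application world: there is usually no version and no patch to point at;
# the finding is a one-off coding or configuration error in a specific handler.
webapp_finding = {
    "url": "https://app.example.gov/search",      # hypothetical application
    "parameter": "q",
    "weakness": "CWE-79 (cross-site scripting)",
    "remediation": "fix the output encoding in this one page",
}

for finding in (shrinkwrap_finding, webapp_finding):
    print(finding)
```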

#5 This solution is exactly what the Government needs and is exactly why SCAP was created. Imagine you’re the Federal Government with 3.5 million desktops: the only way you can manage all of those is through VA automation and a tool that aggregates information from various VA products across multiple zones of trust, environments, and even organizations.

#6 Help Ed out! We need this.



Posted in Technical, What Works | 4 Comments »

Cyber Security coming to a boil

Posted March 16th, 2009 by

During his campaign, then-candidate Obama promised he would “make cyber-security the top priority that it should be in the 21st century. I’ll declare our cyber-infrastructure a strategic asset, and appoint a national cyber-adviser, who will report directly to me.” Since Obama was elected, there has been a great deal of speculation as to what real-life changes in direction and policy that promise would bring.

Last month, President Obama appointed Melissa Hathaway to be a Senior Director of the National Security Council. She immediately launched a 60-day review of the security of Federal IT systems. There is much speculation that at the end of that review she will be appointed the National Cyber Advisor–the so-called Cyber Security Czar.

Just this week, the Director of the National Cyber Security Center over at the Department of Homeland Security, Rod A. Beckstrom, resigned. Press reports of Beckstrom’s resignation indicate some frustration on his part, aimed primarily at the National Security Agency (NSA). Beckstrom suggests that the NSA has been subverting his efforts to coordinate cyber security across the intelligence community.

A good friend of mine has suggested that the resignation is simply political and an artifact of the transition from one administration to another. He further suggests that it signals a shift in cyber security leadership away from the civilian agencies, with the Intelligence Community taking its turn at the helm. I think he may be right, too. However, I think there is more history here than just a shift in policy from one administration to another.

In my opinion, this isn’t just about politics. There are two drivers for this move. First, Congress and the administration recognize that the ongoing assault on government and commercial networks is a national security issue and an economic security and competitiveness issue too. In today’s economic droop, people often forget that two of our greatest economic strengths are our accumulated intellectual property and our hard-working human capital. Both of these assets are discounted when criminal and state-sponsored groups successfully attack our nation’s IT infrastructure. Recognizing this is a good thing. I’m not going to recount the long history of cyber assault on Federal IT systems by international cyber criminals and “state-sponsored entities”; facts and figures concerning this ongoing assault and the damage associated with it are just a Google search away.

The second driver for a policy shift is that Congress and the administration recognize that the FBI/Justice/DHS approach to cyber security is an utter failure. This failed approach sees cyber security as a criminal problem, with industry participating in its own defense on a ‘voluntary’ basis. It has led to comical activities such as an FBI delegation going to Moscow, hat in hand, asking the Russians for help in tracking down successful cyber organized crime groups based in Russia. The fact that these groups may have had strong official or unofficial connections with the Russian government should have given the FBI an indication of the lack of cooperation they would face–I believe in Law Enforcement circles this is usually called a “clue”. Likewise, FBI delegations trying to track down Russian cyber attackers who may have had some direct level of state support were equally unproductive. To be fair, the FBI was placed in an impossible position when it was asked to organize delegations like this.

So that kind of sums up the civilian or “law enforcement” approach toward national cyber security.

That leaves us to consider the much-discussed alternative, specifically a shift in policy toward giving the Intelligence Community leadership in providing cyber national security. There have been attempts in the past to give the Intelligence Community greater responsibility for cyber security, but while the Intelligence Community seemed to have the technical resources to address these responsibilities, it was often confused by the mission and hampered by legislation and culture. By temperament, the Intelligence Community is about collection and analysis of information. Once you start asking them to do something about a situation that they have studied or understand well, you are asking them not just to change their mission but also to act against the very culture that made them successful. To understand a situation, the Intelligence Community works quietly, secretly, and in the shadows. To take action, they have to emerge from the shadows and act very publicly. This transition can be difficult and even disastrous. Such transitions can give you the Bay of Pigs, non-judicial detention at Gitmo, and odd-ball assassinations–all sorts of activities that people hate because the actions themselves were not “peer-reviewed” as best security practices.

It’s not that the Intelligence Community is incompetent (well, everyone makes mistakes or hides them); it’s just that the transition from intelligence and information collection to public coordination and policy leadership, with all of the very public meetings, policy reviews, and planning, drives the Intelligence Community from a position of strength and expertise onto new ground. Unfortunately, another strong element of the culture of the Intelligence Community is that if the President calls, “they haul…” They just can’t bring themselves to say no, even if it’s a bad idea.

That brings us to the question: who should be responsible for cyber security? Well, every government agency wants the mission because of the funding that goes with it. But it’s not clear who has the right perspective and culture. I suspect that the right answer is to combine the experience and technical know-how of several agencies and to develop some new capabilities. This means that leadership of the effort has to be unambiguous. That is precisely why I believe the Obama Administration will keep the leadership of its new approach to Cyber Security right inside the White House itself. That really shouldn’t be a surprise, since that is exactly what Obama said he would do as a candidate.

Enigma Machines Collection at the National Cryptologic Museum photo by brewbooks.



Posted in Public Policy, Rants, What Doesn't Work, What Works | 6 Comments »

Certification and Accreditation Seminar, March 30th and 31st

Posted March 13th, 2009 by

We’ve got another good US Government Security Certification and Accreditation (C&A) Seminar/Workshop coming up at the end of March with Potomac Forum.

Graydon McKee (Ascension Risk Management and associated blog) and Dan Philpott (Fismapedia Mastermind and Guerilla-CISO Contributor) are going to deliver the core of the instruction, with a couple of others thrown in to round it all out.  I might stop by if I have the time.

What we promise:

  • An opportunity to hear NIST’s version of events and what they’re trying to accomplish
  • An opportunity to ask as many questions as you possibly can in 2 days
  • Good materials, well put together
  • An update on some of the recent security initiatives
  • An opportunity to commiserate with security folks from other agencies and contractors
  • No sales pitches and no products

See you all there!



Posted in FISMA, NIST, Speaking | No Comments »
