Where For Art Thou, 60-Day Review

Posted May 7th, 2009

April Fools’ Day pranks aside, I’m wondering what happened to the 60-day Cybersecurity Review.  Supposedly, it was turned in to the President on the 17th.  I guess all I can do is sigh and say “So much for transparency in Government”.

I’m trying hard to be understanding here, I really am.  But isn’t the administration pulling the same Comprehensive National Cybersecurity Initiative thing again, telling the professionals out in the private sector that it depends on, “You can’t handle the truth!”?

And this is the problem.  Let’s face it, our information sharing from Government to private sector really sucks right now.  I understand why this is–when you have threats and intentions that come from classified sources, if you share that information, you risk losing your sources.  (ref: Ultra and  Coventry, although it’s semi-controversial)

Secret Passage photo by electricinca.

Looking back at some of the weaknesses of our information-sharing strategy so far:

  • Most of the critical infrastructure is owned and operated by the private sector.  Government (and the nation at large) depends on these guys and the resilience of the IT that they own and operate.
  • The private sector (or at least critical infrastructure owners and operators) need the information to protect their infrastructure.
  • Our process for clearing someone to receive sensitive information is to do a criminal records investigation, pull a credit report, and talk to a handful of their friends to find out who they really are.  It takes 6-18 months.  This is not quick.
  • We have some information-sharing going on.  HSIN and Infragard are pretty good so far–we give you a background check and some SBU-type information.  Problem is that they don’t have enough uptake out in the security industry.  If you make/sell security products and services for Government and critical infrastructure, you owe it to yourself to be part of these.
  • I’ve heard people from Silicon Valley talk about how the Government doesn’t listen to them and that they have good ideas.  Yes they do have some ideas, but they’re detached from the true needs because they don’t have the information that they need to build the right products and services, so all they can do is align themselves with compliance frameworks and wonder why the Government doesn’t buy their kit.  It’s epic fail on a macromarket scale.

In my opinion, Government can’t figure out if they are a partner or a regulator.  Choose wisely, it’s hard to be both.

As a regulator, we just establish the standard and, in theory anyway, the private sector folks don’t need to know the reasoning behind the standard.   It’s fairly easy to manage but not very flexible–you don’t get much innovation and new technology if people don’t understand the business case.  This is also a more traditional role for Government to take.

As a partner, we can share information and consequences with the private sector.  It’s more flexible in response but takes much more effort and money to bring them information.  It also takes participation from both sides–Government and private sector.

Now to tie it all off by going back to the 60-Day Cybersecurity Review….  The private sector needs information contained in the review.  Not all of it, mind you, just the parts that they need to do their job.  They need it to help the Government.  They need it to build products that fit the Government’s needs.  They need it to secure their own infrastructure.



Posted in Public Policy, Risk Management | 3 Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 5

Posted May 4th, 2009

Rybolov Note: this is part 5 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part three here.  Go read part four here.

Themes: I’ve read this thing back and forth, and one theme emerges overall: we’ve talked for the better part of a decade about what it’s going to take to “solve” this problem that is IT security, from an internal Federal Government standpoint, from a military-industrial complex standpoint, from a state and local government standpoint, from a private-sector standpoint, and from an end-user standpoint.  This bill takes some of the best thought on the issue, wraps it all up, and presents it as “if you want to get the job done, this is the way to do it”.

Missing: The role of DHS.  Commerce is highly represented, over-represented to my mind.  Looking at the pieces of who owns what:

Commerce security organizations:

NTIA–Technically not a security organization, but they manage the DNS root and set telecom policy.

NIST–They write the standards for security.

FTC–They regulate trade and have oversight over business fraud.

DHS Security organizations:

NPPD–They are responsible for critical infrastructure and national risk management.

NCSD–They do the security operations side of our national cybersecurity strategy and run US-CERT. (BTW, hi guys!)

Secret Service–They have the primary responsibility of protecting the US currency, which also includes computer crimes against financial infrastructure.

Science and Technology Directorate–They are responsible for research and development, including IT security.

DOJ Security Organizations:

FBI–Surprise, they do investigations.

So you see, some of the things that are tasked to Commerce are done by DHS and DOJ.  This is probably just the nature of the bill: it was introduced in the Commerce committee, so it’s understandable that it would be Commerce-centric.

Cost: One thing kept nagging me in the back of my head while going through this bill: the cost to do everything.  We’re asking to do a lot in this bill, so what’s the total cost?  Typically, when a bill makes it out of committee, the Congressional Budget Office attaches a price to the legislation: the total cost and the breakdown for the average American household.  That data isn’t published yet on the bill’s page, so we’ll see in the next iteration.

In-Your-Face Politics: Really, this bill is showing us how to do the full security piece.  It includes everything.  It’s challenging people to come up with alternatives.  It’s challenging people to delete the sections that don’t make sense.  It’s challenging people to fix the scope issues.  Like it or hate it, it definitely stirs up debate.

Final Thoughts: S.773 is a pretty decent bill.  It has some warts that need to be fixed, but overall it’s a pretty positive step.

Capitol photo by bigmikesndtech.



Posted in Public Policy | No Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 4

Posted May 1st, 2009

Rybolov Note: this is part 4 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part three here.  Go read part 5 here. =)

SEC. 18. CYBERSECURITY RESPONSIBILITIES AND AUTHORITY. This section needs to be reviewed line-by-line because it’s dense:

“The President–

(1) within 1 year after the date of enactment of this Act, shall develop and implement a comprehensive national cybersecurity strategy, which shall include–

(A) a long-term vision of the Nation’s cybersecurity future; and

(B) a plan that encompasses all aspects of national security, including the participation of the private sector, including critical infrastructure operators and managers;”

OK, fair enough, this calls for a cybersecurity strategy that includes the agencies and critical infrastructure.  Most of that is in-play already and has overlap with some other sections.

(2) may declare a cybersecurity emergency and order the limitation or shutdown of Internet traffic to and from any compromised Federal Government or United States critical infrastructure information system or network;

Declaring an emergency is already a Presidential function for natural disasters, so this makes sense, except that it militarizes cybersecurity and indirectly gives the President the authority here to declare a cyberwar, depending on how you interpret this paragraph.

The cutoff authority has gotten much talk.  This part pertains only to Government systems and critical infrastructure.  Note that the criterion here is that the part being cut off has to have been compromised, which makes more sense.  The part that I’m worried about is when we preemptively cut off the network in anticipation of pwnage.

(3) shall designate an agency to be responsible for coordinating the response and restoration of any Federal Government or United States critical infrastructure information system or network affected by a cybersecurity emergency declaration under paragraph (2);

This is interesting to me because it leaves the designation up to the President.  Remember, we have all this debate as to who should “own” cybersecurity: DHS, DoD, NSA, FBI, and even Commerce have been proposed here.  I don’t think Congress should leave this designation to the President–it needs to be decided before an incident so that we don’t fight over jurisdiction issues during the incident.  Ref: Cyber-Katrina.

(4) shall, through the appropriate department or agency, review equipment that would be needed after a cybersecurity attack and develop a strategy for the acquisition, storage, and periodic replacement of such equipment;

This is good.  What it means is stockpiling or contracting for equipment in advance of an attack… think DDoS response teams and you have a pretty good idea.  And hey, this also works for disaster recovery; I’ve never understood why we don’t manage some DR at the national level.  GSA, are you paying attention here?

(5) shall direct the periodic mapping of Federal Government and United States critical infrastructure information systems or networks, and shall develop metrics to measure the effectiveness of the mapping process;

Enumeration is good, depending on what we’re using the information for.  If you use it to beat up on the agency CISOs and the critical infrastructure owners/operators, then we have better things to spend our time doing.  If you use it to help people, then it’s worthwhile (Ref: security metrics, architecture support, Federal Enterprise Architecture).  I also have a problem with this because you can map vulnerabilities, but how do you get the information to the right people who can fix them?

(6) may order the disconnection of any Federal Government or United States critical infrastructure information systems or networks in the interest of national security;

OK, this gives the President authority over private networks.  And fo-shizzle, I thought the President already had disconnect authority over Government networks.  If I was an owner of critical infrastructure I would be sh*tting bricks here because this means that the President has disconnect authority for my gear and doesn’t have to give me an answer on why or a remediation plan to get it turned back on–Ref: National Security Letter.  I think we need the disconnect authority, but there has to be some way for people to get turned back on.

(7) shall, through the Office of Science and Technology Policy, direct an annual review of all Federal cyber technology research and development investments;

Good stuff, I would be surprised if this isn’t happening already, what with Congress providing the budget for cyber technology research.

(8) may delegate original classification authority to the appropriate Federal official for the purposes of improving the Nation’s cybersecurity posture;

This paragraph is interesting, mostly because it could go any number of ways.  If we get a Cybersecurity Advisor, this authority will most likely be delegated to them, meaning that they get the authority to determine what’s national security information.  This also works in conjunction with quite a few sections of the bill, including all the information-sharing initiatives and paragraph 6 above.

(9) shall, through the appropriate department or agency, promulgate rules for Federal professional responsibilities regarding cybersecurity, and shall provide to the Congress an annual report on Federal agency compliance with those rules;

I had to read this paragraph a couple of times.  Really what I think we’re doing is establishing a case for agency executives to be found negligent in their duty if they do not ensure security inside their agency–think CEO liability for negligence.

(10) shall withhold additional compensation, direct corrective action for Federal personnel, or terminate a Federal contract in violation of Federal rules, and shall report any such action to the Congress in an unclassified format within 48 hours after taking any such action; and

There are 2 parts to this paragraph: Federal personnel and contractors.  This is the sanctions part of the legislation.  Note that there is not a penalty and/or authority for anybody outside of Government.  The problem with this is that proving negligence is very hard in the security world.  Combined with Paragraph 9, this is a good combination provided that the professional responsibilities are written correctly.  I still think this has room for abuse because of scoping problems–we already have rules for sanctioning people (personnel law) and contracts (cure notices, Federal Acquisition Regulations), only they haven’t had many teeth up to this point because it’s hard to prove negligence.

(11) shall notify the Congress within 48 hours after providing a cyber-related certification of legality to a United States person.

I had to search around for a description here.  I found some people who said this paragraph pertained to the certification of professionals as in section 7.  This is wrong.  Basically, what happens is that the Department of Justice issues a “certification of legality” when somebody (usually inside the Government) asks them if a certain act is legal to perform.  Think legal review for building a wiretap program: the President has to go to DoJ and ask them if the program is legal under existing laws.

What this paragraph really does is it institutes Congressional oversight on a “FYI-basis” over Executive Branch decisions on policy to keep them from overstepping their legal bounds.

Verdict: This section is all over the map.  Like most things in S.773, it has some scope issues but overall this section establishes tasks that you can expect the Cybersecurity Advisor or DHS under the Cybersecurity Advisor’s auspices to perform.

Capitol Rotunda photo by OakleyOriginals.

SEC. 19. QUADRENNIAL CYBER REVIEW. This section mandates a review of the cyberstrategy every 4 years.

Verdict: We’ve been doing this so far on an ad-hoc basis, might as well make it official.

SEC. 20. JOINT INTELLIGENCE THREAT ASSESSMENT. This section mandates an annual report on the bad guys and what they’re doing.  This is similar to the Congressional testimony we’ve seen so far on the subject.  If we’re going to expect Congress to make good public policy decisions, they need the information.

Verdict: OK, I don’t see much wrong with this as long as it’s done right and not abused by politics.

SEC. 21. INTERNATIONAL NORMS AND CYBERSECURITY DETERRANCE MEASURES. This section authorizes/mandates the President to cooperate with other countries about “cybersecurity stuff”.

Verdict: Not specific enough to mean anything.  If we keep this section, we need to enumerate specifically what we want the Executive Branch to do.

SEC. 22. FEDERAL SECURE PRODUCTS AND SERVICES ACQUISITIONS BOARD. This section creates a board to review large IT purchases.  Yes, that slows down the purchasing process horribly, as if it isn’t bad enough by itself.  Um, I thought we were supposed to do this with the Federal Enterprise Architecture.

Verdict: This is a macro-scale solution for a micro-scale problem.  Sorry, it doesn’t work for me.  Make FEA responsible for the macro-scale and push good, solid guidance down to the agencies for the micro-scale.  Replace this section with the NIST checklists program and a true security architecture model.



Posted in Public Policy | No Comments »

In Response to “Cyber Security Coming to a Boil” Comments….

Posted March 24th, 2009

Rybolov’s comment: This is Ian’s response to the comments on his post Cybersecurity Coming to a Boil.  It was such a good dialog that he wanted to write a large comment, which, as we all know, eventually transforms itself into a blog post. =)

You are making some excellent points; putting the leadership of the Administration’s new Cyber security initiative directly in the White House might appear to be a temporary solution or a quick fix. From my point of view, it looks more like an honest approach. By that I mean that I think the Administration is acknowledging a few things:

  • This is a significant problem
  • There is no coherent approach across the government
  • There is no clear leadership or authority to act on the issue across the government
  • Because of the perception that a large budget commitment will have to be allocated to any effective solution, many Agencies are claiming leadership or competing for leadership to scoop up those resources
  • The Administration does not know what the specific solution they are proposing is — YET

I think this last point is the most important and is driving the 60-day security assessment. I also think that assessment is much more complex than a simple review of FISMA scores for the past few years. I suspect that the 60-day review is also considering things like legal mandates and authorities for various aspects of Cyber security on a National level. If that is the case, I’m not familiar with a similar review ever having taken place.

2004 World Cyber Games photo by jurvetson.  Contrary to what the LiquidMatrix Security folks might think, the purpose of this post isn’t to jam “cyber” into every 5th word.  =)

So, where does this take us? Well, I think we will see the Cyber Security Czar propose a unified policy, a unified approach, and probably some basic enabling legislation. I suspect that this will mean that the Czar will have direct control over existing programs and resources. I think the Cyber Security Czar taking control of Cyber Security-related research programs will be one of the most visible first steps toward establishing central control.

From this we will see new organizational and reporting authorities that will span existing Agencies. I think we can also anticipate new policies put in place and a new set of guidelines mandating a minimum level of security capabilities for all Agency networks (raising bottom-line security). This last point will probably prove to be the most trying or contentious effort within the existing Agency structure. It is not clear how existing Agencies that are clearly underfunding or under-supporting Cyber Security will be assessed. It is even less clear where remedial funding or personnel positions will come from. And the stickiest point of all is…. how do you reform the leadership and policy in those Agencies to positively change their security culture? I noticed that someone used the C-word in response to my initial comments. This goes way beyond compliance. In the case of some Federal Agencies and perhaps some industries, we may be talking about a complete sea-change with respect to the emphasis and priority given to Cyber Security.

These are all difficult issues. And I believe the Administration will address them one step at a time.
In the long-term it is less clear how Cyber Security will be managed. The so-called war on drugs has been managed by central authority directly from the White House for decades. And to be sure, to put a working national system together that protects our Government and critical national infrastructure from Cyber attack will probably take a similar level of effort and perhaps require a similar long-term commitment. Let’s just hope that it is better thought-out and more effective than the so-called war on drugs.

Vlad’s point concerning the Intelligence Community taking the lead with respect to Cyber Security is an interesting one; I think the Intelligence Community will be important players in this new initiative. To be frank, between the Defense and Intelligence Communities there is considerable technical expertise that will be sorely needed. However, for legal reasons, there are real limits as to what the Intelligence and Defense Communities can do in many situations. This is a parallel problem to Cyber Security as a Law Enforcement problem. The “solution” will clearly involve a variety of players, each with their own expertise and authorities. And while I am not anticipating that Tom Clancy will be appointed the Cyber Security Czar any time soon, I do expect that a long-term approach will require the stand-up of either a new organization empowered to act across current legal boundaries (that will require new legislation), or a new coordinating organization like the Counter Terrorism Center that will allow all of the current players to bring their individual strengths and authorities to focus on a situation on a case-by-case basis as they are needed (that may require new legislation).

If you press me, I think a joint coordinating body will be the preferred choice of the Administration. Everyone likes the idea of everyone working and playing well together. And, that option also sounds a lot less expensive. And that is important in today’s economic climate.



Posted in FISMA, Public Policy, Technical | 2 Comments »

Ed Bellis’s Little SCAP Project

Posted March 19th, 2009

So way back in the halcyon days of 2008, Dan Philpott, Chris Burton, Ian Charters, and I went to the NIST SCAP Conference.  Just by a strange coincidence, Ed Bellis threw out a twit along the lines of “wow, I wish there was a way to import and export all this vulnerability data” and I replied back with “Um, you mean like SCAP?”

Fast forward 6 months.  Ed Bellis has been busy.  He delivered this presentation at SnowFROC 2009 in Denver:

So some ideas I have about what Ed is doing:

#1 This vulnerability correlation and automation should be part of vulnerability assessment (VA) products.  In fact, most VA products include some kind of ticketing and workflow nowadays if you get the “enterprise edition”. That’s nice, but…

#2 The VA industry is a broken market when it comes to workflow compatibility.  Everybody wants to sell you *their* product to be the authoritative manager. That’s cool and all, but what I really need is the connectors to your competitors’ products so that I can have one database of vulnerabilities, one set of charts to show my auditors, and one trouble ticket system. SCAP helps here but only for static, bulk data transfers–that gets ugly really quickly. (There’s a rough sketch of what that one database looks like after this list.)

#3 Ed’s correlation and automation software is a perfect community project because it’s a conflict of interest for any VA vendor to write it themselves. And to be honest, I wouldn’t be surprised if there are a dozen skunkworks projects that people will admit to creating just in the comments section of this post. I remember 5 years ago trying to hack together some perl to take the output from the DISA SRR Scripts and aggregate them into a .csv. (That hack is sketched below too, after this list.)

#4 The web application security world needs to adopt SCAP. So far it’s just been the OS and shrinkwrapped application vendors and the whole race to detection and patching. Now the interesting part to me is that the market is all about tying vulnerabilities to specific versions of software and a patch, whereas in the web application world it’s more along the lines of one-off misconfigurations and coding errors. It takes a little bit of a mindshift in the vulnerability world, but that’s OK in my book. (There’s a sketch of the two record shapes after this list.)

#5 This solution is exactly what the Government needs and is exactly why SCAP was created. Imagine you’re the Federal Government with 3.5 million desktops: the only way you can manage all of those is through VA automation and a tool that aggregates information from various VA products across multiple zones of trust, environments, and even organizations. (A sketch of that kind of roll-up is at the end of this list.)

#6 Help Ed out! We need this.
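
To make #2 a little more concrete, here’s a minimal sketch of what “one database of vulnerabilities” looks like in practice: each product’s export gets mapped into one common record shape and de-duplicated on host and CVE. The field names and severity scales below are invented for illustration; they aren’t any vendor’s actual export format.

```python
# Minimal sketch: normalize findings from two hypothetical scanner exports
# into one common record, then de-duplicate on (host, CVE).
# Field names and severity scales are invented, not real vendor formats.
from dataclasses import dataclass


@dataclass(frozen=True)
class Finding:
    host: str
    cve: str        # e.g. "CVE-2009-1234"
    severity: str   # normalized to low/medium/high
    source: str     # which VA product reported it


def from_vendor_a(row: dict) -> Finding:
    # Hypothetical vendor A reports severity as a 1-5 integer.
    sev = {1: "low", 2: "low", 3: "medium", 4: "high", 5: "high"}[row["risk"]]
    return Finding(row["ip"], row["cve_id"], sev, "vendor_a")


def from_vendor_b(row: dict) -> Finding:
    # Hypothetical vendor B already uses low/medium/high strings.
    return Finding(row["hostname"], row["cve"], row["severity"].lower(), "vendor_b")


def merge(*feeds: list) -> list:
    # Keep one record per (host, CVE), whichever feed reported it first.
    seen = {}
    for feed in feeds:
        for f in feed:
            seen.setdefault((f.host, f.cve), f)
    return list(seen.values())


# Usage: merge([from_vendor_a(r) for r in a_rows],
#              [from_vendor_b(r) for r in b_rows])
```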
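
And in the spirit of #3, the same kind of quick-and-dirty roll-up hack, only in Python instead of perl. The one-file-per-host naming and the “rule_id,status” line format are assumptions for the sketch, not the actual DISA SRR output format.

```python
# Sketch: roll a directory of per-host scan results up into one CSV.
# Assumes one *.txt file per host containing "rule_id,status" lines;
# the real SRR script output is messier than this.
import csv
from pathlib import Path


def aggregate(results_dir: str, out_csv: str) -> None:
    rows = []
    for report in Path(results_dir).glob("*.txt"):
        host = report.stem  # file name doubles as the host name (assumed)
        for line in report.read_text().splitlines():
            if "," not in line:
                continue  # skip blanks or malformed lines
            rule_id, status = line.split(",", 1)
            rows.append({"host": host,
                         "rule": rule_id.strip(),
                         "status": status.strip()})
    with open(out_csv, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["host", "rule", "status"])
        writer.writeheader()
        writer.writerows(rows)


# aggregate("srr_results/", "findings.csv")
```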
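
For #4, one way to picture the mindshift: an OS-style vulnerability record ties a CVE to a product version and a patch, while a web application finding is usually a weakness class (CWE) tied to a URL and a parameter, with a code or configuration fix instead of a patch. The record shapes below are mine for illustration, not anything out of the SCAP specifications.

```python
# Illustration only: two record shapes showing why web app findings don't
# map cleanly onto the CVE-plus-patch model. Field names are invented.
from dataclasses import dataclass
from typing import Optional


@dataclass
class OsVulnerability:
    cve: str             # "CVE-2009-1234"
    product: str         # "ExampleOS 5.2" -- version-specific
    patch: str           # vendor patch or advisory that fixes it


@dataclass
class WebAppFinding:
    cwe: str                  # weakness class, e.g. "CWE-79" (cross-site scripting)
    url: str                  # where the one-off flaw lives
    parameter: Optional[str]  # offending input, if any
    remediation: str          # a code or config change, not a downloadable patch
```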
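
And for #5, a sketch of the aggregation side: walk a pile of XCCDF result files and tally the result values per rule, which is the kind of roll-up a 3.5-million-desktop dashboard has to do. This assumes XCCDF 1.1-style TestResult documents; adjust the namespace for other versions of the spec.

```python
# Sketch: tally result values per XCCDF rule across a directory of result
# files. Assumes XCCDF 1.1 TestResult documents (namespace below); adjust
# for other XCCDF versions.
import xml.etree.ElementTree as ET
from collections import Counter
from pathlib import Path

XCCDF_NS = "http://checklists.nist.gov/xccdf/1.1"


def tally(results_dir: str) -> dict:
    counts = {}  # rule id -> Counter of result values (pass, fail, error, ...)
    for path in Path(results_dir).glob("*.xml"):
        root = ET.parse(path).getroot()
        for rr in root.iter(f"{{{XCCDF_NS}}}rule-result"):
            rule = rr.get("idref")
            result = rr.findtext(f"{{{XCCDF_NS}}}result")
            counts.setdefault(rule, Counter())[result] += 1
    return counts


# for rule, c in sorted(tally("xccdf_results/").items()):
#     print(rule, dict(c))
```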



Posted in Technical, What Works | 4 Comments »

The 10 CAG-egorically Wrong Ways to Introduce Standards

Posted February 20th, 2009

The Consensus Audit Guidelines (CAG) appear, at this point, to be a reasonable set of guidelines for mitigating some human threats. I’m looking forward to seeing what CAG offers and have no doubt there will be worthwhile and actionable controls in the document. That said, there are significant reasons to approach CAG with skepticism and assess it critically.

The motivation for CAG is described in a set of slides at the Gilligan Group site. It starts with a focus on what CIOs fear most: attacks, reduced operational capability, public criticism, data loss, etc. Then it rightly questions whether FISMA is adequately addressing those problems. It doesn’t, and this is the genesis of the CAG.

Consensus photo by Eirik Newth.

Unfortunately CAG subsequently develops by pairing this first valid premise with a set of false premises.  These propositions are drawn from slides at gilligangroupinc.com, attributed to John Gilligan or Alan Paller:

  1. All that matters are attacks. The central tenet of Bush’s Comprehensive National Cyber Initiative (CNCI) is adopted as the CAG theme: “Defense Must Be Informed by the Offense”. CAG envisions security as defense against penetration attacks. As any seasoned security practitioner knows, attacks are a limited subset of the threats to confidentiality, integrity and availability that information and information systems face.
  2. Security through obscurity. CAG seems to have taken the unspoken CNCI theme to heart too, “The most effective security is not exposed to public criticism.” Since its very public December 11th announcement no drafts have been made publicly available for comment.
  3. False dichotomy. CAG has been promoted as an alternative to the OMB/NIST approach to FISMA. It isn’t. An alternative would target a fuller range of threats to information and information system security. CAG should be considered a complement to NIST guidance, an addendum of security controls focused on defense against penetration by hackers. NIST has even acted on this approach by including some CAG controls into the 800-53 Rev. 3 catalog of controls.
  4. There is too much NIST guidance! This is the implication of one CAG slide that lists 1200 pages of guidance, 15 FIPS docs and the assorted Special Publications not related to FISMA as detriments to security. It’s like complaining that Wikipedia has too many articles to contribute to improved learning. Speaking as someone who scrambled to secure Federal systems before FISMA and NIST’s extensive guidance, having that documentation greatly improves my ability to efficiently and effectively secure systems.
  5. NIST guidance doesn’t tell me how to secure my systems! NIST’s FISMA guidance doesn’t step you through securing your SQL Server. The Chairman of the Joint Chiefs also doesn’t deliver your milk. Why not? It’s not their job. NIST’s FISMA guidance helps you to assess the risks to the system, decide how to secure it, secure it accordingly, check that a minimum of controls are in place and then accept responsibility for operating the system. NIST also provides documents, checklists, repositories, standards, working groups and validation of automated tools that help with the actual security implementation.
  6. Automated security controls negate human errors. With the premise that all threats are attacks, this is nearly plausible. But not all security is technical. Not all threats come from the Internet. DHS, NIST, Mitre, and their partners have pursued automated security controls to enforce and audit security controls for years, but automated security controls can only go so far. Human errors, glitches, unexpected conflicts and operational requirements will always factor into the implementation of security.
  7. Audit compatibility as a hallmark of good security. There is a conflict of focus at the heart of the CAG: it seeks to both improve its subset of security and improve audit compatibility. For technical controls this is somewhat achievable using automation, something NIST has pursued for years with government and industry partners. For operational and management controls it results in audit checklists. But audits are fundamentally concerned with testing the particular and repeatable, while security needs to focus on evaluating the whole to ensure the necessary results. An audit sees if antivirus software is installed; an evaluation sees if the antivirus software is effective.
  8. Metrics, but only these metrics over here. When selecting the current crop of CAG controls, decisions on what to include were reportedly based on metrics of the highest threats. Great idea: a quantitative approach often discovers counter-intuitive facts. Only the metrics were cherry-picked. Instead of looking at all realized threats or real threat impacts, only a count of common penetration attacks was considered.
  9. With a sample of 1. As a basis for determining what security should focus on the whole breadth of the security profession was queried, so long as they were penetration testers. Yes, penetration testers are some very smart and talented people but penetration testing is to security what HUMINT is to intelligence services. Important players, expert practitioners but limited in scope and best used in conjunction with other intelligence assets.
  10. Assessments rely on paper artifacts. The NIST guidance does not require paper artifacts. The first line in the NIST SP 800-53A preface is, “Security control assessments are not about checklists, simple pass-fail results, or generating paperwork to pass inspections or audits-rather, security controls assessments are the principal vehicle used to verify that the implementers and operators of information systems are meeting their stated security goals and objectives.” NIST SP 800-37 specifically and repeatedly states, “Security accreditation packages can be submitted in either paper or electronic format.”

CAG is a missed opportunity. Addressing the myriad problems with our current FISMA regime could do a lot of good. The problems with guidance have many causes but can be addressed through cooperative development of best practices outside of NIST. The Assessment Cases for SP 800-53A are an example of how cooperative development can achieve great results and provide clear guidance. Other problems exist and can be addressed with better training and community development.

My hope is that the Consensus Audit Guidelines will move towards a more open, collaborative development environment. The first release is sure to deliver useful security controls against penetration attacks. As with all good security practices it will likely need to go through a few iterations and lots of critical assessment to mature. An open environment would help foster a more complete consensus.

Consensus photo by mugley.



Posted in BSOFH, FISMA, Rants, Technical, What Doesn't Work, What Works | 9 Comments »
