Keeping The Lights On: Cybersecurity Law for the Electric Grid

Posted September 23rd, 2008

Ever wondered if your electricity supply was safe from computer attack? Congress wondered that too. So they asked the Federal Energy Regulatory Commission (FERC) to find out. The answers they received in October of 2007 were not encouraging.

After 9/11 there was concern about the safety of the Bulk Power System (BPS). The President's Commission on Critical Infrastructure Protection released a report that was explicit about the dangers faced. A frightening example of these dangers was demonstrated by the Aurora vulnerability, essentially a software attack that can make a generator destroy itself. Faced with this example, industry moved to mitigate the problem, with some prodding from the Department of Homeland Security (DHS), the Nuclear Regulatory Commission (NRC), and FERC. The Nuclear Sector, which is regulated by the NRC, was issued a requirement to address the problem. The Electric Sector was issued a recommendation to address the problem by the Electric Sector Information Sharing and Analysis Center (ES-ISAC). Guess which sector has moved forward with successful mitigation efforts and which has not. FERC reported back on these findings in October of 2007.

Fast forward to now. On September 11th the Bulk Power System Protection Act (BPSPA) of 2008 (PDF link) was put forward by Rep. Rick Boucher (D-VA), chairman of the House Subcommittee on Energy and Air Quality. In addition to the September 11th hearing on the BPSPA, a closed-door hearing was expected the following week. The goal of this legislation is to expand FERC's emergency powers to regulate cybersecurity for the BPS. The act itself does not appear to be strongly opposed by the energy industry, but, as always, the devil is in the details.

Diablo Canyon Nuclear Power Plant photo by emdot.

The draft legislation is disputed on three major points: whether to include national security threats, how threat information is disclosed, and whether to include a sunset provision.

FERC recommends wording that would make explicit the requirement to address national security threats. This seems an implicit and reasonable expectation that the people of the United States would have of the agency regulating the BPS, but the Energy Sector considers it too expansive a role, arguing that it could lead to expensive requirements such as stockpiling fuel.

The disclosure of threat information is a sore point. Here you can understand the industry's pain in dealing with government intelligence agencies, which prefer to release only sparse details of a threat in order to protect the source of that information. Unfortunately, the government must protect its sources while still providing enough information for the industry to react.

Both FERC and the Energy Sector agree on the idea of a sunset provision. The sunset provision in this case stipulates that, so long as an order is implemented as a standard, it should terminate one year after issuance unless renewed by the President or the Secretary of Energy. The issue is whether this sunset will cover the orders that address existing problems (such as the Aurora vulnerability) in addition to orders issued for future vulnerabilities. FERC recommends that only future orders be subject to the sunset, while the Energy Sector recommends that both current and future orders be subject to it.

One element that is not adequately addressed in this legislation is how FERC will build the capability to assess and manage cybersecurity issues for the BPS. What should be in place is a bipartite separation of duties between FERC and NIST, similar to the dual OMB/NIST roles under FISMA: FERC would oversee the security while NIST would provide technical guidance on what security should be put in place. FERC has neither the experience in security frameworks nor the in-depth expertise in SCADA security that a cybersecurity initiative of this magnitude requires.

It is worth noting that the Energy Policy Act of 2005 (PDF link) established a process through which the North American Electric Reliability Corporation (NERC) was authorized to enforce cybersecurity in the Energy Sector. NERC had gone so far as to create Critical Infrastructure Protection (CIP) standards to include with its Reliability Standards and had presented them to FERC for approval by late 2007.

A review of the NERC CIP standards (CIP-001 through CIP-009) does not inspire confidence in NERC’s cybersecurity capabilities. I will discuss the shortcomings of this guidance in a subsequent post.




Super Secret Security Control You Were Never Meant To See

Posted September 4th, 2008

This super secret security control is from the unpublished control catalog of an agency we would be foolish to name here.  Oh, darn, you talked me into it, the agency is the Director of National Intelligence – Extralegal Ventures to Rectify Information Technology Hacks, Incursions and Numbskulls Gabbing (DNI-EVRYTHING):

PS-1337 PERSONNEL SANITIZATION AND DISPOSAL

Control:
The organization sanitizes information system personnel prior to disposal or release for burial.

Supplemental Guidance:
Sanitization is the process used to remove information from information system personnel such that there is reasonable assurance, in proportion to the confidentiality of the information, that the information cannot be retrieved, recovered or extraordinarily renditioned. Sanitization techniques, including clearing, purging, and destroying personnel information, prevent the disclosure of organizational information to unauthorized individuals when personnel are disposed. The organization uses its discretion on sanitization techniques and procedures for personnel containing information deemed to be in the public domain or publicly releasable, or deemed to have no adverse impact on the organization or individuals if released for reuse or disposed. The Black Operations For the Homeland (BOFH) provides personnel sanitization guidance and maintains a listing of approved sanitization procedures in their publication “Leave No Incriminating Evidence (or Where Jimmy Hoffa Went) Directive and BBQ Cookbook”.

Control Enhancements:
(1) The organization tracks, documents, and verifies personnel sanitization and disposal actions.
(2) The organization periodically tests sanitization equipment and procedures to verify correct performance.
(3) The organization employs personnel sanitizers (‘cleaners’) who bear an uncanny resemblance to either Harvey Keitel or Jean Reno to perform ad hoc personnel sanitization procedures.
(4) Lbh fubhyq arire gehfg EBG13 rapelcgvba be chg lbhe snvgu va pbafcvenpl gurbevrf. (ROT13 Super-Encrypted; decoded in the snippet below)

LOW: Not Selected  MOD: PS-1337(1)(2)  HIGH: PS-1337(1)(2)(3)  MAJESTIC12: PS-1337(1)(2)(3)(4)
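
If you don't feel like decoding enhancement (4) in your head, remember that ROT13 is its own inverse. A quick Python sketch (the ciphertext is copied straight from the control above):

```python
import codecs

# Enhancement (4), "super-encrypted" with ROT13. Python's standard library
# ships ROT13 as a text transform codec, and ROT13 is its own inverse.
ciphertext = ("Lbh fubhyq arire gehfg EBG13 rapelcgvba "
              "be chg lbhe snvgu va pbafcvenpl gurbevrf.")

print(codecs.decode(ciphertext, "rot_13"))
# -> You should never trust ROT13 encryption or put your faith in conspiracy theories.
```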




No, FISMA Doesn’t Require That, Silly Product Pushers

Posted July 31st, 2008

Post #9678291 on why people don’t understand what FISMA really is: Secure64 DNSSEC Press Releases.

“FISMA Act encourages U.S. government agencies to configure their DNS servers to the DNSSEC security specifications set by the National Institute of Standards and Technology, and it has been reported that the federal governments Office of Management and Budget (OMB) plans to begin enforcing DNSSEC requirements through an auditing process, setting the standard for DNS best practices.”

Yep, if you stamp FISMA on it, people will buy it, maybe in your PR department’s wettest and wildest dreams.  Guys, it’s been 6 years, that kind of marketing doesn’t work nowadays, mostly because we spent ourselves into oblivion buying junkware similar to yours and now we’re all jaded.

Now don’t get me wrong, DNSSEC is a good thing, especially this month.  But there is something I need to address:  FISMA requires good security management with a dozen or so key indicators, not a solution down to the technical level.  Allusions to OMB are just FUD, FUD, and more FUD because unless it’s in a memo to agency heads, it’s all posturing–something everybody in this town knows how to do very well.  OMB would rather stay out of mandating DNSSEC and maybe give a “due date” once NIST has a final standard.

My one word of wisdom for today:  anybody who tries to sell a product and uses FISMA as the “compelling event” has no clue what they’re talking about.




Exhaustive Security Testing is Bad For You

Posted July 17th, 2008

Hot on the heels of Security Assessments as Fraud, Waste, and Abuse comes this heartwarming lolcat.

[lolcat image]




Security Assessments as Fraud, Waste, and Abuse

Posted July 17th, 2008

I’m going to put on my Government Security Heretic Hat for a while here; hear me out.  By my estimate, half of the security assessments the Government receives involve some kind of fraud, waste, and abuse.

What makes me say this is the amount of redundant testing I’ve seen that adds no value.

The way to avoid this redundancy is the concept of common/shared controls.  The whole idea is that you take the security controls that are shared across the board and put them into one bucket.  You test that bucket once, and then whenever a system shares controls with that bucket, you look at the shared-control bucket and make sure its assessment is still relevant and accurate.
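
Here's a minimal sketch of the idea in Python (the system names, control names, and dates are invented for illustration, not pulled from any real catalog): systems that inherit from the shared-control bucket reuse its assessment instead of being re-tested.

```python
from datetime import date

# Hypothetical shared-control "bucket": assessed once, inherited by many systems.
shared_controls = {
    "physical-access": {"assessed": date(2008, 3, 1), "result": "satisfied"},
    "backup-site":     {"assessed": date(2008, 3, 1), "result": "satisfied"},
}

# Each system lists the controls it inherits and the ones it must test itself.
systems = {
    "payroll-app":  {"inherits": ["physical-access", "backup-site"],
                     "system_specific": ["password-policy", "log-review"]},
    "intranet-web": {"inherits": ["physical-access"],
                     "system_specific": ["password-policy"]},
}

def controls_to_test(name):
    """Fresh testing is only needed for system-specific controls and for any
    inherited control that is missing from the shared-control bucket."""
    system = systems[name]
    missing = [c for c in system["inherits"] if c not in shared_controls]
    return system["system_specific"] + missing

for name in systems:
    print(name, "->", controls_to_test(name))
```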

So, what makes a security assessment not fraud, waste, and abuse?  It’s a good assessment if it does the following:

  • Does not repeat a previous assessment.
  • Discovers previously-undiscovered vulnerabilities, weaknesses, or findings.
  • Has findings that get fed into a risk management plan (accepted, avoided, transferred, etc.; think POA&M).
  • Is not exhaustive when it doesn’t need to be.
  • Provides value to the project team, system owner, and Authorizing Official to make key decisions.

Now the problem is that the typical auditor has a hard time stopping: they have an ethical obligation to investigate anything their “professional skepticism” tells them is out of place, just as cops have an ethical obligation to investigate anything they think is a crime.

The Solution?  Don’t use auditors! The public accounting model that we adopted for information security does not scale the way we need it to for Security Test and Evaluation (ST&E), and we need to understand this in order to fix security in the Government.

What we need to be doing is ST&E focused on risk, not on compliance with a checklist of control objectives.  Usually, if you know enough to say “Wow, your patch management process is whacked; you’re at high risk!” then that’s enough to stop testing patch management controls.  This is one of the beefs I have with 800-53A in the hands of less-than-clueful people:  they will test until exhaustion.

There isn’t a whole lot of difference between ST&E and an audit, just the purpose.  Audits are by nature confrontational because you’re trying to prove that fraud, waste, and abuse hasn’t occurred.  ST&E helps the project team find things they haven’t thought of before and eventually get the large problems funded and fixed.

The Little Frauds (Harrigan & Hart’s Songs & Sketches) photo by Boston Public Library.




SP 800-53A Now Finally Final

Posted July 1st, 2008

The perpetual draft document, SP 800-53A, has been officially released after 3 years.  Check out the announcement from NIST here.

Now the interesting thing to me is that NIST is working with some other players (DNI comes to mind) on reference implementations of 800-53A.  This is big, so big that I can’t add enough hyperbole to it.

Why do they need to do reference implementations?  Well, because by itself, SP 800-53A is dangerous if it’s given to people who “don’t get it”.  By that I mean the following:

  • SP 800-53 needs tailoring to distill into actual requirements.
  • SP 800-53A needs a huge amount of tailoring to distill into test cases/procedures that match the tailoring you did with 800-53 (see the sketch after this list).
  • Taken at face value, 800-53 and 800-53A become the source of “death by compliance”.
  • If you think the auditors could grill you to death with 800-53, 800-53A gives them tons more material.
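
To make the tailoring point concrete, here's a rough Python sketch (the control IDs are familiar 800-53 identifiers, but the baseline contents, scoping decisions, and procedure text are invented for illustration): start from the baseline for your impact level, document what is scoped out or inherited, and only then write the assessment procedures.

```python
# Heavily simplified tailoring flow: baseline -> scoping decisions -> test plan.
# The real 800-53/800-53A catalogs are, of course, far larger than this.
catalog = {
    "AC-7":  "Unsuccessful Login Attempts",
    "AC-17": "Remote Access",
    "PE-14": "Temperature and Humidity Controls",
}

moderate_baseline = ["AC-7", "AC-17", "PE-14"]

# Scoping/tailoring decisions documented by the system owner (invented here).
scoped_out = {
    "PE-14": "Inherited from the hosting data center's common controls.",
}

# One trimmed-down procedure per control instead of the full battery of
# examine/interview/test steps.
procedures = {
    "AC-7":  "Examine config: accounts lock after the documented number of failed logins.",
    "AC-17": "Test: remote access only works through the approved VPN with two-factor auth.",
}

def build_test_plan(baseline):
    plan = {}
    for control in baseline:
        if control in scoped_out:
            plan[control] = "Not assessed here: " + scoped_out[control]
        else:
            plan[control] = procedures.get(control, "TODO: write a tailored procedure")
    return plan

for control, step in build_test_plan(moderate_baseline).items():
    print(control, "(" + catalog[control] + "):", step)
```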

Now time for a war story: I worked on a project where the contractor was having a hard time building a security program, mostly because they didn’t have the right staff to get the job done.  The government told the contractor to use 800-53A as a starting point, and 6 months of insanity followed with 13 “security engineers” in a conference room cranking out documentation that had no basis in reality.  At the end of it all, the contractor handed the Government a bill for $1M.

Now don’t get me wrong, I like the ideas behind 800-53A, but the first thing you need to know when you start using it is when you shouldn’t use it:

  • Don’t run test procedures on every computer you have; use an automated tool and do spot-checks to validate that the automated tool works (see the sketch after this list).
  • Use fewer test procedures on low-criticality systems.
  • “This procedure is conducted as part of the hardening validation process.”
  • Common controls are even more important because you do not want to repeat effort.
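
As a sketch of the spot-check idea (the hostnames and scanner results below are made up, and the manual check is stubbed out), you sample a handful of hosts, verify them by hand, and see whether the hands-on answers agree with what the tool reported:

```python
import random

# Results reported by a hypothetical automated scanner for one hardening check.
scanner_results = {
    "web01": "pass", "web02": "pass", "db01": "fail",
    "db02": "pass", "app01": "pass", "app02": "pass",
}

def manual_check(host):
    # Stand-in for actually logging in and checking the setting by hand;
    # here it always reports "pass" so the example runs end to end.
    return "pass"

def spot_check(results, sample_size=3, seed=1):
    """Sample a few hosts and flag any disagreement with the scanner."""
    random.seed(seed)
    sample = random.sample(sorted(results), min(sample_size, len(results)))
    return [host for host in sample if manual_check(host) != results[host]]

disagreements = spot_check(scanner_results)
print("Disagreements:", disagreements or "none -- the tool looks trustworthy")
```

If the spot checks disagree with the tool, fix the tool (or your procedures) before you trust its numbers across the whole environment.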

And whatever you do, don’t let 800-53A turn your risk management into a compliance activity.  It has all the potential to do that.

US Government Docs photo by Manchester Library.



