When Standards Aren’t Good Enough

Posted May 22nd, 2009

One of the best things about being almost older than dirt is that I’ve seen several cycles within the security community.  Just like fashion and ladies’ hemlines, if you pay attention long enough, you’ll see history repeat itself, or something that closely resembles history.  Time for a short trip “down memory lane…”

In the early days of computer security, all eyes were fixed on Linthicum and the security labs associated with the NSA.  In the late 80’s and early 90’s the NSA evaluation program was notoriously slow – glacial would be a word one could use.  Bottom line, the process just wasn’t responsive enough to keep up with the changes and improvements in technology.  Products would be in evaluation for years before coming out of the process with their enabling technology nearly obsolete.  It didn’t matter; it was the only game in town until NIST and the Common Criteria labs came onto the scene.  This has worked well; however, the reality is that it’s not much better at vetting and moving technology from vendors to users.  The problem is that the evaluation process takes time, and time means money, but it also means the code submitted for evaluation will most likely be several revisions old by the time it emerges from evaluation.  Granted, it may only be 6 months, or it might take a year – regardless, this is far better than before.

So…  practically speaking: if the base version of FooOS submitted for evaluation is, say, Version 5.0.1, several revisions — each solving operational problems affecting the organization — may have been released since.  We may find that we need to run Version 5.6.10r3 in order to pass encrypted traffic via the network.  Because we encrypt traffic, we must use FIPS 140-2 (Level 2) validated code – but in the example above, the validated version of FooOS will not work in our network…  What does the CISO do?  We’ll return to this in a moment, it gets better!
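To make the gap concrete, here is a minimal Python sketch. FooOS and every version string in it are hypothetical, taken from the example above; it simply flags when the operationally required build has drifted from the validated baseline:

```python
import re

def parse_version(v: str) -> list:
    """Split a dotted vendor version like '5.6.10r3' into numeric pieces."""
    return [int(x) for x in re.findall(r"\d+", v)]

# Hypothetical versions from the FooOS example above.
validated = "5.0.1"    # the build that went through the FIPS lab
required = "5.6.10r3"  # the build operations needs to pass encrypted traffic

drifted = parse_version(required) != parse_version(validated)
if drifted:
    print(f"Validated {validated} != required {required}: "
          "crypto-module equivalence must be argued, not assumed")
```

The point of the sketch: the comparison fails almost every time, which is exactly the CISO's dilemma.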

In order to reach levels of FIPS-140 goodness, one vendor in particular has instituted “FIPS Mode.”  What this does is require administration of the box from a position directly in front of the equipment, or at the length of your longest console cable…  Clearly, this is not suitable for organizations with equipment deployed worldwide to locations that do not have qualified administrators or network engineers.  Further, having to fly a technician to Burundi to clear sessions on a box every time it becomes catatonic is ridiculous at best; at worst, it’s not in accordance with the network concept of operations.  How does the CISO propose a workable, secure solution?


Standard Hill photo by timparkinson.

Now to my point.  (about time Vlad)   How does the CISO approach this situation?  Allow me to tell you the approach I’ve taken….

1. Accept the fact that once FooOS has achieved its level of FIPS-140 goodness, the modules of code within the OS implementing cryptographic functionality most likely have not been changed in follow-on versions.  This also means you have to assume the vendor has done a good job of documenting the changes to their baseline in their release notes, and that they HAVE modular code…
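That assumption can be spot-checked.  Here is a hedged sketch: if the vendor’s security policy or release notes identify the files inside the cryptographic boundary (the file names below are made up for illustration), you can at least verify they are byte-identical between the validated release and the one you intend to run:

```python
import hashlib
from pathlib import Path

# Hypothetical file names: the vendor's FIPS security policy document is
# what actually defines the cryptographic boundary.
CRYPTO_BOUNDARY_FILES = ["libfoocrypto.so", "foo_kdf.ko"]

def module_digests(release_dir: str) -> dict:
    """SHA-256 each crypto-boundary file in an unpacked release tree."""
    return {
        name: hashlib.sha256((Path(release_dir) / name).read_bytes()).hexdigest()
        for name in CRYPTO_BOUNDARY_FILES
    }

def crypto_unchanged(validated_dir: str, candidate_dir: str) -> bool:
    """True only if every crypto-boundary file matches the validated release."""
    return module_digests(validated_dir) == module_digests(candidate_dir)
```

If the digests differ, the vendor changed something inside the boundary and the “same crypto module” argument is dead on arrival.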

2. Delve into vendor documentation and FIPS-140 to find out exactly what “FIPS Mode” is, its benefits, and its requirements.  Much of the written documentation in the standard deals with physical security of the cryptographic module itself (e.g., tamper-evident seals) – but most helpful is Table 1.

Summary of Security Requirements from FIPS 140-2 (Table 1):

  • Cryptographic Module Specification (all levels): specification of cryptographic module, cryptographic boundary, Approved algorithms, and Approved modes of operation; description of the cryptographic module, including all hardware, software, and firmware components; statement of module security policy.
  • Cryptographic Module Ports and Interfaces: Levels 1-2: required and optional interfaces; specification of all interfaces and of all input and output data paths.  Levels 3-4: data ports for unprotected critical security parameters logically or physically separated from other data ports.
  • Roles, Services, and Authentication: Level 1: logical separation of required and optional roles and services.  Level 2: role-based or identity-based operator authentication.  Levels 3-4: identity-based operator authentication.
  • Finite State Model (all levels): specification of finite state model; required and optional states; state transition diagram and specification of state transitions.
  • Physical Security: Level 1: production-grade equipment.  Level 2: locks or tamper evidence.  Level 3: tamper detection and response for covers and doors.  Level 4: tamper detection and response envelope; EFP or EFT.
  • Operational Environment: Level 1: single operator; executable code; Approved integrity technique.  Level 2: referenced PPs evaluated at EAL2 with specified discretionary access control mechanisms and auditing.  Level 3: referenced PPs plus trusted path, evaluated at EAL3, plus security policy modeling.  Level 4: referenced PPs plus trusted path, evaluated at EAL4.
  • Cryptographic Key Management (all levels): key management mechanisms: random number and key generation, key establishment, key distribution, key entry/output, key storage, and key zeroization.  Levels 1-2: secret and private keys established using manual methods may be entered or output in plaintext form.  Levels 3-4: secret and private keys established using manual methods shall be entered or output encrypted or with split-knowledge procedures.
  • EMI/EMC: Levels 1-2: 47 CFR FCC Part 15, Subpart B, Class A (business use); applicable FCC requirements (for radio).  Levels 3-4: 47 CFR FCC Part 15, Subpart B, Class B (home use).
  • Self-Tests (all levels): power-up tests (cryptographic algorithm tests, software/firmware integrity tests, critical functions tests); conditional tests.
  • Design Assurance: Level 1: configuration management (CM); secure installation and generation; design and policy correspondence; guidance documents.  Level 2: CM system; secure distribution; functional specification.  Level 3: high-level language implementation.  Level 4: formal model; detailed explanations (informal proofs); preconditions and postconditions.
  • Mitigation of Other Attacks (all levels): specification of mitigation of attacks for which no testable requirements are currently available.

Bottom line — some “features” are indeed useful, but this particular vendor’s “one-size-fits-all” implementation limits the use of the feature in some operational scenarios (most notably, the one your humble author is dealing with).  BTW, changing vendors is not an option.

3. Upon analyzing the FIPS requirements against operational needs, and (importantly) the environment the equipment is operating in, one has to draw the line between “operating in vendor FIPS Mode,” and using FIPS 140-2 encryption.
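Step 3’s line-drawing can be caricatured as a toy decision aid.  The criteria and wording below are mine, not the standard’s, and a real decision weighs far more factors:

```python
def fips_posture(remote_admin_required: bool, onsite_admins: bool) -> str:
    """Crude line between running the vendor's all-or-nothing 'FIPS Mode'
    and using FIPS 140-2 validated crypto without it.  Illustrative only."""
    if not remote_admin_required or onsite_admins:
        return "vendor FIPS Mode"
    return "FIPS 140-2 validated crypto; document why FIPS Mode is waived"

# The Burundi scenario above: remote administration required, no local staff.
print(fips_posture(remote_admin_required=True, onsite_admins=False))
```

The interesting branch is the second one: that is where step 4, documenting the decision and the rationale, earns its keep.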

4. Document the decision and the rationale.

Once again, security professionals have to help managers strike a healthy balance between “enough” security and operational requirements.  You would think that using approved equipment, operating systems, and vendors vetted through the CC evaluation process would be enough.  Reading the standard, we see the official acknowledgement that “Your Mileage May Indeed Vary”™:

“While the security requirements specified in this standard are intended to maintain the security provided by a cryptographic module, conformance to this standard is not sufficient to ensure that a particular module is secure. The operator of a cryptographic module is responsible for ensuring that the security provided by a module is sufficient and acceptable to the owner of the information that is being protected and that any residual risk is acknowledged and accepted.”  (FIPS 140-2, Section 15, Qualifications)

The next paragraph constitutes validation of the approach I’ve embraced:

“Similarly, the use of a validated cryptographic module in a computer or telecommunications system does not guarantee the security of the overall system. The responsible authority in each agency SHALL ensure that the security of the system is sufficient and acceptable.”  (Emphasis added.)

One could say, “it depends,” but you wouldn’t think so at first glance – it’s a Standard for Pete’s sake!

Then again, nobody said this job would be easy!

Vlad



Posted in Rants, Risk Management, Technical | 4 Comments »

Where For Art Thou, 60-Day Review

Posted May 7th, 2009

April Fools’ Day pranks aside, I’m wondering what happened to the 60-day Cybersecurity Review.  Supposedly, it was turned in to the President on the 17th.  I guess all I can do is sigh and say, “So much for transparency in Government.”

I’m trying hard to be understanding here, I really am.  But isn’t the administration pulling the same Comprehensive National Cybersecurity Initiative thing again, telling the professionals out in the private sector that it depends on: “You can’t handle the truth!”

And this is the problem.  Let’s face it, our information sharing from Government to private sector really sucks right now.  I understand why this is–when you have threats and intentions that come from classified sources, if you share that information, you risk losing your sources.  (ref: Ultra and  Coventry, although it’s semi-controversial)

Secret Passage photo by electricinca.

Looking back at one of the weaknesses of our information-sharing strategy so far:

  • Most of the critical infrastructure is owned and operated by the private sector.  Government (and the nation at large) depends on these guys and on the resilience of the IT that these owners and operators run.
  • The private sector (or at least critical infrastructure owners and operators) need the information to protect their infrastructure.
  • Our process for clearing someone to receive sensitive information is to do a criminal records investigation, credit report, and talk to a handful of their friends to find out who they really are.  It takes 6-18 months.  This is not quick.
  • We have some information-sharing going on.  HSIN and Infragard are pretty good so far–we give you a background check and some SBU-type information.  Problem is that they don’t have enough uptake out in the security industry.  If you make/sell security products and services for Government and critical infrastructure, you owe it to yourself to be part of these.
  • I’ve heard people from Silicon Valley talk about how the Government doesn’t listen to them and that they have good ideas.  Yes they do have some ideas, but they’re detached from the true needs because they don’t have the information that they need to build the right products and services, so all they can do is align themselves with compliance frameworks and wonder why the Government doesn’t buy their kit.  It’s epic fail on a macromarket scale.

In my opinion, Government can’t figure out if they are a partner or a regulator.  Choose wisely, it’s hard to be both.

As a regulator, we just establish the standard and, in theory anyway, the private sector folks don’t need to know the reasoning behind the standard.   It’s fairly easy to manage but not very flexible–you don’t get much innovation and new technology if people don’t understand the business case.  This is also a more traditional role for Government to take.

As a partner, we can share information and consequences with the private sector.  It’s more flexible in response but takes much more effort and money to bring them information.  It also takes participation from both sides–Government and private sector.

Now to tie it all off by going back to the 60-Day Cybersecurity Review….  The private sector needs information contained in the review.  Not all of it, mind you, just the parts that they need to do their job.  They need it to help the Government.  They need it to build products that fit the Government’s needs.  They need it to secure their own infrastructure.



Posted in Public Policy, Risk Management | 3 Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 5

Posted May 4th, 2009

Rybolov Note: this is part 5 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part three here.  Go read part four here.

Themes: I’ve read this thing back and forth, and one theme emerges overall: we’ve talked for the better part of a decade about what it’s going to take to “solve” this problem that is IT security, from an internal Federal Government standpoint, from a military-industrial complex standpoint, from a state and local government standpoint, from a private-sector standpoint, and from an end-user standpoint.  This bill takes some of the best thought on the issue, wraps it all up, and presents it as an “if you want to get the job done, this is the way to do it.”

Missing: The role of DHS.  Commerce is heavily represented, over-represented to my mind.  Looking at the pieces of who owns what:

Commerce security organizations:

NTIA–Technically not a security organization, but they manage the DNS root and set telecom policy.

NIST–They write the standards for security.

FTC–They regulate trade and have oversight over business fraud.

DHS Security organizations:

NPPD–They are responsible for critical infrastructure and national risk management.

NCSD–They do the security operations side of our national cybersecurity strategy and run US-CERT. (BTW, hi guys!)

Secret Service–They have the primary responsibility of protecting US currency, which also includes computer crimes against financial infrastructure.

Science and Technology Directorate–They are responsible for research and development, including IT security.

DOJ Security Organizations:

FBI–Surprise, they do investigations.

So you see, some of the things that are tasked to Commerce are done by DHS and DOJ.  This is probably just the nature of the bill: it was introduced in the Commerce committee, so it’s understandable that it would be Commerce-centric.

Cost: One thing kept nagging me in the back of my head while going through this bill: the cost to do everything.  We’re asking to do a lot in this bill; now what’s the total cost?  Typically, when a bill makes it out of committee, the Congressional Budget Office attaches a price to the legislation: the total cost, and then the breakdown for the average American household.  That data isn’t published yet on the bill’s page, so we’ll see in the next iteration.

In-Your-Face Politics: Really, this bill is showing us how to do the full security piece.  It includes everything.  It’s challenging people to come up with alternatives.  It’s challenging people to delete the sections that don’t make sense.  It’s challenging people to fix the scope issues.  Like it or hate it, it definitely stirs up debate.

Final Thoughts: S.773 is a pretty decent bill.  It has some warts that need to be fixed, but overall it’s a pretty positive step.

Capitol photo by bigmikesndtech.



Posted in Public Policy | No Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 4

Posted May 1st, 2009

Rybolov Note: this is part 4 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part three here.  Go read part 5 here. =)

SEC. 18. CYBERSECURITY RESPONSIBILITIES AND AUTHORITY. This section needs to be reviewed line-by-line because it’s dense:

“The President–

(1) within 1 year after the date of enactment of this Act, shall develop and implement a comprehensive national cybersecurity strategy, which shall include–

(A) a long-term vision of the Nation’s cybersecurity future; and

(B) a plan that encompasses all aspects of national security, including the participation of the private sector, including critical infrastructure operators and managers;”

OK, fair enough, this calls for a cybersecurity strategy that includes the agencies and critical infrastructure.  Most of that is in-play already and has overlap with some other sections.

(2) may declare a cybersecurity emergency and order the limitation or shutdown of Internet traffic to and from any compromised Federal Government or United States critical infrastructure information system or network;

Declaring an emergency is already a Presidential function for natural disasters, so this makes sense, except that it militarizes cybersecurity and indirectly gives the President the authority to declare a cyberwar, depending on how you interpret this paragraph.

The cutoff authority has gotten much talk.  This part pertains only to Government systems and critical infrastructure.  Note that the criterion here is that the part being cut off has to have been compromised, which makes sense.  The part that I’m worried about is when we preemptively cut off the network in anticipation of pwnage.

(3) shall designate an agency to be responsible for coordinating the response and restoration of any Federal Government or United States critical infrastructure information system or network affected by a cybersecurity emergency declaration under paragraph (2);

This is interesting to me because it leaves the designation up to the President.  Remember, we have all this debate as to who should “own” cybersecurity: DHS, DoD, NSA, FBI, and even Commerce have been proposed here.  I don’t think Congress should leave this designation to the President–it needs to be decided before an incident so that we don’t fight over jurisdiction issues during the incident.  Ref: Cyber-Katrina.

(4) shall, through the appropriate department or agency, review equipment that would be needed after a cybersecurity attack and develop a strategy for the acquisition, storage, and periodic replacement of such equipment;

This is good.  What it means is stockpiling or contracting for equipment in advance of an attack… think DDoS response teams and you have a pretty good idea.  And hey, this also works in disaster recovery; I’ve never understood why we don’t manage some DR at the national level.  GSA, are you paying attention here?

(5) shall direct the periodic mapping of Federal Government and United States critical infrastructure information systems or networks, and shall develop metrics to measure the effectiveness of the mapping process;

Enumeration is good, depending on what we’re using the information for.  If you use it to beat up on the agency CISOs and the critical infrastructure owners/operators, then we have better things to spend our time doing.  If you do this and then use the information to help people, it’s worthwhile (ref: security metrics, architecture support, Federal Enterprise Architecture).  I also have a problem with this because you can map vulnerabilities, but how do you get the information to the right people who can fix them?

(6) may order the disconnection of any Federal Government or United States critical infrastructure information systems or networks in the interest of national security;

OK, this gives the President authority over private networks.  And fo-shizzle, I thought the President already had disconnect authority over Government networks.  If I was an owner of critical infrastructure I would be sh*tting bricks here because this means that the President has disconnect authority for my gear and doesn’t have to give me an answer on why or a remediation plan to get it turned back on–Ref: National Security Letter.  I think we need the disconnect authority, but there has to be some way for people to get turned back on.

(7) shall, through the Office of Science and Technology Policy, direct an annual review of all Federal cyber technology research and development investments;

Good stuff, I would be surprised if this isn’t happening already, what with Congress providing the budget for cyber technology research.

(8) may delegate original classification authority to the appropriate Federal official for the purposes of improving the Nation’s cybersecurity posture;

This paragraph is interesting, mostly because it could go either way.  If we get a Cybersecurity Advisor, this authority will most likely be delegated to them, meaning that they get to determine what’s national security information.  This also works in conjunction with quite a few sections of the bill, including all the information-sharing initiatives and paragraph 6 above.

(9) shall, through the appropriate department or agency, promulgate rules for Federal professional responsibilities regarding cybersecurity, and shall provide to the Congress an annual report on Federal agency compliance with those rules;

I had to read this paragraph a couple of times.  Really what I think we’re doing is establishing a case for agency executives to be found negligent in their duty if they do not ensure security inside their agency–think CEO liability for negligence.

(10) shall withhold additional compensation, direct corrective action for Federal personnel, or terminate a Federal contract in violation of Federal rules, and shall report any such action to the Congress in an unclassified format within 48 hours after taking any such action; and

There are 2 parts of this paragraph: Federal personnel and contractors.  This is a sanctions part of the legislation.  Note that there is not a penalty and/or authority for anybody outside of Government.  The problem with this is that proving negligence is very hard in the security world.  Combined with Paragraph 9, this is a good combination provided that the professional responsibilities are written correctly.  I still think this has room for abuse because of scoping problems–we already have rules for sanctions of people (personnel law) and contracts (cure notices, Federal Acquisition Regulations), only they don’t have much teeth up to this point because it’s hard to prove negligence.

(11) shall notify the Congress within 48 hours after providing a cyber-related certification of legality to a United States person.

I had to search around for a description here.  I found some people who said this paragraph pertained to the certification of professionals as in section 7.  This is wrong.  Basically, what happens is that the Department of Justice issues a “certification of legality” when somebody (usually inside the Government) asks them if a certain act is legal to perform.  Think legal review for building a wiretap program: the President has to go to DoJ and ask them if the program is legal under existing laws.

What this paragraph really does is it institutes Congressional oversight on a “FYI-basis” over Executive Branch decisions on policy to keep them from overstepping their legal bounds.

Verdict: This section is all over the map.  Like most things in S.773, it has some scope issues but overall this section establishes tasks that you can expect the Cybersecurity Advisor or DHS under the Cybersecurity Advisor’s auspices to perform.

Capitol Rotunda photo by OakleyOriginals.

SEC. 19. QUADRENNIAL CYBER REVIEW. This section mandates a review of the cyberstrategy every 4 years.

Verdict: We’ve been doing this so far on an ad-hoc basis, might as well make it official.

SEC. 20. JOINT INTELLIGENCE THREAT ASSESSMENT. This section mandates an annual report on the bad guys and what they’re doing.  This is similar to the Congressional testimony we’ve seen so far on the subject.  If we’re going to expect Congress to make good public policy decisions, they need the information.

Verdict: OK, I don’t see much wrong with this as long as it’s done right and not abused by politics.

SEC. 21. INTERNATIONAL NORMS AND CYBERSECURITY DETERRANCE MEASURES. This section authorizes/mandates the President to cooperate with other countries about “cybersecurity stuff”.

Verdict: Not specific enough to mean anything.  If we keep this section, we need to enumerate specifically what we want the Executive Branch to do.

SEC. 22. FEDERAL SECURE PRODUCTS AND SERVICES ACQUISITIONS BOARD. This section creates a board to review large IT purchases.  Yes, that slows down the purchasing process horribly, as if it isn’t bad enough by itself.  Um, I thought we were supposed to do this with the Federal Enterprise Architecture.

Verdict: This is a macro-scale solution for a micro-scale problem.  Sorry, it doesn’t work for me.  Make FEA responsible for the macro-scale and push good, solid guidance down to the agencies for the micro-scale.  Replace this section with the NIST checklists program and a true security architecture model.



Posted in Public Policy | No Comments »

Blow-By-Blow on S.773–The Cybersecurity Act of 2009–Part 3

Posted April 30th, 2009

Rybolov Note: this is part 3 in a series about S.773.  Go read the bill here.  Go read part one here.  Go read part two here.  Go read part four here.  Go read part 5 here. =)

SEC. 13. CYBERSECURITY COMPETITION AND CHALLENGE. This section of the bill creates a series of competitions for a range of ages and skills… with cash prizes!  Mostly it’s just the administration of competitions–cash prizes, no illegal activities, etc.

This goes back to the age-old discussions of glorification of illegal activities, giving tools to people who are too young to know how to stay out of jail.

But then again, I know why this section of the bill is in there.  If we want to grow enough security professionals to even remotely keep up with demand, we need to do a much better job at recruiting younger techies to the “security dark side”.  Competitions are a start; the next step is to get them into formal education and apprenticeships to learn from the gray-hairs that have been in the industry for a while.

Once again, the same verbiage about tasking Commerce with leading this effort… I’m not sure they’re the ones to do this.

Verdict: Already happening although in ad-hoc fashion.  I’m not sold on teaching high school kids to hack, but yeah, we need to do this.

SEC. 14. PUBLIC-PRIVATE CLEARINGHOUSE. Although the title of this sounds really cool, like super-FOIA stuff, it’s really just information-sharing with critical infrastructure owners and operators.

One interesting provision is this:

“The Secretary of Commerce–

(1) shall have access to all relevant data concerning such networks without regard to any provision of law, regulation, rule, or policy restricting such access”

In other words, all your critical infrastructure information belong to Feds.  This is interesting because it can run the range from the Feds asking power grid operators for information and getting what they get, or it can be stretched into justification for auditing of privately-owned critical infrastructure.  I’m pretty sure that they mean the former, but I can see the latter being used at a later stage in the game.

One thing I thought was interesting is that this section only refers to information sharing with critical infrastructure.  There is a big gap here in sharing information with state and local government, local (ie, non-Federal) law enforcement, and private industry.  I think other sections–most notably  section 5–deal with this somewhat, but it’s always been a problem with information dissemination because how do you get classified data down to the people who need it to do their jobs but don’t have any level of clearance or trustability other than they won an election to be sheriff in Lemhi County, Idaho? (population 5000)  Also reference the Homeland Security Information Network to see how we’re doing this today.

Verdict: Really, I think this section is a way for the Feds to gather information from the critical infrastructure owners and I don’t see much information flow the other way, since the means for the flow to critical infrastructure owners already exists in HSIN.

Capitol photo by rpongsaj.

SEC. 15. CYBERSECURITY RISK MANAGEMENT REPORT. This small section is to do some investigation on something that has been bouncing around the security community for some time now: tying security risks into financial statements, cyberinsurance, company liability, etc.

Verdict: Seems pretty benign; I hope it’s not just another case where we report on something and nothing actually happens.  This has potential to be the big fix for security because it deals with the business factors instead of the symptoms.

SEC. 16. LEGAL FRAMEWORK REVIEW AND REPORT. This section requires a review of the laws, national-level policies, and basically what is our national-level governance for IT security.  As weird as this sounds, this is something that needs to be done because once we have a national strategy that aligns with our laws and policies and then is translated into funding and tasks to specific agencies, then we might have a chance at fixing things.  The one caveat is that if we don’t act on the report, it will become yet another National Strategy to Secure Cyberspace, where we had lots of ideas but they were never fulfilled.

Verdict: Some of this should have been done in the 60-day Cybersecurity Review.  This is more of the same, and is a perfect task for the Cybersecurity Advisor when the position is eventually staffed.

SEC. 17. AUTHENTICATION AND CIVIL LIBERTIES REPORT. This section is really short, but read it verbatim here; you need to, because this one sentence will change the game considerably.

“Within 1 year after the date of enactment of this Act, the President, or the President’s designee, shall review, and report to Congress, on the feasibility of an identity management and authentication program, with the appropriate civil liberties and privacy protections, for government and critical infrastructure information systems and networks.”

So my take on it is something like REAL-ID and/or HSPD-12 but for critical infrastructure.

My personal belief is that if you have centralized identity management, it runs contrary to civil liberties and privacy protections: the power of identification lies with the group that issues the identification.  Hence the “rejection” of REAL-ID.

If I operated critical infrastructure, I would definitely protest this section because it gives the Government the decision-making authority on who can access my gear.  Identity and access management is so pivotal to how we do security that there is no way I would give it up.

On the bright side, this section just calls for a feasibility report.

Verdict: Oh man, identification and authentication nation-wide for critical infrastructure?  We can’t even do it in a semi-hierarchical top-down world of Government agencies, much less the privately-owned critical infrastructure.



Posted in Public Policy | 1 Comment »

Lolcats take on Laws, Sausage, Cyberwhatzits, and PCI

Posted April 23rd, 2009

I just love how the security twit community this month is turning into a bunch of public policy wonks before my very eyes.  First we have S.773, the Cybersecurity Act of 2009 and then the House hearings on the effectiveness of PCI.

“To retain respect for sausages and laws, one must not watch them in the making.” –Otto von Bismarck (ref: Wikipedia)





Posted in IKANHAZFIZMA, Public Policy | 2 Comments »
