Evolution of Penetration Testing: Part 2

Posted October 13th, 2008

In Part 1 of this series I outlined how penetration testing evolved from a grey art practiced by hackers into a more formal process. This evolution has created a bifurcation among “boutique” penetration test service providers.

On the one hand, tools-oriented budget firms offer little value beyond running simple vulnerability scans. On the other, more professional and experienced firms offer the same tests and scans but also provide analysis that serves as direct, actionable input into an organization’s existing security governance structure.

The fly in the ointment is that not all security consumers or security organizations are created equal. Some IT security organizations can be characterized as compliance-based. That is to say, they establish and follow a set of rules that they believe will put them on the road to IT security.

Most IT security organizations, on the other hand, are risk-based and technically oriented. They also follow a formal structure, but they address risk with the appropriate application of processes, procedures, and technology. In graphical terms, the situation lines up as depicted in Table 1. Quadrant 1, representing a weak security organization supported by “tool-boys,” is noted in red because of the risks associated with this coupling. Quadrants 2 and 3 are noted in yellow because of the risks associated with either a weak security organization or weak testing input.

Table 1

                                 “Tool-Boys”    Technical Pen Test Firms
Compliance-Based Security             1                    2
Technical/Risk-Based Security         3                    4

However, in the real world the situation looks more like Table 2. With the increasing acceptance of compliance-based security models, a set of independently administered vulnerability scans suffices to “check the box” on the requirement for a penetration test. This is good news for these budget “boutique” firms.

Table 2

                                 “Tool-Boys”    Technical Pen Test Firms
Compliance-Based Security             1                    2
Technical/Risk-Based Security         3                    4

However, as might be expected, it is bad news for IT security in general, since all networks live in the same security ecosystem. Market drivers that encourage poor security practices hurt us all.

Hacker Store photo by LatinSuD.



Posted in Rants, Technical | 4 Comments »

Evolution of Penetration Testing: Part 1

Posted October 13th, 2008

Penetration testing is a controversial topic with an interesting history. It is made all the more controversial and perplexing by a common disconnect between the service provider and the consumer.

Penetration testing started as a grey art that was often practiced and delivered in an unstructured, undisciplined manner by reformed or semi-reformed hackers. Penetration testers used their own techniques and either their own home-grown tools or tools borrowed from or traded with close associates. There was little reproducibility or consistency of results or reporting. As a result, the services were hard to integrate into a security program.

As the art evolved, it became more structured and disciplined, and tools, techniques, and reporting became more standardized. This evolution was driven by papers, articles, and technical notes that were both formally published and informally distributed. In the end, a standardized methodology emerged that was largely based on the disciplined approach used by the most successful hackers.

Hakker Kitteh photo by blmurch.

At about the same time, open-source, government, and commercial tools began to emerge that automated many of the steps of the standardized methodology. These tools had two divergent impacts on the art of penetration testing. As they were refined and constantly improved, they reinforced the standard methodology, provided more consistent and reproducible results, and improved and standardized penetration test reporting. All of this made penetration testing easier for the consumer to absorb and integrate into security programs. As a result, regulations and security protocols emerged that required penetration tests and security assessments. Nmap and Nessus are excellent examples of the kind of tools that helped shape and push this evolution, and because of their utility they remain indispensable today.
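To make the reproducibility point concrete, here is a minimal sketch, in Python, of the sort of thin wrapper these tools invite. It assumes the nmap binary is installed and on the PATH, and it uses scanme.nmap.org, Nmap’s sanctioned practice target; the point is that XML output turns an ad-hoc scan into machine-readable, repeatable results:

```python
import subprocess
import xml.etree.ElementTree as ET

def scan_host(target):
    """Run an Nmap service-detection scan and return (port, service) pairs.

    -sV enables service/version detection; -oX - writes the XML report
    to stdout, which is what makes the results machine-readable.
    """
    result = subprocess.run(
        ["nmap", "-sV", "-oX", "-", target],
        capture_output=True, text=True, check=True,
    )
    root = ET.fromstring(result.stdout)
    findings = []
    for port in root.iter("port"):
        state = port.find("state")
        service = port.find("service")
        if state is not None and state.get("state") == "open":
            name = service.get("name") if service is not None else "unknown"
            findings.append((port.get("portid"), name))
    return findings

if __name__ == "__main__":
    # Only scan hosts you are authorized to test.
    for portid, name in scan_host("scanme.nmap.org"):
        print(f"{portid}/tcp open {name}")
```

Two testers running this against the same host get the same structured output, which is exactly the consistency that made these tools digestible by security programs.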

However, Nessus also helped to automate both data collection and analysis, and in doing so it lowered the bar for the skills and experience needed to conduct portions of the penetration testing methodology. This lowered the cost of penetration tests and made them much more broadly available, giving rise to so-called “boutique firms.” The problem with penetration testing “boutique firms” is that they fall into two broad categories. The first consists of specialized, highly professional firms led by experienced technical security professionals who can translate automated tool output into root-cause analysis of vulnerabilities and security program flaws. The second consists of opportunist firms with just enough knowledge to run automated tools and cut and paste the tool output into client reports. The latter firms are sometimes called “tool-firms” and their employees “tool-boys.”

The latter flourish for two reasons. The first is that they can offer their services at rock-bottom prices. The second is that security organizations are often so ill-informed about the intricacies of the penetration testing process that they can’t make a meaningful distinction between the professional firms and the tool-boys except on the basis of cost.



Posted in Rants, Technical | 2 Comments »

Et Tu, TIC?

Posted October 7th, 2008

Let’s talk about TIC today, dear readers, for I smell a conspiracy theory brewing.

For those of you who missed the quick brief, TIC is short for “Trusted Internet Connections” and is an architecture model/mandate/$foo to take all of the Internet connections in the Government (srsly, nobody knows how many of them really exist, but it’s somewhere in the 2,000-10,000 range) and consolidate them into 50.  These connections will then be monitored by DHS’s Einstein program.

No, Not That Kind of TIC photo by m.prinke.

Bringing you all up to date, you’ll need to do some homework:

Now having read all of this, some things become fairly obvious:

  • If you have the following people needing connections:
    • 24 agencies, plus
    • DoD with 2 points of presence, plus
    • Intelligence agencies with a handful of Internet connections,
  • then basically everybody gets one Internet connection. This is not good; it’s all single point-of-DOS.
  • Agencies have been designated as Internet providers for other agencies.  Sounds like LoB in action.
  • Given the amount of traffic going through the TIC access points, it most likely is going to take a significant amount of hardware to monitor all these connections–maybe you saved 50% of the monitoring hardware by reducing the footprint, but it’s still hardware-intensive.
  • TIC is closely tied with the Networx contract.
  • In order to share Internet connections, there needs to be a network core between all of the agencies so that an agency without a TIC access point can route through multiple TIC service provider agencies.

And this is where my conspiracy theory comes in: TIC is more about making a grand unified Government network than it is about monitoring events; Einstein is just an intermediate goal. If you think about it, this is where the Government is headed.

We were headed this way back in ought-two with a wonderful name: GovNet.  To be honest, the groundwork wasn’t there and the idea was way ahead of its time and died a horrible death, but it’s gradually starting to happen, thanks to TIC, FDCC, and Einstein. 

More fun links:

If you want to get a reaction out of the OMB folks, mention GovNet and watch them backpedal and cringe; I think the pain factor was very high for them on GovNet. So I think that we should, as a cadre of information security folks, start calling TIC what it really is: GovNet 2.0! =)



Posted in Technical | 2 Comments »

NIST and SCAP; SCAP @ Large Part 2

Posted October 2nd, 2008

There is another challenge that SCAP addresses without it being officially on the SCAP program’s agenda. With the advent of SCAP we now have common reporting criteria against which we can judge SCAP-certified products. If you have ever used an automated vulnerability scanner as part of a penetration test or security audit, you know that not all vulnerability scanners are created equal. Some have much lower false-positive alert and reporting rates than others. Likewise, false-negative alert and reporting rates are known to vary. And, because of the various technical approaches taken by the scanners, some provide much more consistent results. The challenge has been that without common criteria to test against, it is difficult for a small or even a fairly large security organization to find the resources to test these products in a fair, apples-to-apples way.

This is where NIST has a real opportunity on its hands.  With the release of the SCAP protocol, we have the criteria by which performance comparisons can be made.  What we are lacking is a common test environment.

Benchmark photo by bzo.

Let me veer off-topic for a moment to provide some background. In the last few years the Linux community has created various “live distributions” for specialized requirements. Live distributions are CD-, DVD-, or flash-media-based operating systems that boot and run directly from the media. So, by using a Linux live distribution, you can run Linux on your home Windows-based laptop without ever installing Linux to your hard disk. This has opened up a world of specialized possibilities for this community. One of them is the standardized training environment. For example, security testers have created DVL (Damn Vulnerable Linux, http://www.damnvulnerablelinux.org/), a live distribution with well-documented security vulnerabilities that is used as a training aid for teaching vulnerability assessment and mitigation. There are other efforts created with the same intent, such as the excellent De-ICE training targets (http://de-ice.net/hackerpedia/index.php/De-ICE.net_PenTest_Disks).

NIST could follow up on the release of the SCAP protocol by also building and releasing a common testing environment, based perhaps on live distribution technology. Such an environment, with well-documented vulnerabilities, would allow objective benchmarks to be created to rate the accuracy, reproducibility, and completeness of the results of SCAP-certified vulnerability testing and reporting products. This would aid government agencies, businesses, and even individuals in their purchasing decisions. It would also provide vendors with an objective and common test environment in which they can test and improve their products. I admit this would be a significant undertaking for NIST. However, I would suggest that such a test environment could be designed so that it could be built and released as a series of interoperable modules based on live distribution technology. The initial release might offer only a relatively modest set of tests, but with each module building on the results of previous releases, a highly demanding and sophisticated test environment could soon be realized. Because of the importance and utility of such a project, industry and outside security experts might want to participate in and contribute to such an endeavor.
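As a sketch of the scoring such an environment would enable, imagine each module shipping with a manifest of its documented vulnerabilities; a scanner’s report could then be graded mechanically. The Python below is illustrative only: the CVE identifiers are real, but the benchmark manifest and scan results are invented for the example.

```python
def scanner_accuracy(reported, documented):
    """Grade a scanner's report against a benchmark's documented flaws.

    Both arguments are sets of vulnerability identifiers (e.g., CVE IDs).
    Returns false-positive and false-negative rates.
    """
    reported, documented = set(reported), set(documented)
    false_positives = reported - documented
    false_negatives = documented - reported
    return {
        "false_positive_rate": len(false_positives) / max(len(reported), 1),
        "false_negative_rate": len(false_negatives) / max(len(documented), 1),
    }

# A hypothetical benchmark module with three documented flaws:
documented = {"CVE-2008-4250", "CVE-2008-1447", "CVE-2007-0882"}
# A scanner finds one of them, misses two, and reports one false positive:
reported = {"CVE-2008-4250", "CVE-2006-3439"}
print(scanner_accuracy(reported, documented))
```

Running every SCAP-certified scanner against the same manifest would give exactly the apples-to-apples comparison that is so hard to resource today.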

 



Posted in NIST, Technical, What Works | No Comments »

NIST and SCAP; Busting a cap on intruders Part 1

Posted October 1st, 2008

I was attending a conference at NIST (the National Institute of Standards and Technology) concerning the SCAP program (Security Content Automation Protocol, pronounced ESS-cap). SCAP is focused on providing the Federal government with automated, common, interoperable security solutions. Specifically, the SCAP program has developed a common set of standards for reporting security vulnerabilities for use in automated security scanners, security appliances, and reporting systems.

Well, why do we need SCAP? If we take the godfather of all vulnerability management tools, the Nessus vulnerability scanner, as an example, we have seen that industry has produced a number of similar products. Each has its own strengths and rich feature set. However, none of them use the same “language” for detecting, describing, or reporting a potential vulnerability. This means not only that these various products interoperate with some measure of difficulty, but also that aggregating and managing the results and reports from these systems can be tedious.
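To illustrate what a common language buys you, here is a small, hypothetical Python sketch: two scanners describe the same flaw in their own words, and a shared identifier (here, CVE IDs) is the only key that lets us merge their reports. The findings below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical output from two scanners that report overlapping issues
# in different words; the CVE ID is the only shared vocabulary.
scanner_a = [
    {"cve": "CVE-2008-4250", "desc": "Server service RPC request handling flaw"},
    {"cve": "CVE-2008-1447", "desc": "DNS protocol weakness allows cache poisoning"},
]
scanner_b = [
    {"cve": "CVE-2008-4250", "desc": "MS08-067: possible remote code execution"},
]

def aggregate(*reports):
    """Merge per-scanner findings into a single view keyed by CVE ID."""
    merged = defaultdict(list)
    for report in reports:
        for finding in report:
            merged[finding["cve"]].append(finding["desc"])
    return dict(merged)

for cve, descriptions in aggregate(scanner_a, scanner_b).items():
    print(cve, "->", descriptions)
```

Without the shared key, reconciling those two descriptions of the same flaw is exactly the tedious aggregation work described above; with it, the merge is a dictionary lookup.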

“Tim Bray at XML 2005” photo by Roland.

As a result of these efforts and the vision of the dedicated employees at NIST, industry is already scrambling to get their related products SCAP certified. And Federal agencies are also specifying in contracts that products must be SCAP certified in order to qualify for purchase. This is real progress and great news for the taxpayer, who will get better value for their tax dollar. But it is not yet a revolution. Where I see the revolution emerging is in the six-month to one-year time frame, when industry takes note of the SCAP program and we begin to see SCAP-certified and SCAP-interoperable products being ordered. Not long after that, we may see the SCAP protocol used in even consumer-level products like personal firewalls. This ability to give us all a common language will significantly reduce the cost of building and supporting vulnerability scanners and vulnerability reporting tools. That cost reduction will free up resources to address prevention and mitigation concerns in a more meaningful manner.

For example, industry has tools that enable network and security support professionals to detect a misconfiguration in a desktop machine on their network and correct it. But only the largest and best-funded IT security departments have such tools. With the advent of SCAP, these kinds of services will be much more affordable and supportable, and thus more common. In fact, because much of this can be automated, I can even envision McAfee, Symantec, and others who are well placed in the vulnerability scanning market offering support services over the wire to smaller businesses and to consumers. Moreover, as this technology improves and becomes commoditized, I can see ISPs offering security scanning and remediation as a service to their customers.
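As a rough sketch of what that kind of over-the-wire service might do, consider a baseline-drift check in the SCAP spirit: compare a machine’s observed settings against a desired configuration and report the differences. The setting names and values below are invented for illustration, not drawn from any actual SCAP checklist:

```python
# Hypothetical desired configuration for a desktop machine.
baseline = {
    "password_min_length": 8,
    "firewall_enabled": True,
    "autorun_disabled": True,
}

# Hypothetical settings observed on one machine.
observed = {
    "password_min_length": 6,
    "firewall_enabled": True,
    "autorun_disabled": False,
}

def check_drift(baseline, observed):
    """Return settings whose observed value differs from the baseline."""
    return {
        key: {"expected": expected, "actual": observed.get(key)}
        for key, expected in baseline.items()
        if observed.get(key) != expected
    }

for setting, detail in check_drift(baseline, observed).items():
    print(f"{setting}: expected {detail['expected']}, got {detail['actual']}")
```

The hard part today is not this comparison; it is agreeing on what the settings are called and how results are reported, which is precisely the gap SCAP closes.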



Posted in NIST, Technical, The Guerilla CISO, What Works | No Comments »

Workin’ for the ‘Counters: an Analysis of my Love-Hate Relationship with the CPAs

Posted September 30th, 2008

No big surprise by now, I work for an accounting firm.  Oh, what’s that?  Oh yes, that’s right, it’s a consulting firm with a high percentage of accountants, including a plethora of CPAs.  “Accounting firm” is so 1950s-ish. =)

It’s my secret theory (well, not so much of a secret now, just between the Internet and me) that the primary problem we have in information security is that, as a field, we have borrowed heavily from public accounting. The trouble is that public accounting is different from what we do.

Goals for public accounting run something like this:

  • Eliminate fraud through oversight
  • Protect the company’s money from rogue agents
  • Protect the shareholders of public companies
  • Ensure accountability of actions


Accounting for Non-Accountants photo by happyeclair.

As a result of their goals, accountants have an interesting set of values:

  • Signatures are sacred
  • Separation of duties is sacrosanct
  • Auditing is designed to act as a deterrent to fraud
  • “Professional Skepticism” is a much-valued trait
  • Zero-Defects is a good condition

In other words, accountants live in a panopticon of transparency, the concept being that through oversight and transparency people will not become evildoers, and those that do will be caught. Pretty simple idea; it makes me think about IDS in an entirely new light.

Words that accountants use that mean something entirely different from the way you or I use them:

  • Fraud, Waste, and Abuse: They’re talking about spending money, I’m usually talking about people doing something ethically wrong.
  • Investigation: They’re looking at the numbers to see how a particular number was created.  Me, I bring the nice people with guns when I do an investigation.
  • Incident: Their version is what I would call an event.  When I call something an incident, we’re headed towards an investigation.
  • Security test and evaluation: To them, it’s a compliance audit.  To me, it’s determining the frequency that the system will fail and if we have a way to fix it once it does.  Remember this, it’s a critical difference.
  • Control: I think their version has something to do with having oversight and separation of duties.  Me, when I see this word, I think “countermeasure to a specific threat and vulnerability”.
  • Audit: An activity designed to prove that fraud has not happened.  Usually we don’t use the word unless we absolutely have to.
  • Technical: They’re talking about the highly-detailed accounting rules.  I’m talking about if you know how to build your own server and OS using lumps of raw silicon and a soldering iron.
  • Checklist: They’re talking about a sacred list that condenses all the rules into an easily-auditable format.  Me, I’m thinking that a checklist is something that will fail because my threats and their capabilities don’t fit into nice little lists.
  • Forensics: Their version is what I would call “research to find out where the money went to” and involves looking at a bunch of numbers.  My version has something to do with logs, memory dumps, and hard drive images.
  • Risk Management: This has something to do with higher interest rates for high-risk loans.  For me, it’s looking for countermeasures and knowing what things to skimp on even though the catalog of controls says you have to have it.

In short, pretty much anything they could say about our line of work has a different meaning. This is why I believe it’s a problem if we adopt too much of their methodology and management models: they do activities similar to what security people do, only for different purposes.

In order to understand the mentality that we’re working with, let’s give you a couple of scenarios:

After-Work Optional Training Session: The accountants not only make you put your name on the attendance roster but you have to sign it as well.  Are they worried that you’re committing fraud by showing up at training that you were not supposed to, so they need some sort of signature nonrepudiation to prove that you were there?  No!  They just make you sign it because they believe in the power of the signature and that’s just how they do things, no matter how trivial.

The Role of Security: To an accountant, the role of security in an organization is to reduce fraud through “hack-proof” configurations and monitoring. This is a problem in that, since security is economics, we end up somehow subordinate to the finance people.

Let’s look at the world of the typical security practitioner:

  • The guidance that security professionals have is often contradictory, missing, or irrelevant.
  • Really what we do comes down to risk management, which means that sometimes it makes more sense to break the rules (even though there is a rule that says break the rules, which should freak your brain out by now if you’re an accountant).
  • We have a constantly changing environment that rules cannot keep up with.

Now this whole blog post, although rambling on about accountants, is aimed at getting a message across. In the US Federal Government, we use a process called certification and accreditation (C&A). The certification part is pretty easy to understand: it’s like compliance, do you have it and does it work? CPAs will readily understand that as a controls assessment. That’s very much a transferable concept.

But in accreditation, you give the risks to a senior manager/executive and they accept the risks associated with operating the system. The CPAs’ zero-defects world comes through and they lie on the ground doing the cockroach. Their skills aren’t transferable when it comes to risk management, only compliance with a set of rules.

Once again, the problem with security in Government is that it’s cultural.

And don’t get me wrong, I like accountants; they do what I have neither the skills nor the desire to do. I just think that there aren’t as many transferable skills between our jobs as there might seem on the surface.



Posted in Odds-n-Sods, Rants | 4 Comments »
