Digital Forensics: Who should make the keys?

Posted October 22nd, 2008 by

Paraben is a leading vendor of digital forensics products (http://www.paraben.com/). Within this huge international market, Paraben specializes in digital forensic products for mobile devices such as PDAs and phones. Paraben recently released a very nice product called the Cell Seizure Investigator (CSI) Stick (http://www.csistick.com/index.html).

Aside from the overly-dramatic marketing embedded in the name of the product, this seems to be another solid addition to the Paraben product line. The device is designed to make a forensically correct copy of the data on a mobile phone, including call records, address books, and text messages. The device looks basically like a USB flash memory drive with the addition of an adapter/interface unit.
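A “forensically correct” copy generally implies that the acquisition is read-only and that the resulting image is verified with cryptographic hashes so it can be shown to be unchanged later. I don’t know the internals of the CSI Stick, so the following is only a generic sketch of that verification step in Python; the file name and workflow are my own assumptions:

```python
import hashlib

def hash_file(path, algorithm="sha256", chunk_size=1024 * 1024):
    """Compute a cryptographic hash of an acquired image without loading it all into memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_image(path, recorded_hash):
    """Re-hash the image before analysis and confirm it matches the hash recorded at acquisition."""
    return hash_file(path) == recorded_hash

# "phone_image.bin" and the recorded hash are placeholders for whatever the
# acquisition step actually produced and logged in the case notes.
if __name__ == "__main__":
    acquisition_hash = hash_file("phone_image.bin")
    print("SHA-256 at acquisition:", acquisition_hash)
    print("Unchanged before analysis:", verify_image("phone_image.bin", acquisition_hash))
```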

The copying process is largely automatic, and the CSI Stick is quite reasonably priced at $99-$199, depending on the software bundle. The market reaction to this product has also been quite positive. My friends in the industry who have used the device consider it an indispensable time-saver. I can hardly wait until I get my hands on one myself. In the past, when I was tasked to recover such data, it was a much more time-consuming and hardware-intensive process.

Equally fascinating, is the release (if you can call it that) of a product with a similar form-factor from Microsoft. The product is released on a flash drive and is called COFEE (Computer Online Forensic Evidence Extractor — http://www.microsoft.com/presspass/features/2008/apr08/04-28crantonqa.mspx).  Microsoft indicates that COFEE contains 150 commands that facilitate the collection of digital evidence from computers that it is physically connected to. In addition, COFEE can decrypt passwords, and collect information on a computer’s Internet activity, as well as data stored in the computer. Microsoft has indicated that COFEE has been made available to law-enforcement agencies only. And, according to one report, law-enforcement agencies in 15 nations have been provided with the device.
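I have no inside knowledge of how COFEE packages its 150 commands, but the general live-response pattern it describes is well established: run a fixed list of system commands on the connected machine, capture the output verbatim, and log timestamps and hashes so the collection can be defended later. A rough, hypothetical sketch of that pattern in Python (the command list and log format are my own assumptions, not COFEE’s):

```python
import hashlib
import subprocess
from datetime import datetime, timezone

# A hypothetical, minimal command list; a real live-response kit would be far larger.
COMMANDS = [
    ["ipconfig", "/all"],
    ["netstat", "-ano"],
    ["tasklist", "/v"],
]

def collect(commands, log_path="collection_log.txt"):
    """Run each command, capture its output, and log a timestamp plus SHA-256 of that output."""
    with open(log_path, "w", encoding="utf-8") as log:
        for cmd in commands:
            started = datetime.now(timezone.utc).isoformat()
            result = subprocess.run(cmd, capture_output=True, text=True)
            digest = hashlib.sha256(result.stdout.encode("utf-8")).hexdigest()
            log.write(f"{started} {' '.join(cmd)} sha256={digest}\n")
            log.write(result.stdout + "\n")

if __name__ == "__main__":
    collect(COMMANDS)
```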

My initial reaction to this news was that it was not an unexpected development, and that the announcement would be greeted with the inevitable jokes about the need for Microsoft to also release a companion product called DONUTS. In fact, the reaction of the technical press has been largely negative and suspicious. Most of the concerns seem to center on privacy and individual rights. However, every capability associated with COFEE that I have been able to confirm already exists in some other commercial or open-source product. I do wish that I could get my hands on a trial or loaner copy of COFEE so that I could confirm this position.

Locksmith Sign photo by Meanest Indian.

While I admit that I have always been concerned about safeguarding individuals’ civil liberties, I am largely puzzled by the negative reactions. One element of the outcry that I do understand is an emotional one: the idea that a company that is paid to protect your secrets should not also be selling the tools and techniques to compromise those secrets. On an emotional level this makes sense.

However, the real world is very different. For example, every major automobile manufacturer cooperates with locksmiths to ensure that there are low-cost and non-destructive means to circumvent your car locks in the event that you lock your keys in your car or lose your car keys outright. Without getting into the details of defeating car locks, many automobile manufacturers even provide specialized equipment and technical materials directly to locksmiths to facilitate this process.

If there are concerns that Microsoft may be caught in an ethical conflict of interest, we need to look at similar conflicts in other industries. That’s food for thought.




Evolution of Penetration Testing: Part 2

Posted October 13th, 2008 by

In part 1 of this blog I outlined how penetration testing evolved from a grey art practiced by hackers into a more formal process.  This evolution has created a bifurcation among “boutique” penetration test service providers.

On the one hand, tools-oriented budget firms offer little value beyond simply running simple vulnerability scans.  On the other, more professional and experienced firms offer the same tests and scans, but also provide analysis that can serve as direct, actionable input into an organization’s existing security governance structure.

The fly in the ointment is that not all security consumers or security organizations are created equal.  Some IT security organizations can be characterized as compliance-based.  That is to say, they establish and follow a set of rules that they believe will put them on the road to IT security.

On the other hand, most IT security organizations are risk-based and technically oriented.  They also follow a formal structure, but address risk with the appropriate application of processes, procedures, and technology.  In graphical terms, the situation lines up as depicted in Table 1.  Quadrant 1, representing a weak security organization supported by “tool-boys,” is noted in red because of the risks associated with this coupling.  Quadrants 2 and 3 are noted in yellow because of the risks associated with either a weak security organization or weak testing input.

Table 1

                                | “Tool-Boys” | Technical Pen Test Firms
Compliance Based Security       |      1      |            2
Technical/Risk-based Security   |      3      |            4

However, in the real world the table looks more like Table 2. With the increasing acceptance of compliance-based security models, a set of independently administered vulnerability scans suffices to “check the box” on the requirement for a penetration test.  This is good news for these budget “boutique” firms.

Table 2

                                | “Tool-Boys” | Technical Pen Test Firms
Compliance Based Security       |      1      |            2
Technical/Risk-based Security   |      3      |            4

However, as might be expected, it is bad news for IT security in general since all networks live in the same security ecosystem.   Market drivers that encourage poor security practices hurt us all.

 

 

 

 

Hacker Store photo by LatinSuD.




Evolution of Penetration Testing: Part 1

Posted October 13th, 2008 by

Penetration testing is a controversial topic with an interesting history. It is made all the more controversial and perplexing by a common disconnect between the service provider and the consumer.

Penetration testing started as a grey art that was often practiced and delivered in an unstructured and undisciplined manner by reformed or semi-reformed hackers. Penetration testers used their own techniques and either their own home-grown tools or tools borrowed from or traded with close associates. There was little reproducibility or consistency of results or reporting. As a result, the services were hard to integrate into a security program.

As the art evolved, it became more structured and disciplined, and tools, techniques, and reporting became more standardized. This evolution was driven by papers, articles, and technical notes that were both formally published and informally distributed. In the end, a standardized methodology emerged that was largely based on the disciplined approach used by the most successful hackers.

Hakker Kitteh photo by blmurch.

At about the same time, open-source, government, and commercial tools began to emerge that automated many of the steps of the standardized methodology. These tools had two divergent impacts on the art of penetration testing. As these tools were refined and constantly improved, they reinforced the standard methodology, provided more consistent and reproducible results, and improved and standardized penetration reporting. All of this made penetration testing easier for the consumer to absorb and integrate into security programs. As a result, regulations and security protocols emerged that required penetration tests and security assessments. Nmap and Nessus are excellent examples of the kind of tools that helped shape and push this evolution. And, because of their utility, they are still indispensable tools today.
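Part of what made tools like Nmap so influential is that they can be scripted into a repeatable procedure with machine-readable output, so two testers running the same scan get comparable results. A minimal sketch of that idea, assuming Nmap is installed locally and that you have permission to scan the target (scanme.nmap.org is the host the Nmap project provides for harmless test scans):

```python
import subprocess
import xml.etree.ElementTree as ET

def scan_open_ports(target):
    """Run an Nmap service scan with XML output and return (port, service) pairs."""
    # -oX - writes XML to stdout; -sV attempts service/version detection.
    completed = subprocess.run(
        ["nmap", "-sV", "-oX", "-", target],
        capture_output=True, text=True, check=True,
    )
    root = ET.fromstring(completed.stdout)
    results = []
    for port in root.iter("port"):
        state = port.find("state")
        service = port.find("service")
        if state is not None and state.get("state") == "open":
            name = service.get("name") if service is not None else "unknown"
            results.append((port.get("portid"), name))
    return results

if __name__ == "__main__":
    for portid, name in scan_open_ports("scanme.nmap.org"):
        print(portid, name)
```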

However, because Nessus also helped to automate both data collection and analysis, it lowered the bar for the skills and experience needed to conduct portions of the penetration testing methodology. This lowered the cost of penetration tests and made them much more broadly available, giving rise to so-called “boutique firms.” The problem with these “boutique firms” is that they fall into two broad categories. The first consists of specialized, highly professional firms led by experienced, technical security professionals who can translate automated tool output into root-cause analysis of vulnerabilities and security program flaws. The second consists of opportunist firms with just enough knowledge to run automated tools and cut and paste the tool output into client reports. The latter firms are sometimes called “tool-firms” and their employees “tool-boys.”

The latter flourish for two reasons. The first is that they can offer their services at rock-bottom prices. The second is that security organizations are often so ill-informed about the intricacies of the penetration testing process that they can’t make a meaningful distinction between the professional firms and the tool-boys except on the basis of cost.




Et Tu, TIC?

Posted October 7th, 2008 by

Let’s talk about TIC today, dear readers, for I smell a conspiracy theory brewing.

For those of you who missed the quick brief, TIC is short for “Trusted Internet Connections” and is an architecture model/mandate/$foo to take all of the Internet connections in the Government (srsly, nobody knows how many of them really exist, but it’s somewhere in the 2,000-10,000 range) and consolidate them into 50.  These connections will then be monitored by DHS’s Einstein program.

No, Not That Kind of TIC photo by m.prinke.

Bringing you all up to date, you’ll need to do some homework:

Now having read all of this, some things become fairly obvious:

  • If you have the following people needing connections:
    • 24 agencies, plus
    • DoD with 2 points of presence, plus
    • Intelligence agencies with a handful of Internet connections, then:
  • Basically, everybody gets one Internet connection.  This is not good; it’s a single point of DoS.
  • Agencies have been designated as Internet providers for other agencies.  Sounds like LoB in action.
  • Given the amount of traffic going through the TIC access points, it most likely is going to take a significant amount of hardware to monitor all these connections–maybe you saved 50% of the monitoring hardware by reducing the footprint, but it’s still hardware-intensive.
  • TIC is closely tied with the Networx contract.
  • In order to share Internet connections, there needs to be a network core between all of the agencies so that an agency without a TIC access point can route through multiple TIC service provider agencies.

And this is where my conspiracy theory comes in:  TIC is more about making a grand unified Government network than it is monitoring events–Einstein is just an intermediate goal.   If you think about it, this is where the Government is headed.

We were headed this way back in ought-two with a wonderful name: GovNet.  To be honest, the groundwork wasn’t there and the idea was way ahead of its time and died a horrible death, but it’s gradually starting to happen, thanks to TIC, FDCC, and Einstein. 

More fun links:

If you want to get a reaction out of the OMB folks, mention GovNet and watch them backpedal and cringe; I think the pain factor on GovNet was very high for them. So I think that we should, as a cadre of information security folks, start calling TIC what it really is: GovNet 2.0!  =)




NIST and SCAP; SCAP @ Large Part 2

Posted October 2nd, 2008 by

There is another challenge that SCAP addresses without it being officially on the SCAP program’s agenda.  With the advent of SCAP, we now have common reporting criteria by which we can judge SCAP-certified products.  If you have ever used an automated vulnerability scanner as part of a penetration test or security audit, you know that not all vulnerability scanners are created equal.  Some have much lower false-positive alert and reporting rates than others.  Likewise, false-negative alert and reporting rates are known to vary.  And, because of the various technical approaches taken by the scanners, some provide much more consistent results. The challenge has been that without common criteria to test against, it is difficult for a small or even fairly large security organization to find the resources to effectively test these products in a fair, apples-to-apples comparison.

This is where NIST has a real opportunity on its hands.  With the release of the SCAP protocol, we have the criteria by which performance comparisons can be made.  What we are lacking is a common test environment.

Benchmark photo by bzo.

Let me veer off-topic for a moment to provide some background.  In the last few years the Linux community has created various “live distributions” for various specialized requirements.  Live distributions are CD-, DVD-, or flash-media-based operating systems that boot and run directly from that media.  So, by using a Linux live distribution, you can run Linux on your home Windows-based laptop without ever installing Linux to your hard disk.  This has opened up a world of specialized possibilities for this community.  One of them is the standardized training environment.  For example, security testers have created DVL (Damn Vulnerable Linux, http://www.damnvulnerablelinux.org/), a live distribution with well-documented security vulnerabilities that is used as a training aid for teaching vulnerability assessment and mitigation. There are other similar efforts created with the same intent, such as the excellent De-ICE training targets (http://de-ice.net/hackerpedia/index.php/De-ICE.net_PenTest_Disks).

NIST could follow up on the release of the SCAP protocol by also building and releasing a common testing environment, based perhaps on live distribution technology. Such an environment, with well-documented vulnerabilities, would allow objective benchmarks to be created to rate the accuracy, reproducibility, and completeness of the results of SCAP-certified vulnerability testing and reporting products.  This would aid government agencies, businesses, and even individuals in their purchasing decisions.  It would also provide vendors with an objective and common test environment in which they can test and improve their products.  I admit this would be a significant undertaking for NIST.  However, I would suggest that such a test environment could be designed so that it could be built and released as a series of interoperable modules based on live distribution technology.  The initial release might only offer a relatively modest set of tests, but with each release building on the results of previous ones, a highly demanding and sophisticated test environment could soon be realized.  Because of the importance and utility of such a project, industry and outside security experts might want to participate in and contribute to such an endeavor.
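To make the benchmark idea concrete: once the test environment’s vulnerabilities are documented up front, scoring a scanner reduces to comparing the set of findings it reports against the set of flaws that are actually present. A sketch of that scoring in Python, where the identifiers are purely illustrative:

```python
def score_scanner(ground_truth, reported):
    """Compare scanner findings against the documented vulnerabilities in the test environment."""
    ground_truth, reported = set(ground_truth), set(reported)
    true_positives = ground_truth & reported
    false_positives = reported - ground_truth      # reported but not actually present
    false_negatives = ground_truth - reported      # present but missed by the scanner
    detection_rate = len(true_positives) / len(ground_truth) if ground_truth else 0.0
    return {
        "detection_rate": detection_rate,
        "false_positives": sorted(false_positives),
        "false_negatives": sorted(false_negatives),
    }

# Hypothetical identifiers documented for the test image vs. one scanner's report.
documented = ["CVE-2008-0001", "CVE-2008-0002", "CVE-2008-0003"]
scanner_a = ["CVE-2008-0001", "CVE-2008-0003", "CVE-2008-0099"]
print(score_scanner(documented, scanner_a))
```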

 




NIST and SCAP; Busting a cap on intruders Part 1

Posted October 1st, 2008 by

I was attending a conference at NIST (the National Institute of Standards and Technology) concerning the SCAP program (Security Content Automation Protocol, pronounced ESS-cap).  SCAP is focused on providing the Federal government with automated, common, interoperable security solutions.  Specifically, the SCAP program has developed a common set of standards for reporting security vulnerabilities for use in automated security scanners, security appliances, and reporting systems.

Well, why do we need SCAP?  If we take the godfather of all vulnerability management tools, the Nessus vulnerability scanner, as an example, we have seen that industry has produced a number of similar products.  Each has its own strengths and rich feature set.  However, none of them use the same “language” for detecting, describing, or reporting a potential vulnerability.  This means not only that these various products interoperate with one another only with some measure of difficulty, but also that trying to aggregate and manage the reports from these systems can be tedious.
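The interoperability problem is easy to see if you imagine aggregating reports from two scanners that describe the same flaw in different formats. The approach SCAP standardizes is a shared identifier, such as a CVE name, that every product emits. A toy sketch of that aggregation step, with invented scanner output formats and placeholder identifiers:

```python
from collections import defaultdict

# Hypothetical raw findings from two different scanners describing overlapping issues.
scanner_one = [
    {"id": "CVE-2008-0001", "host": "10.0.0.5", "desc": "Remote code execution in example service"},
    {"id": "CVE-2008-0002", "host": "10.0.0.7", "desc": "DNS cache poisoning"},
]
scanner_two = [
    {"cve": "CVE-2008-0001", "target": "10.0.0.5", "title": "Example service RCE"},
]

def aggregate(*feeds):
    """Merge findings from multiple scanners, keyed by (host, CVE identifier)."""
    merged = defaultdict(list)
    for feed in feeds:
        for finding in feed:
            cve = finding.get("id") or finding.get("cve")
            host = finding.get("host") or finding.get("target")
            merged[(host, cve)].append(finding)
    return merged

# With a common identifier, the overlapping finding collapses to a single entry.
for (host, cve), sources in aggregate(scanner_one, scanner_two).items():
    print(host, cve, f"reported by {len(sources)} scanner(s)")
```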

“Tim Bray at XML 2005” photo by Roland.

As a result of these efforts and the vision of the dedicated employees at NIST, industry is already scrambling to get their related products SCAP certified.  And Federal agencies are specifying in contracts that products must be SCAP certified in order to qualify for purchase.  This is real progress and great news for the taxpayer, who will get better value for their tax dollar.  But it is not a revolution — yet.  Where I see the revolution emerging is in the six-month to one-year time frame, when industry takes note of the SCAP program and we begin to see SCAP-certified and SCAP-interoperable products being ordered.  It will not be long after that that we may see the SCAP protocol used in even consumer-level products like personal firewalls.  This ability to give us all a common language will significantly reduce the cost of building and supporting vulnerability scanners and vulnerability reporting tools.  That cost reduction will free up resources to address prevention and mitigation concerns in a more meaningful manner.

For example, industry has tools that enable network and security support professionals to detect a misconfiguration in a desktop machine on their network and correct it.  But only the largest and most well-funded IT security departments have such tools.  With the advent of SCAP, these kinds of services will become much more affordable and supportable, and thus more common.  In fact, because much of this can be automated, I can even envision McAfee, Symantec, and others who are well placed in the vulnerability scanning market offering support services over the wire to smaller businesses and to consumers.  Moreover, as this technology improves and becomes commoditized, I can see ISPs offering security scanning and remediation as a service to their customers.
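Expressed in code, the check-and-correct service I have in mind is, at its core, nothing more than comparing a host’s actual settings against a published baseline and flagging the deviations. A deliberately simplified sketch, where the setting names and values are invented for illustration:

```python
# Hypothetical secure baseline and observed host configuration.
baseline = {
    "firewall_enabled": True,
    "autorun_disabled": True,
    "min_password_length": 8,
}
observed = {
    "firewall_enabled": True,
    "autorun_disabled": False,
    "min_password_length": 6,
}

def find_deviations(baseline, observed):
    """Return settings where the host does not match the baseline."""
    deviations = {}
    for setting, expected in baseline.items():
        actual = observed.get(setting)
        if actual != expected:
            deviations[setting] = {"expected": expected, "actual": actual}
    return deviations

for setting, values in find_deviations(baseline, observed).items():
    print(f"{setting}: expected {values['expected']}, found {values['actual']}")
```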



