Fred Cohen & Associates
Specializing in Information Protection Since 1977

Legal Update

This Legal Update contains select extracts from our "Legal Update" newsletters...


2010-03 - At the RSA conference earlier this week, a Federal magistrate from New York headed the e-discovery mock trial. The subjects were the Rule 26(f) meet and confer (simulated with the magistrate acting as a Greek chorus) and the Rule 16(b) conference, handled more formally in front of the judge. The most interesting thing is the extent to which some districts are now insisting that experts attend the meet and confers as well as the hearing, and the extent to which the judge was simply going to rule against you if the expert on the other side was present and said something and you didn't have an expert there to state your side of the matter.

Also interesting was the particular argument for non-release of content based on HIPAA in that situation, since the requested information was the name, address, phone number, and date and time of a solicitation call made with respect to a drug. As an audience questioner, I inquired as to why the experts didn't point out that this is not in fact electronic Protected Health Information absent its identification as related to the drug in question. After all, a list of names, addresses, phone numbers, and dates and times is hardly, on its own, health-related at all. The lawyers were a bit taken aback that they hadn't noticed that, and their experts both disagreed with me (even though they were supposed to be on opposing sides in the mock trial, and the lawyers were agreeing). The judge ruled that the mock trial should return to order and moved on to other issues... and I got to say "with all due respect, your honor" without the risk of punishment.

Meanwhile, for the first time in memory, the NIJ has now requested proposals for research in the science of digital forensics (and other forensic disciplines) with enough money behind it to make it worth applying for.


2010-02 - The 9th Circuit recently ruled (sort of) that in searching a computer, the plain view doctrine does not apply in the same manner as it does in physical searches. This means, among other things, that - at least in the 9th Circuit - law enforcement is being forced to do a lot of things differently, and the digital forensic evidence available to law enforcement is a lot less complete than it was only a few months ago. It appears that, in issuing warrants, magistrates in the Bay Area are requiring police to explicitly forgo any further warrants based on anything they may happen to find while searching for something within a warrant, and some agencies are starting to propose a second group that does initial searches to cull out material relevant to the case and provides only that material to the people authorized to do the searches related to the cases at hand. This doubles the workload and further slows law enforcement in finding criminals. From a legal standpoint it also potentially opens up all sorts of historical cases, as well as creating problems for current and future ones. But this is not why I brought it up.

The problem for digital forensic evidence (DFE) examinations is even more complicated. As it turns out, since DFE is so easily created, altered, and destroyed, and since it is trace but not transfer evidence, the only way to reasonably rely on DFE or to question it as a document, in many cases, is to use the redundancy in DFE provided by the way most computer systems operate. This redundancy depends on, among other things, seemingly unrelated records. For example, an email without the context of surrounding emails cannot be checked to determine whether or to what extent its binary representation is consistent with the mechanisms in use within the system. It cannot be verified against the software in use, system configurations, or logs. And it cannot be tested against other conditions in the environment, for timing characteristics, ordering, or any of the other things that allow DFE examiners to validate the content. This opens up all such evidence to potential challenges and makes it far harder to defeat those challenges or to establish that the traces presented are valid or should be admitted as evidence.
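
To make one of the redundancy checks described above concrete, the following is a minimal sketch (in Python) of comparing an email's Date header against delivery records from a server log. The log-entry fields (message_id, delivered_at) and the five-minute tolerance are assumptions for illustration only; this is not a description of any particular system, tool, or of the methods used in any actual case.

    # Minimal sketch of one redundancy check: does an email's claimed
    # timestamp fall within the window implied by a corroborating mail log?
    # The log-entry fields and the tolerance are hypothetical illustrations.
    from email import message_from_string
    from email.utils import parsedate_to_datetime
    from datetime import timedelta

    def check_email_against_log(raw_email, log_entries, tolerance=timedelta(minutes=5)):
        """Return a list of inconsistencies between an email's Date header and
        delivery times recorded in a (hypothetical) server log, where each log
        entry is a dict with 'message_id' and a timezone-aware 'delivered_at'."""
        msg = message_from_string(raw_email)
        claimed = parsedate_to_datetime(msg["Date"])
        msg_id = msg["Message-ID"]
        findings = []
        matches = [e for e in log_entries if e["message_id"] == msg_id]
        if not matches:
            findings.append("no corroborating log entry for %s" % msg_id)
        for entry in matches:
            delta = abs(entry["delivered_at"] - claimed)
            if delta > tolerance:
                findings.append("Date header differs from the log by %s" % delta)
        return findings

In practice an examiner would cross-check many such records at once (headers, server logs, client databases, file system timestamps), but even this toy comparison shows why removing the surrounding records removes the ability to corroborate or challenge a single trace.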

This ruling is currently being considered for review by the whole set of judges in the 9th circuit - something that has apparently never been done before. Of course I don't know how it will all come out, but from a scientific point of view, the issue of getting reliable evidence for criminal cases, and potentially the impacts on civil matters, just became far more complicated.


2009-12 - I saw an interesting article that I thought I would share, in pertinent parts. It comes from: http://www.guidancesoftware.com/Blogs-digital-forensics.aspx?blogid=1623 - which I briefly summarize here:

Those of you who have been getting these emails for some time are aware of the recent reports by the National Research Council on the need for information on tools, methods, calibration, error information, demonstration of peer-reviewed scientific processes, and related details in expert reports. In combination with the FRE and Rules 16 and 26, this starts to provide increasing clarity around the nature of experts, their reports, and challenges to those reports and experts.

In my recent experience, I have seen expert report after expert report providing only summary information: no stated basis, no specifics about the data on which the summary is based, no information on errors, reliability, or calibration, no scientific basis for the methodology (or no methodology identified at all), no way to verify that the methodology was properly applied, and no peer-reviewed literature cited. While the digital forensic evidence (DFE) arena is not a settled area yet, it seems both logical and reasonable that these standards should and will be applied to DFE. But this can only happen if the lawyers who spot these problems use them to limit the supposed experts from testifying or from using their unspecified methodologies. Whether it goes to admissibility or weight is, of course, a key issue, and also one that is somewhat unsettled and depends on how the information is presented to the court.

But to finish the picture, we need to admit that there is little or no funding for scientific research in digital forensics, few venues for peer-reviewed publication, and no strong, long-held scientific methodology other than the methodologies of related fields, like mathematics, digital systems and computer engineering, statistics, and some of the more mathematical areas of computer science. While many have called for increased focus in this area, there is no national effort to address these issues other than the efforts by law enforcement and efforts sponsored by the National Institute of Justice, which is clearly biased toward prosecutions. We still see "secret science" brought to court, and this is not the sort of methodology that can stand up to the light of day.

What can you do about it? You can force the issue when it is relevant, and make certain that your experts and their reports meet the highest standards of scientific rigor. Of course, doing this within the constraints of costs and schedule is the trick...


2009-11 - Recently, a group of government (law enforcement) folks calling themselves the "Scientific Working Group on Digital Evidence" (http://www.swgde.org/) introduced a document titled "Position on the National Research Council Report to Congress - Strengthening Forensic Science in the United States: A Path Forward". This is noteworthy for several reasons, including without limit:

or, as I like to say, oriented toward the prosecution...

They appear to be reasonably supportive of the NAS report, including seeking to require certification for DFE examiners (which seems to me to be problematic from the standpoint of expert testimony).

Even though I generally agree with the spirit of their documents as a whole, I find it somewhat troubling that the government appears to be sponsoring (at least passively - perhaps more actively) a group that is making what amount to political statements under the guise of technical statements, and that the leaders of a "scientific" group are not in fact scientists in the relevant field (again, according to my source only).


2009-10 - The ABA has recently released an article titled "Ten ways lawyers kill their own experts"... I summarize (with commentary):


2009-07 - In a U.S. Supreme Court ruling handed down on June 24 in Melendez-Diaz v. Massachusetts, the Court held that "certificates" of forensic findings submitted as prima facie evidence were admitted in error. In a controversial 5-to-4 vote that reversed the judgment of the Massachusetts Appeals Court, the Supreme Court held that admission of the notarized forensic analysts' reports violated the defendant's Sixth Amendment right to confront the witnesses against him under the Amendment's Confrontation Clause. The Court determined that, in the absence of live testimony by forensic analysts who could then be cross-examined by the defendant's counsel, such evidence was precluded.

While the general commentary in the media seems to be that this will create lots of problems for law enforcement and that the analysts will simply testify to the same things they wrote down, I find it a positive result: the right to face an accuser and those presenting evidence against you is certainly an important one, and I think the ruling will increase the rigor and diligence required to admit DFE, which I strongly support.


2009-04 - The "Digital Forensics Certification Board" has created a certification for digital forensics. (http://www.ncfs.org/dfcb/index.html) This particular certification program was developed with National Institute of Justice (NIJ) funding and it will eventually be applying for recognition by the Forensic Specialties Accreditation Board (FSAB), which is currently recognized by the American Academy of Forensic Sciences. We have decided to participate in the development of this certification including working on the testing committee to help develop a testing methodology for certifications. As a founding member (the founders will not themselves be approved for a few months, only individuals with substantial experience are permitted to enter the founding process, and this includes many of the individuals who work at government laboratories), we are highly supportive of the process and notion of creating a national level certification, even though we are always dubious of such certifications because they are not sufficient to demonstrate real expertise. Nevertheless, this one seems to be pretty sensible, and is being identified as a sort of "drivers license" approach.


2009-03 - The report on forensics just came out from the National Academy of Sciences and it is blistering indeed. While it barely mentions digital forensics, it is clear that certain things are missing from the vast majority of forensic reports, including information on reliability, the tools used, and the methods applied. While I have tried to make certain I include these things in my reports, there are still no standards for reporting, and there is strong resistance in the forensic community, particularly among practitioners.

I should also note that I had a long call today with the head of the certification group that certifies digital forensics labs and examiners, and he indicated that the sort of work that I do does not have any certification process and is not likely to in the foreseeable future. In fact, he indicated that he has lots of problems with examiners who get wrong answers, failed write-blocking hardware, software that doesn't do what it is supposed to, reporting that fails to state things in the proper terms, and so forth.

A critical issue in digital forensics is that there is no theoretical basis for the work in most cases. As a result, unlike physical evidence, which rests on theories of trace evidence, transfer, and the like, most practitioners cannot explain why things are what they are or the basis for almost any meaningful conclusion that they offer.

I have now increased my attention and focus on an upcoming book and several papers I am writing that are intended to put forth such a theory. They include examples that are directly relevant to the techniques I am increasingly using to work with digital forensic evidence, and while the Frye standard is no longer primary in most jurisdictions, publication of methods in peer-reviewed articles is still one of the criteria for accepting a method. I am also now working rather feverishly on a book that I hope to have partially completed in a few weeks - in time to use its contents for patents I am hoping to apply for regarding many of these techniques. One of the key elements that runs through these efforts is that I am trying to define a theoretical basis for digital forensics that identifies the basic properties, such as traces, definitions of consistencies and inconsistencies within traces and between traces and events, and other related issues.

I hope to ask those of you who are interested, and not working with me on any current cases that may be impacted, to review some of this material before it goes to final publication. Please let me know if you are interested and I will start to send you items to look over and comment on. I am also, of course, soliciting reviews from well-known digital forensic scientists around the world to try to make certain I don't say anything foolish and to help get long-term buy-in to the theoretical approach from a wider audience in the community.


2009-02 - This update is the result of recent publicity surrounding the automated marking of documents by laser printers, independent of the content sent to those printers. I will call these markings "tagents". The Electronic Frontier Foundation (EFF) has published results of analysis of printouts from different laser printers indicating its belief that those devices encode the serial number of the device and, in many cases, the time of printing on each document they print. This is done by printing individual yellow dots in patterns on each page printed.

As the results in this area accumulate, it becomes increasingly apparent that such tagents are present, and we have performed experiments to confirm their existence on select outputs from select devices. We will continue doing experiments on these tagents over time to confirm further results in this area, and we believe that we are now able to reliably detect and decode information in these tagents from the printed sheets of several devices. However, to date we are unaware of any case in which a legal proceeding has demanded or received reliable data on these tagents from the manufacturers of these printers and, as such, the use of these results in court may remain dubious until such time as a witness from a supplier can be put on the stand to testify definitively in this regard.
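
As a rough illustration of the kind of detection involved, the sketch below (in Python, using the Pillow imaging library) flags pixels in a high-resolution scan whose color is consistent with faint yellow dots. The color thresholds and file name are assumptions for illustration; real scans require tuning, and decoding the dot pattern itself is manufacturer-specific and is not shown.

    # Minimal sketch of isolating candidate tracking dots in a scanned page.
    # Color thresholds and the file name are illustrative assumptions; real
    # scans need tuning, and decoding the dot grid (manufacturer-specific)
    # is not shown here.
    from PIL import Image

    def find_yellow_dots(path, r_min=200, g_min=200, b_max=180):
        """Return (x, y) pixel coordinates that look like faint yellow dots
        in a high-resolution RGB scan of a printed page."""
        img = Image.open(path).convert("RGB")
        width, height = img.size
        pixels = img.load()
        dots = []
        for y in range(height):
            for x in range(width):
                r, g, b = pixels[x, y]
                # Yellow: strong red and green, comparatively weak blue.
                if r >= r_min and g >= g_min and b <= b_max:
                    dots.append((x, y))
        return dots

    # Usage (hypothetical file name):
    # print(len(find_yellow_dots("scanned_page_600dpi.png")), "candidate dot pixels")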

The uses of these tagents in legal cases include, without limit, (1) linking a printed page (original writing only) to a printing device type, serial number, and time and date (setting); (2) challenging or establishing the authenticity of original evidence, or challenging those who make claims in that regard, based on the presence or absence of specific tagents; (3) identifying intentional attempts to circumvent the utility of such tagents; and (4) examining scanned images of printed documents to detect the presence and content, or the absence, of these tagents and thus reveal information about the scanning device or methods used and the origin of the original documents.

There are many related issues associated with printing devices that have been used in the past, including such things as tool marks, flaws in physical devices, fonts, alignment deviations, colors and inks used, evenness of printing across the page, and so forth, all of which go to the issue of questioned documents with a digital nexus. But this particular technology represents a very simple and direct method of finding information about original evidence that may be of use in many cases and is relatively inexpensive to apply.


2009-01 - This update is the result of a recent publication, a paper delivered earlier this week at the 42nd annual HICSS conference, titled "An evaluation of agreement and conflict among computer forensics experts" and authored by Gregory H. Carlton and Reginald Worthley. It is, as far as I am aware, the first study of how different DFE experts perform the collection of DFE from media and systems, and the results, based on responses from more than 100 such practitioners from the HTCIA, identify "widespread conflict among professionals in the field of computer forensics." Indeed, this paper shows that there are some aspects of forensic evidence collection over which the sampled expert community strongly disagrees, with some saying you MUST and others saying you MUST NOT do the same thing in order to properly do the job.

The legal implications of this paper are, in my view, stunning for those who are interested in challenging expert testimony among digital forensic experts, and I have asked the researchers to work with me in developing further studies in this field. I will be trying to get the full set of original data soon and hope, over time, to form sets of specific questions and a more detailed methodology that will help to clarify the expertise and the implications of different responses to those questions in terms of meeting evidentiary guidelines.
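
As an illustration of how such conflict might be quantified, the sketch below (in Python) computes a simple disagreement score per practice item: the fraction of respondents who differ from the most common answer. The survey items and responses shown are hypothetical and are not the Carlton and Worthley data.

    # Minimal sketch of quantifying disagreement on collection practices.
    # The items and responses below are hypothetical, not the study's data.
    from collections import Counter

    def disagreement(responses):
        """Fraction of respondents whose answer differs from the modal answer
        for one practice item (answers such as 'MUST', 'MUST NOT', 'MAY')."""
        counts = Counter(responses)
        modal_count = counts.most_common(1)[0][1]
        return 1.0 - modal_count / len(responses)

    survey = {
        "remove the drive before imaging": ["MUST", "MUST NOT", "MUST", "MUST NOT", "MAY"],
        "photograph the running screen":   ["MUST", "MUST", "MUST", "MAY", "MUST"],
    }
    for item, answers in survey.items():
        print("%-32s disagreement = %.2f" % (item, disagreement(answers)))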