WHITEPAPER: Making an eDiscovery Molehill out of a Data Mountain

Nov 18

“If it really costs millions to do that, then you’re going to drive out of the litigation system a lot of people who ought to be there,” U.S. Supreme Court Justice Stephen Breyer said during a Georgetown law school panel on how electronic evidence is transforming the justice system.

A criminal trial currently under way in a Dallas, Texas, federal courtroom has illuminated one of the key issues civil litigators have been struggling with for years: the high cost of e-discovery. The defense team is scrambling to find the resources to review more than 400 million pages of documents that the government recently dumped on it.

Since its inception, e-discovery has been pegged as a major line item when budgeting complex litigation matters. The complexity of electronically stored information does not always allow for quick and easy answers, but when the work is done with an eye toward the end goal, the result can be an efficient and affordable solution.

In the Dallas federal prosecution, the U.S. Attorney’s Office for the Northern District of Texas collected over 200 terabytes (TB) of evidence during the course of its investigation. It then dropped 8 TB of data on the defense, the rough equivalent of 400 million pages of documents at a commonly cited estimate of about 50,000 pages per gigabyte, just weeks before the trial date. While the U.S. Attorney’s Office had the resources to cull out the irrelevant data and search for case-altering documents, the defense did not. With 8 TB of “evidence” dumped in its lap, the defense team had to find a way to review hundreds of millions of pages of documents in time for trial. The defense looked to the court for help.

The court appointed a Special Master, who received bid proposals that ranged from $110,000 to $4 million! Roughly a $3.9 million disparity. So how can two bidders look at the same data set and come up with bids that are $3.9 million apart?

What it likely means is that the high bidders proposed putting all or most of the 8 TB on the processing conveyor belt, which would produce a review dataset so massive it would take months to get through. On top of the dramatic increase in review costs, production and storage costs would also climb steeply. These are the same soaring costs Justice Breyer fears will lock the doors to the courthouse. The smarter solution requires some experience and the right tools.

Experience tells us that only about twenty percent of the data is potentially relevant, non-duplicative information, and less than ten percent of the overall dataset is actually relevant to the case. In the Dallas matter, that would mean culling the 8 TB production down to something on the order of 1.6 TB of potentially relevant material, and well under a terabyte that truly matters. So how do you get to that relevant core of data? By applying forensic tools before ever putting a single byte on the conveyor belt. These analytical tools can build a smarter dataset before anything reaches the review application. Culling by date range, search terms, and file type builds a targeted review set that makes the downstream review far more efficient. The savings then appear as shorter review time and deep cuts in production and storage costs.
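To make the idea concrete, here is a minimal sketch, in Python, of what a first-pass cull might look like: filtering a collection by file type, date range, and search terms, and dropping exact duplicates before anything is loaded into a review platform. The directory, date window, extensions, and search terms below are hypothetical placeholders, not details from the Dallas matter, and a production-grade tool would go much further (text extraction from containers, deNISTing, near-duplicate detection, email threading).

```python
# Illustrative sketch only: a first-pass cull using the Python standard library.
# All paths, dates, extensions, and terms are hypothetical placeholders.
import hashlib
import os
from datetime import datetime, timezone

SOURCE_DIR = "/evidence/collected"                    # hypothetical collection
DATE_FROM = datetime(2018, 1, 1, tzinfo=timezone.utc)  # agreed date window
DATE_TO = datetime(2020, 12, 31, tzinfo=timezone.utc)
KEEP_EXTENSIONS = {".doc", ".docx", ".xls", ".xlsx", ".pdf", ".msg", ".txt"}
SEARCH_TERMS = [b"wire transfer", b"invoice"]          # hypothetical terms

def in_date_range(path):
    """Keep only files last modified inside the agreed date window."""
    mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
    return DATE_FROM <= mtime <= DATE_TO

def cull(source_dir):
    """Return unique, in-scope files worth promoting to attorney review."""
    seen_hashes = set()
    review_set = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.path.splitext(name)[1].lower() not in KEEP_EXTENSIONS:
                continue                               # cull by file type
            if not in_date_range(path):
                continue                               # cull by date range
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue                               # unreadable file; skip
            if not any(term in data.lower() for term in SEARCH_TERMS):
                continue                               # cull by search terms
            digest = hashlib.sha256(data).hexdigest()
            if digest in seen_hashes:
                continue                               # cull exact duplicates
            seen_hashes.add(digest)
            review_set.append(path)
    return review_set

if __name__ == "__main__":
    kept = cull(SOURCE_DIR)
    print(f"{len(kept)} files promoted to the review set")
```

Even a crude pass like this illustrates the point: decisions made before processing, not after, are what keep the review set, and the bill, manageable.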

Typically, litigators use forensic tools as an investigative mechanism: to uncover metadata, recover a deleted document, or pin down when a file was last modified. They are tools for finding that one hot document. But forensic tools can serve a greater purpose. In the right hands, they can help litigators build smarter cases. And smarter cases mean better solutions for the client. Solutions that can make a $110,000 bid every bit as smart as a $4 million bid. Solutions that can keep the courtroom doors unlocked.

David S. Weber is General Counsel for Digital Discovery and serves as a computer forensics consultant and eDiscovery expert to corporations and law firms. Please call 888.774.1506 to speak with him directly about your organization's computer forensics and eDiscovery needs.
