Building Cost Savings, Predictability And Defensibility Into Your E-Discovery

Tuesday, March 25, 2014 - 13:38

The Editor interviews Rose J. Jones, Director of E-Discovery Project Management and Client Services at King & Spalding LLP.

Editor: Please tell us about your professional background.

Jones: I am currently the director of E-Discovery Project Management at King & Spalding. I have an Industrial and Systems Engineering degree from Georgia Institute of Technology and a JD from Georgia State University. I have been able to apply my engineering skills to my practice of law, which has allowed me to streamline the discovery process by identifying and removing unnecessary steps while maintaining and continuously improving the quality of the work product. I have been practicing in the discovery arena for over a decade now. I focus my practice on helping clients develop an overall discovery strategy that we can implement before litigation even arises.

Editor: How important is cost predictability in the e-discovery process to you and your clients?

Jones: Cost predictability is extremely important to both our clients and King & Spalding. As you know, the costs relating to reviewing and handling documents in litigation continue to rise. Some studies indicate that document review costs account for 70 percent of the cost of discovery. For our clients, having that cost predictability is critical for budgeting purposes and to ensure that the costs are proportional to the case. From our perspective, as a law firm that provides e-discovery services, it is important for us to be able to offer alternative fee arrangements for our clients to help achieve these goals. We can achieve this through preferred pricing from our e-discovery technology providers, as well as hourly and per unit pricing.

Editor: What are some cost-saving measures that can be taken ahead of the need for ESI collection?

Jones: Substantial cost-saving measures can be taken here. We recommend that our clients have a record retention policy – along with a method to schedule the enforcement of that policy. Obviously you want to address current challenging issues, such as the use of social media and BYOD (bring your own device) practices and how they will affect your overall strategy.

We have also found it advantageous for our clients to use contract terms when negotiating a business deal with a potential business partner to limit the scope of discovery in the event of a later suit. For those clients that are subject to serial litigation, we have also found it to be very helpful to have a national e-discovery advisor or an oversight committee to create and coordinate policies and practices from collection to production. By having a single point of contact who understands the corporation’s data, companies can avoid reinventing the wheel every time they are faced with litigation.

Editor: Who should be on the litigation team?

Jones: All the key players and stakeholders should be on the litigation team. You need in-house counsel litigators as well as the IT department on the team. If the outside counsel handling the merits of the case is different from those handling the discovery work – indeed, in some cases, we serve as e-discovery counsel while a different firm handles the merits – you need them on the team. Certainly you should include the key custodians who actually know the information relevant in particular matters.

Editor: Information management is the first stage in the EDRM. What are some best practices at the second stage, identification?

Jones: As we just discussed, IT involvement is key at this stage. IT needs to understand the significant legal ramifications of preserving the data. Also, they know how difficult it may be to access certain data, and they can help the team understand those challenges.

Pre-litigation system mapping – i.e., taking an inventory of the data systems that are likely sources of relevant information – is a very useful tool. When people talk about identifying documents, we often think of electronic mail, but there is so much more. We have to consider databases and information that may be stored in the cloud, on shared networks or housed by a third party. A system map that has already been thoroughly reviewed at the outset of a case can deliver considerable time savings, as well as significant cost savings.

Editor: During preservation and collection, search terms and keywords are determined. Who should be included in the development of these to save costs down the road? How can defensibility be ensured?

Jones: When you are developing your keywords, it is very important to involve the key players, the custodians who actually have the data that is relevant to your matter. They are the ones who can articulate the terminology they use to address a certain product or project. They will also be able to identify what types of data sources are going to be in play, and they can help you identify keywords that may be different between the various data sources. For example, a number may be used to identify a particular product in a database, while lay terminology for that same product is used in email.

A defensible keyword process is multi-step and iterative. As discussed, include key players. Also, when running your keyword searches, be sure to have someone who is experienced with using ECA (early case assessment) tools on your team. This will help ensure that you are utilizing the full range of syntax devices, including wild cards, synonyms, acronyms and misspellings. Next, make sure that your keywords address the document requests and the pleadings. Test and re-test your results. When you analyze the results of your keywords, make sure you are testing not only to understand the volume of data but also to determine the recall and precision of your search terms. Are you effectively capturing documents that are likely to be responsive? Conversely, are you identifying false positives – that is, documents that hit on the search term but are not necessarily going to be relevant? Sampling the non-hits is essential: examine the documents that are going to be left behind and make sure you are not missing any keywords that should be added to identify potentially relevant data in that set.
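The recall, precision, and non-hit sampling steps described above can be expressed concretely. Below is a minimal Python sketch using hypothetical document IDs and a hypothetical reviewed sample; it is an illustration of the metrics, not a description of any particular review platform.

```python
def evaluate_search_terms(hits, responsive):
    """Compare keyword-search hits against a human-reviewed 'ground truth' sample.

    hits       -- set of document IDs returned by the keyword search
    responsive -- set of document IDs a reviewer judged relevant
    """
    true_positives = hits & responsive
    recall = len(true_positives) / len(responsive)   # share of relevant docs the terms found
    precision = len(true_positives) / len(hits)      # share of hits that are actually relevant
    false_positives = hits - responsive              # hits to examine for over-broad terms
    missed = responsive - hits                       # "non-hits" to sample for new keywords
    return recall, precision, false_positives, missed

# Hypothetical reviewed sample:
hits = {"d1", "d2", "d3", "d4", "d5"}
responsive = {"d2", "d3", "d4", "d8"}
recall, precision, fp, missed = evaluate_search_terms(hits, responsive)
print(f"recall={recall:.2f} precision={precision:.2f}")  # recall=0.75 precision=0.60
```

Here sampling the `missed` set (d8 in this toy example) is what surfaces keywords that should be added; sampling `fp` reveals terms that are sweeping in irrelevant material.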

Editor: What tools and/or processes can assist with cost predictability during the processing/review/analysis stage? I am thinking here of advanced analytics and preferred metrics.

Jones: Advanced analytics are going to be very helpful after you have identified the documents that are going to be analyzed for production, whether that is a raw set of collected data or a keyword searched set of data. Predictive coding can be extremely helpful in cutting costs for your client. If you use a very well thought-out, well-documented process that utilizes the appropriate sampling to make it statistically valid, predictive coding can greatly reduce costs in a highly defensible manner.
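The "appropriate sampling to make it statistically valid" mentioned above is typically sized with the standard formula for estimating a proportion. A short sketch, assuming a simple random sample and the conservative worst-case proportion of 0.5:

```python
import math

def sample_size(confidence_z=1.96, margin=0.05, p=0.5):
    """Simple-random-sample size for estimating a proportion (e.g., the
    richness of a document population, or the error rate of a predictive
    coding model) at a given confidence level and margin of error.
    z=1.96 corresponds to 95% confidence; p=0.5 is the worst case."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin**2)

print(sample_size())              # 385 documents for 95% confidence, +/-5% margin
print(sample_size(margin=0.02))   # 2401 documents for a tighter +/-2% margin
```

The point for cost purposes is that the required sample is fixed by the desired confidence and margin, not by the size of the population, so validation stays affordable even on very large document sets.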

Other advanced analytics that have provided efficiencies include concept clustering, which brings together similar documents across custodians and business groups. Also very powerful, email threading (in which responses, forwards, etc. to one email are grouped together and presented to one reviewer) creates efficiencies by eliminating some duplication and by allowing the individual reviewing the thread to become a subject matter expert on that particular concept. As they review, they learn more about that set of documents and can then apply that knowledge consistently and efficiently to the entire set. We have found that to be extremely helpful for our clients.
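The email threading idea can be sketched in a few lines. The version below groups messages by normalized subject line; this is a simplified, hypothetical stand-in for production threading tools, which also follow Message-ID and In-Reply-To headers.

```python
from collections import defaultdict

def group_threads(emails):
    """Group emails into threads by stripping reply/forward prefixes from
    the subject line -- a simplified illustration of email threading."""
    threads = defaultdict(list)
    for msg in emails:
        subject = msg["subject"].lower()
        # Strip any stacked "re:"/"fw:"/"fwd:" prefixes.
        while subject.startswith(("re: ", "fw: ", "fwd: ")):
            subject = subject.split(": ", 1)[1]
        threads[subject].append(msg)
    return threads

# Hypothetical messages:
emails = [
    {"id": 1, "subject": "Project Alpha budget"},
    {"id": 2, "subject": "RE: Project Alpha budget"},
    {"id": 3, "subject": "FW: RE: Project Alpha budget"},
    {"id": 4, "subject": "Quarterly report"},
]
threads = group_threads(emails)
print(len(threads))  # 2 -- one reviewer sees messages 1-3 together
```

Routing a whole thread to one reviewer is what produces the consistency gain described above: the same person codes the original message and every reply and forward of it.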

Editor: What is the process you use for evaluating the many different vendors and tools in the e-discovery marketplace?

Jones: When you are evaluating vendors, it is essential that you fully understand what the tools can actually do. You need to separate puffery from performance. Obviously, the first step would be to have a demonstration of the review tools. If you are interested, the next step would be to interview not only the technologist who is conducting the demonstration but also the project manager. You should make sure he or she is truly knowledgeable about not only predictive coding and the tools but also e-discovery as a whole. You will also want to check references. We like to check references of those who have been very successful and enjoy working with a particular vendor, but we also like to check references of those who decided not to use a vendor, and to ask those people what the deciding factors were against using that vendor.

It is also crucial that you do an apples-to-apples comparison when looking at the costs. What one vendor calls “hosting” may be what another vendor calls “database maintenance.” They may quote a very similar charge per month, but if you do not fully understand each of the components and normalize them to fit the characteristics of your matter, the comparison will be meaningless. We have had clients who look at the price as opposed to the overall cost. By normalizing the costs across the entire life cycle of the review, you get a better picture of what your overall costs are going to be.

Editor: Do you find inconsistency in the naming of tools and processes in the e-discovery marketplace?

Jones: I would say that there is a lack of consistency to some extent. But more often than not, the inconsistency I see is with respect to assumptions on bids. For example, one vendor’s bid may give you a quote for “monthly” hosting of x dollars per gigabyte per month – but only one month of hosting is included in the bid. Meanwhile, another vendor’s bid allows for y dollars per gigabyte per month, but the hosting is for 12 months. This is exactly the kind of discrepancy you need to normalize so that your data hosting is appropriately accounted for, in terms of monthly cost as well as the time period you anticipate needing for data hosting. That is where we typically see the inconsistencies.
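The bid normalization described above is simple arithmetic once the assumptions in each quote are made explicit. A sketch, using hypothetical rates and a hypothetical 100 GB matter expected to stay hosted for 12 months:

```python
def normalized_cost(rate_per_gb_month, gigabytes, months_needed):
    """Project a per-GB-per-month hosting rate over the full anticipated
    hosting period, so bids with different included periods are comparable."""
    return rate_per_gb_month * gigabytes * months_needed

GB, MONTHS_NEEDED = 100, 12

# As quoted: Vendor A's bid includes only 1 month; Vendor B's includes 12.
quote_a = 12 * GB * 1        # headline number on the bid: $1,200
quote_b = 10 * GB * 12       # headline number on the bid: $12,000

# Normalized over the matter's actual 12-month life:
cost_a = normalized_cost(12, GB, MONTHS_NEEDED)  # $14,400
cost_b = normalized_cost(10, GB, MONTHS_NEEDED)  # $12,000
# The bid that looked ten times cheaper is actually the more expensive one.
```

The same normalization applies to every pricing component, not just hosting: project each line item over the expected life cycle of the review before comparing totals.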

Editor: How do you convince a client to spend money on the appropriate tools at the beginning of the e-discovery process to save significantly on the overall e-discovery and review spend?

Jones: We have found charts and metrics to be very effective. We have been able to gather metrics from our use of different tools – that is, conducting a linear review versus a more advanced review that utilizes advanced technology. We can demonstrate what our experience has been and, critically, how those initial upfront costs for the advanced technology can be shown across the entire review process to be cost effective. The benefits you receive, including greater efficiencies throughout the review process and lower costs associated with the review, far outweigh the initial costs of the best advanced technology.

Editor: Have you been successful in doing this?

Jones: We have. In fact, we have been very successful. Our biggest successes have been with clients who take a long-term, overall approach to e-discovery and information governance. If you look at the quoted price of a particular piece and not the overall cost, another provider may appear to be more cost effective. However, when we have gone head to head with other providers based on overall costs, we were able to demonstrate our effectiveness with higher quality and increased efficiencies, with our total overall costs lower than our competitors', time after time.

This article is courtesy of Kiersted Systems.

Please email the interviewee with questions about this interview.