Editor: Rich, please describe your practice.
Green: My practice is dedicated full-time to technology and outsourcing matters with an emphasis on information technology, but the group I co-lead with my partner Ed Hansen in the New York office is technology agnostic – everything from alternative energy technology and equipment to manufacturing and construction industry technology. IT governance was always a key part of the practice (things like vendor risk assessment and management, contracting processes and discipline, and so on), and out of that, over the last decade or so – after the HIPAA privacy rules took effect in 2003 and after some changes the FFIEC made following the Y2K situation – we evolved into data security, information governance and privacy. Today, with big data and the other challenges companies face, client matters involving cybersecurity and privacy are getting a lot of our daily attention.
Editor: The new era of cybersecurity has opened up a new dimension to the practice of law. In which of the areas of your practice, such as retail or healthcare, have you found it has the greatest impact?
Green: First, cybersecurity is really not new for us. The names have changed over the years, but we’ve been at it for more than a decade. As I indicated, it was an imperceptible evolution from the IT governance work we were doing in the late ’90s to where we are today. In a lot of ways we evolved right alongside our clients on these matters, and that’s been invaluable in the work we do. From a vertical market perspective, there’s no question that healthcare and financial services are the two core areas that need the most attention. Companies in those industries hold so much of our data as consumers, as patients and as financial account holders. Next in importance are the critical infrastructure industries – public utilities, telecommunications companies, and others affected by the NIST Cybersecurity Framework. Retail probably looks like a new area to those outside the practice because of the Target situation, but retailers have really been in the game for a while because they use so much consumer data in customer acquisition and retention. The Direct Marketing Association, the Mobile Marketing Association and other groups have been around a long time trying to provide rules for self-regulation to avoid FTC and other governmental regulation. I think what we’re seeing change now in retail post-Target is that companies are starting to understand that their robust attention to privacy issues needs to come out of the marketing department and into the retail store as a technology matter. UPS just reported another in-store breach this week. So what I suspect you’ll see over the next two to five years in retail is a twofold change. First, there will be an accelerated push to get the vastly more secure embedded-chip credit cards into the hands of U.S. consumers, and second, there will be a rise in the kinds of intrusion detection and prevention technologies that the financial services industry has been using for a while.
Editor: How does a company governed by multiple regulators comply with privacy demands and protection against cyber attacks?
Green: I think the answer is actually pretty simple, though not easy. The first thing to focus on is the basics. Whether it’s the Heartbleed vulnerability or the Target breach, we see so many companies chasing the news rather than executing on a defined security strategy on a daily basis. If you take some basic steps – use only high-quality data centers to store data, keep only the data you actually need for as long as you need it, and encrypt as much data at rest and in transit as you can – then you’re about 75 percent of the way toward not just complying with all the various statutory regimes we have in the United States, but also mitigating your real-world risks. By high-quality data centers, we mean those that not only hold SSAE 16 and AT 101 Type 2 attestations and ISO 2700x certifications at a level at which their actual operations are audited, but also don’t make you negotiate for weeks to get a contractual commitment that they’ll continue to meet those standards and disclose audit results. Taking the next step, it’s important that your outsourced vendors follow these same procedures, which builds in a further safeguard. If you outsource some payment card processing to a large company, you should contractually require them to store your data inside a facility that meets these same highly regarded standards. If you have multiple regulators and are doing the basic blocking and tackling I just described, you’re really covering your bases on all regulatory regimes in a material way.
Editor: What are the first steps you take when helping a company set up a compliance system that will offer protection from cyber invasion?
Green: In a lot of ways it’s really the same answer. You would be surprised at how many very sophisticated multinational companies are not covering the basic steps. So the first thing we’ll do is ask to see the basic policies that are already in place. Sometimes there aren’t any, but what we most often find is that they exist as a hodgepodge of loosely enforced rules, some of which sit in the human resources department, others in IT and still others with legal and sourcing – but no one is riding herd over the whole thing and executing on a security strategy in a coherent way. And that’s a big and very common problem. From there, we’ll start looking at your industry-specific needs. For instance, if you are SEC- or FINRA-regulated as a broker-dealer, investment manager or investment advisor, we recommend that you investigate an intrusion detection and prevention system – a sophisticated and somewhat expensive piece of software that monitors what is going on in all parts of your network every day. If you’re a healthcare company, you’re probably looking at even more encryption than is needed in other industries. For instance, are you encrypting not just your backup tapes and your main database, but even the laptops used by administrative staff? Again, you start with the basics, and once our team is satisfied that those are being complied with, we start looking at the industry-specific risks that need to be covered.
Editor: Do you feel that companies are more vulnerable when making use of the cloud in storing privileged information?
Green: The answer is unequivocally yes. The cloud is probably one of the highest-risk repositories there is. However, this is not because the cloud per se is riskier – the problem is that no one really knows what the cloud is. If you talk to six CIOs and ask them what they think “the cloud” means, you’re going to get six different answers. If you’re familiar with the software industry, you know that Larry Ellison, the CEO of Oracle Corporation and obviously someone who knows a thing or two about IT, had a very famous rant a couple of years ago in which he said the cloud is essentially a marketing term and no one knows what it means. He’s absolutely right. People talk of cloud and hosting and this or that “as-a-service.” Again, it comes down to the basics. At the end of the day, whether you call it cloud or something else, what we’re really talking about is remotely storing and accessing your data, and possibly some of your software applications, at an offsite center. And as I’ve said, when that’s the case, you need to do your basic homework. If you do that, and you know things as simple as the physical address where your data sits, the quality of that facility, and whether that quality matches your compliance obligations and the nature of the risk, then the cloud is no riskier than any other strategy you might pursue. If you don’t do that homework, you’re looking for trouble.
Editor: It has been said that to protect against cyber damage, corporations should cultivate a culture of risk awareness. Assuming you agree with this proposition, what do you think corporations need to do to achieve such a culture?
Green: I absolutely agree. It’s important to point out that risk awareness isn’t good just for its own sake, but because the entire regulatory and compliance regime in the United States follows a risk-based approach. Whether it’s critical infrastructure under NIST, financial services under the SEC, or healthcare under HHS, all of them require you to take a look at your risks of unauthorized use and disclosure of data and address them. Risk awareness fits very nicely into that risk-based regulatory regime and gets you a long way toward compliance and safety. Regardless of your industry or regulatory regime, it is vital to identify and assess your risks, write down what you think you need to do to mitigate those risks, and then actually execute what you’ve written down. I think an important part of a risk-aware culture – one that should have benefits beyond just cybersecurity – is to close the gap between the IT executives, the CFO, the CEO and the operational folks, and really do a better job of understanding the cost-benefit of taking some of the basic steps. The cost of encrypting all of your data might seem very high sitting there as a line item on a budget, but when you then look at the cost of what Target suffered, it’s small money. The IT execs need the freedom to have those kinds of discussions without worrying about the zero-sum game of “well, my budget is $X, so if I propose a defensive measure like encryption, I’ll lose the ability to buy something else we really need.” That kind of corporate culture is a huge problem from a data security perspective.
Editor: And a company’s encryption of data is done by its IT department?
Green: Yes, it’s always going to be implemented by IT. The real questions are where the policy of encrypting emanates from and whether that part of the company has the authority and budget to make it happen. In highly regulated sectors, like registered investment advisors, where the role of compliance officer is mandated by regulation, it can often come from that respected office rather than from IT, but in most other industries you’re going to need to bridge the gap between the IT department, the senior information technology executive and the C-suite to make sure that everyone understands that all the preventative costs, not just encryption, are often minor relative to the possible reputational and real-dollar harm that can be done. I want to make an important point here: encryption is not the solution to all your problems. It definitely has some weaknesses in situations like application-level attacks, but typically the overall cost-benefits are tremendous, particularly when you look at various regulations. Under HIPAA, for example, if there’s an incident where some equipment or storage media is lost or stolen, you get a safe harbor as long as the data was encrypted.
Editor: In your opinion, how should a corporate victim of a cyber breach event best address any ensuing business disruption, the response cost, any negative publicity, reputation damage, the threat of ensuing litigation and potential liability?
Green: I think the same answer applies to all of them. The first step is to make sure you understand exactly what has happened. Again, it’s something that sounds simple in the abstract, but when clients are in that intense post-incident environment, it’s sometimes hard to get people to slow down and think clearly. For instance, do you have an actual breach, or do you just have an incident on your network that didn’t result in a compromise of data? Taking your time and engaging the right experts to do the forensics to find out exactly what took place, before you go rushing to give notice to regulators or to customers, is most important. For instance, we’ve had a number of incidents at clients over the years similar to the recent Community Health Systems breach – just because you’re a healthcare industry company doesn’t mean healthcare data was compromised in every incident. It may, in fact, be simply personally identifiable information, which doesn’t carry all the special rules that attach to electronic protected health information. So taking those early steps before you even get to the disclosure analysis is really the key. And that response and disclosure analysis is state by state, regulator by regulator.
Editor: Is insurance available to protect against the direct and indirect costs of a cyber intrusion? Does D&O insurance protect against cyber liability?
Green: Insurance is available. In fact, my group has formed a task force with our firm’s insurance coverage practice. We help clients look at insurance both preventatively, before an incident, and as part of their lessons learned after an incident. Clients have to be cautious with these policies since they are quite new, and only a couple of carriers operate in this market in a robust manner. It’s difficult to talk about specifics because some of the offerings are proprietary to the carriers we’ve worked with, but suffice it to say not all policies are created equal. For the same relative cost, some will go so far as to offer technical components that help you with your own compliance and incident prevention. So we urge expert IT and insurance/risk-manager review of those policies. For instance, whether you or the carrier gets to determine whether a breach occurred is often a contentious issue – but only if you’ve spotted it first! Many companies just sign up for the policies and realize after the fact that they’ve got an “incident” that’s going to cost them money, but nothing rising to the level of an insured “breach” under their policy. As to D&O policies specifically, it’s very difficult to generalize. There are so many riders and exclusions and all the other things that go into those policies. People think IT is bad with jargon, but the insurance industry really does speak its own language. So personally I rely on the insurance coverage partners in our group.
Editor: In planning a response to a cyber breach (assuming one occurs), do you encourage your corporate clients to include disclosure of the effects and potential effects of the breach to employees? To customers and suppliers? To investors? To the press?
Green: Disclosure to consumers and authorities is actually mandated by state and federal law. Depending on which law you’re dealing with, you’ll have obligations to make disclosure to the affected data subjects, to employees and to regulators. You may even have obligations under contracts to tell your corporate customer base or supply chain. You also may have an obligation to call the state police, as is the case in New Jersey, or to call the state attorney general, as is the case in Massachusetts. The mandate to make disclosure is dictated by the statute of the state in which the breach occurs – or federally for HIPAA, SEC, FINRA and the like. You don’t have a lot of leeway. What you do have leeway on is taking the steps to understand the universe of data that might have been compromised, understanding the nature of the incident and whether it really resulted in unauthorized access, and deciding how you communicate with the regulator. A lot of companies will have incident response policies that say “notify authorities as required by law,” and that can set up a lot of internal debate, particularly about the timing of disclosure, to the company’s detriment. As these statutes have matured, and as regulators, particularly in the states, have matured in their reactions to disclosure, we find that a disclosure light on facts made too soon after an incident will yield a very different, less favorable result than one made a reasonable amount of time later, when you’ve really got a handle on what happened and how you might prevent it in the future. As with most of what we’ve discussed, it’s another one of those things that sounds simple when you say it, but can be difficult to get companies to implement in practice.