The Impact of Legal Action Against CISOs
As cyber crime has grown, so, too, has the pressure on the security team, and most of all, its leader — the chief information security officer (CISO). As the head of the cybersecurity program, the CISO is ultimately responsible for the program’s success or failure. Until recently, a failure (or perceived failure) has meant potentially being fired and possibly getting one’s name dragged through the mud on cybersecurity social media forums. All that changed when Joe Sullivan, the former CISO of Uber, was charged with a federal crime after a cyber attack on the company.
Sullivan’s legal predicament rattled the community. Many security professionals started asking whether accepting a CISO role was worth the risk. Could a similar fate befall them? Are criminal charges justified when a breach occurs? Could they actually go to jail if they’re at the helm when their employer is attacked?
For this article, I spoke with colleagues Ben Rothke, Sr. Information Security Manager at Tapad, and Kevin Johnson, CEO and Security Consultant at Secure Ideas, to get their take and explore the subject from various perspectives.
Over the last two decades, the corporate world has increasingly focused on cybersecurity. As businesses have digitized many formerly analog processes and procedures, organizations have realized that they must put time, effort, and significant resources into protecting digital platforms and data.
At the forefront of this effort is the CISO. The CISO’s job is twofold: Govern an effective and accountable security program, and communicate the security program’s strategy and actions to non-security line-of-business owners and company stakeholders. Beneath these umbrella requirements are numerous nuances that impact how the security program is run: the company’s mission and culture, its technology footprint, brand and reputation, compliance obligations, industry, geography, socio-political issues, expected operational baselines, and so much more. With the attack surface expanding constantly, managing a hardened security program is a big, messy, complicated job.
The job of the CISO (or CSO) has never been an easy one. The role was created less than 30 years ago in reaction to a series of cyber incidents targeting Citicorp (now Citigroup). In 1994, security teams were essentially offshoots of the IT department — mainly composed of eager IT staff who wanted to dabble in something new and challenging — reporting to a CIO. Steve Katz became the world’s first CISO, marking the first known time a security professional held C-level responsibilities and attention.
For a long time, the role floundered, stubbornly refusing to give up its technical practitioner roots. CISOs fought for “a seat at the table,” but weren’t given the same level of respect and credibility as other C-level executives or business units — because they weren’t acting like business leaders. As time went on and the role of the CISO became codified, smart CISOs and security practitioners started to realize that trying to explain how much malware had been blocked or why it was difficult to lock down admin access wasn’t working. Being a CISO meant taking on C-level responsibility and learning about business — and actually understanding business in a way that shaped how the security program would operate.
The modern-day CISO
By the early 2010s, the security industry had experienced a sea change. While CISOs still struggled for that proverbial seat at the table, security was being taken seriously enough that it was discussed in board meetings, included in companies’ annual reports, and true security leaders were commanding considerable salaries. NIST had been advising on cybersecurity since 2003, and several cybersecurity regulations had been passed. Being a CISO was still a hard, complicated job, but there were expectations. And there were frameworks in place to help CISOs build and evolve their programs.
Still, much of the community continued to bemoan their fate of being “misunderstood” and “disrespected.” “Security is hard,” they said, while stubbornly calling humans “the weakest link” and repeating, “You can’t fix stupid” anytime a user clicked a link or opened an attachment. This is not to say security practitioners gave up and stopped doing the hard work of trying to advance security. Great technological progress was made in those years, but the level of corporate maturity wasn’t always there.
By the middle of the decade, and because cybersecurity had become so central to business operations, things had finally changed; good CISOs gained a comprehension of what was required to do the job. Sometimes that meant politicking, which is never fun for anybody. But to be effective and to effect change, it was necessary. The stakes were too high and the weight of breach implications was too heavy.
By the time of the second Uber breach (and, arguably, even at the time of the first breach two years earlier), it was well known by everyone in the security community that security teams, and especially the CISO, were required to act professionally, honestly, and with the highest levels of skill and integrity. Yet, Sullivan attempted to mitigate the damage, or maybe just the public fallout, by covering up some not-so-savory details about what had happened. He tried to turn this into a story of heroism and creativity instead of owning up to his actions.
Let’s be clear: other executives at Uber are also responsible and/or complicit in what happened. The public will never entirely know what went on behind the scenes, but there is no way Sullivan acted alone, in a vacuum.
The reaction
When the news came out, the security community’s reaction was split. Some were on Sullivan’s side. Some said he deserved to be charged. As details emerged — and it became clear that Sullivan wasn’t being prosecuted for the breach itself, but for his actions and the attempted cover-up — many CISOs still stood by his side.
The question is: Why?
The court put Sullivan on probation and levied a US$500,000 fine against him; he wasn’t sentenced to jail time, but the judge made it clear that there would be no such leniency the next time a similar situation came before him. Sullivan now runs a charity.
Then came SolarWinds and the Wells Notice. While the details are not publicly known, the Securities and Exchange Commission (SEC) sent a letter warning company executives, including the CISO, of the SEC’s intent to file civil action over the executives’ roles and actions following an attack on the company.
Once again, the security community was shaken. Numerous articles, blogs, and social media posts cite CISOs' concerns about how executives are now being held responsible at a personal level for cyber attacks and breaches. The overarching tone of these posts is fear: CISOs will now have to be “more concerned” about managing risk. CISOs could be held responsible if an attacker is able to execute a successful attack. A CISO could be fired, personally fined, or their career ruined if the security team fails to notice an important alert or doesn’t act on it quickly enough.
We know that, at least in the case of Uber, this is not what happened. It’s easy to gloss over the details, but Sullivan was not held legally responsible for the breach. He was convicted of withholding information (a.k.a. lying) from federal investigators and of trying to hide the breach as it was happening.
With SolarWinds, time will tell, but my guess is that the Wells Notice and any ensuing charges are for a cover-up, not the cyber attack itself.
Should CISOs be held liable?
Personally, I think this is inarguable: you lie, you break the law, there are implications. Does being a CISO at the time of a breach make you a criminal? Certainly not. What if you are the CISO of a completely negligent security team at the time of a breach that impacts consumers, the security of the nation, the greater economic environment, or similar? This is a trickier question to answer, but my perspective is that, if you accept the role of CISO (or any other high-ranking security position), you are accepting responsibility. A lot of it. You are committing to running the best security program you can. If something stands in the way of doing that, let the executive team know. Document it. Request audits and outside security testing. Document that, too. Work with your legal team — internal and possibly even external. Have them document that.
Is this excessive? It may feel like it, but other professions and professionals have been doing it for decades. Doctors are required to fill out so much paperwork about everything they do — and even get “permission” from insurance companies at times — that sometimes they have to do it during off hours. Often it cuts down on the number of patients they can see in a day.
Most highly regulated industries’ executives have similar experiences. And as for regulation, I won’t tackle the issue of “is regulation good or bad,” because that’s a huge topic and I have mixed feelings. But, unfortunately, regulation is necessary because humans cut corners and do bad things. All the cybersecurity regulation that’s been passed or proposed — that’s not because everyone in the security industry has done a stellar job. Is it a hard job to do, even with the best of intentions? You betcha! Do you accept that responsibility when you accept a job in security? Yes, you do. One hundred percent.
Other PoVs
Now, all I do is write about security and help my employer grow its market presence. As such, I asked two of my longtime colleagues their opinions on the matter. Ben Rothke and Kevin Johnson are both security practitioners who’ve held various levels of responsibility for the security of organizations’ digital estates.
Talking to Rothke, he started with a few good questions:
“Are there CSOs who are criminally negligent? Certainly. Are the majority of the CSOs who have been charged, in fact, guilty? Likely not. And that’s precisely why many [in the field] say that ‘CSO’ stands for ‘chief scapegoat officer.’”
Rothke then cited what’s known as “Spaf’s Law,” a first principle written and promoted by Professor Gene Spafford of Purdue University. The rule says, “If you have responsibility for security but have no authority to set rules or punish violators, your own role in the organization is to take the blame when something big goes wrong.”
Rothke continued, “Many of the CSOs who need a defense attorney are in that predicament; they were put in that position due to their own naiveté, or due to nefarious chief executives.
“That is why anyone interviewing for a CSO position needs to know if they will be working for a mature company that takes information security seriously and will empower them to enforce security and compliance, or if the prospective employer simply wants someone to be the security punching bag.”
Johnson took a slightly different point of view during our conversation. He noted that SolarWinds appeared to be negligent in their breach notification. They tried to downplay the facts and impacts. In other words, they tried to cover up how severe things were until they couldn’t anymore.
“The truth is,” Johnson said, “We don’t know if the SEC even knows how severe the breach is. SolarWinds held back a lot of information, making their actions seem suspect. We do know that the attackers had access to source code — if you ask me, that’s a big deal for a company like SolarWinds and the situation they and their customers are in.”
When it comes to the Sullivan case and conviction, Johnson is even more adamant:
“If a CISO is being prosecuted for a lie, for a cover up, there is no question that this is a good trend. There are too many good people in security to let the bad ones get away with criminal behavior.
“It’s been far too long that security teams have not been held to higher requirements and standards. Organizations, and the person leading that business unit, have to take responsibility, have to do better. I see, day in and day out, security teams that don’t follow basic cybersecurity requirements, that aren’t even covering the basics, like understanding their networks and testing them regularly. I’ve seen major companies that don’t know what’s plugged into their networks.
“How can they protect their networks if they don’t even understand their entire tech estate?
“We need personal responsibility in security. We should have always had it. People are treating accountability as a new level of responsibility but it’s not. It’s always been there. It’s just that now CISOs are being held liable for their actions.”
Your take
If ten people read this (long) article, I suspect there will be ten different points of view. And, of course, this (again, long) article isn’t a treatise; there are many more details I could include. It’s long enough, though. If you’ve gotten to the end and have thoughts, please reach out and leave a comment. I’d love to get others’ takes on the subject and even explore other angles. It’s how we all grow.