This story was largely buried by headlines about the coronavirus.
However, it serves as a warning to all organizations about the insider threat risk they should be watching for.
Google insider threat case a lesson for any organization
Anthony Levandowski is a former Google executive who worked on the Waymo self-driving program. On the way out the door, he helped himself to a whole lot of proprietary information.
He allegedly used that data to create a self-driving truck startup called Otto, which was quickly purchased by Uber.
Last week, he told a judge he wants to plead guilty.
CBS5 in the Bay Area has the best synopsis of what kind of insider threat this once-trusted employee became.
...while Levandowski was considering leaving Google, and prior to his departure in 2016, he obtained and stored thousands of confidential files with the intent to use them for his personal benefit after his departure from the company.
Specifically, on December 11, 2015, Levandowski downloaded approximately 14,000 files from an internal, password-protected Google server known as "SVN," which was hosted on Google's network. Then, on or about December 14, 2015, he transferred those SVN files from his Google-issued laptop to his personal laptop.
In addition, prior to his departure from Google, he downloaded a variety of files from a corporate Google Drive repository to his personal laptop.
Within months after Levandowski's departure from Google, he created a new company that was then purchased by Uber.
Questions raised by this insider threat case
Did the insider in this case need access to all of that data to do his job? That is one question worth asking.
Another comes to mind: could any of your employees download 14,000 files from a password-protected server without being detected? If so, that is a problem.
Many data loss prevention (DLP) programs would alert the security team to this amount of data being downloaded.
Then again, maybe Google did detect the downloads, but the alerts were explained away.
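To make the detection idea concrete, here is a minimal sketch of the kind of rule a DLP or log-monitoring tool might apply: count downloads per user per day from a sensitive server and flag anyone far above a normal baseline. Everything here is illustrative — the log records, usernames, and threshold are assumptions, not any real DLP product's API or Google's actual telemetry.

```python
# Hypothetical sketch of a bulk-download alert rule.
# Assumption: we can parse server access logs into (user, date, files_downloaded).
from collections import Counter

# (user, date) -> total files downloaded that day
daily_downloads = Counter()

access_log = [
    # (user, date, files_downloaded) -- illustrative data only
    ("alice", "2015-12-10", 12),
    ("alice", "2015-12-11", 9),
    ("eng_user", "2015-12-10", 15),
    ("eng_user", "2015-12-11", 14000),  # the kind of spike described above
]

for user, date, count in access_log:
    daily_downloads[(user, date)] += count

# Assumed per-day ceiling for normal engineering work on this server
BASELINE_THRESHOLD = 500

def flag_bulk_downloads(counts, threshold=BASELINE_THRESHOLD):
    """Return (user, date, count) tuples that exceed the download threshold."""
    return [(u, d, c) for (u, d), c in counts.items() if c > threshold]

alerts = flag_bulk_downloads(daily_downloads)
for user, date, count in alerts:
    print(f"ALERT: {user} downloaded {count} files on {date}")
```

A real program would baseline each user's historical behavior rather than use one fixed threshold, but even a crude rule like this would surface a 14,000-file download in a single day.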
Companies often ignore insider threat warning signs
I was fortunate enough to do a fireside chat with Dr. Larry Ponemon at a recent SecureWorld conference.
Here is what he told our audience about the way many organizations view warning signs from someone who may be going rogue:
We found that companies err on the side of goodness. They don't want to accuse somebody without full evidence of a crime, so they write it off as negligence.
And we discovered insider threats are not viewed as seriously as external threats, like a cyber attack. But when companies had an insider threat, in general, they were much more costly than external incidents. This was largely because the insider that is smart has the skills to hide the crime, for months, for years, sometimes forever.
That finding reminds us of the SecureWorld story about the Columbia Sportswear IT director charged with setting up an alias account so he could secretly hack the company's network. And the almost unbelievable case of the insider threat ignored at the FDIC.
And we're still not sure what Tesla's insider threat program looked like or whether the company had one. But according to Elon Musk's email to employees, the damage sounded significant.
Google's insider threat exchanges guilty plea for leniency
Under his plea agreement, Anthony Levandowski hopes to reduce the 33 counts of theft and attempted theft of trade secrets he is charged with.
"If the court accepts the plea agreement, Levandowski will plead guilty to one count and the judge will dismiss the remaining counts at sentencing."
That would leave Levandowski facing a maximum sentence of 10 years in prison and a fine of $250,000, plus restitution, which is forecast to be a seven-figure sum.
Can your organization detect the insider threat and stop it? Or will it only find out what happened after the damage is done?