Ransomware, hackers, and nation-state threat actors have long dominated security teams’ focus. These external threats often feel more urgent and more dangerous, carrying greater potential consequences for the business. There’s no question about intent—it’s malicious. From a psychological perspective, it’s straightforward for security teams to identify the enemy and prepare for a fight. But what happens when the threat is internal? What if it’s coming from a teammate, a colleague you eat lunch with, or even the executive you report to? Security teams can’t take the same approach as with external actors, mainly because insiders aren’t faceless enemies to take down, even if their impact can be just as damaging.
Insider risk occurs when sensitive corporate data, whether intellectual property, digital assets, trade secrets, or crown jewels, moves to untrusted places like personal devices, email, or cloud destinations. Such data movement presents considerable competitive, financial, privacy, and compliance risk. The scope of the problem is significant. According to Aberdeen Research, one in three reported data breaches involves an insider, and the cost of a data breach caused by an insider can account for up to 20% of a company's annual revenue. Insider risk is not a new problem, but it has become increasingly urgent due to drivers like digital transformation, hybrid-remote work, and the "great resignation," not to mention a noticeable uptick in the use of contractors and recent layoffs.
Insider risk can occur anywhere within a company, and it can come from anyone. It can come from disgruntled former employees stealing artificial intelligence trade secrets or from someone poached by a competitor taking mobile chip design secrets on their way out the door. It can even come from the C-suite, as one company learned recently when its CFO accidentally shared a document titled "Restructuring" with the entire company. Unintentional data exposure can cause employee unrest, or even trigger US Securities and Exchange Commission (SEC) Regulation Fair Disclosure (Reg FD) filing requirements for public companies if the leaked data could affect shareholders.
For the security team, it may be inappropriate to take a combative approach—intended for outside threats—with a CFO over an unintentional data share. There is a better way.
An empathetic approach to employee investigations
The way we should approach an external risk, such as malware, is vastly different from the way we should approach risk that comes from insiders.
There are many factors to consider when managing insider risk, especially as they relate to the desired business outcome. Insider investigations should not fall solely within the purview of the security team and often require the collaboration of security, HR, and legal. According to Gartner, “Survey data…indicates that over 50% of insider incidents are non-malicious,” which means that, more often than not, the employee at the root of the incident was simply trying to get their work done, making a mistake, or taking a shortcut. Treating them as though their actions were intentionally malicious is the wrong approach and could backfire. Those involved in the investigation must take an empathetic approach devoid of judgment. Otherwise, the risk of that employee making the same mistake again or becoming disgruntled and disenfranchised rises significantly.
Approaching insider investigations with empathy requires a psychological shift. It is the first step to building trust, so the best outcome for the organization can be reached. Here are five important elements of an empathetic approach to insider investigations:
- Connect to understand: When an event happens, the first outreach can be as casual as, "Hey, we noticed you moved a document to your personal cloud account. Did you mean to do that?" Their response will often be one of surprise: it was a mistake, they didn't realize it wasn't allowed, or they simply needed to get work done and this was the quickest way.
- Explore unconscious biases: All humans have conscious and unconscious biases that affect our actions and decisions. The HR team can help other stakeholders explore these biases and work to mitigate them. It’s important to treat all individuals equally, whether they are peers, the CEO, or someone in a group or culture different from your own.
- Reassure to support partnership: If the event was a mistake, let the employee know they are not in trouble. It's likely the employee believes they are and may wonder if they could lose their job. It's a natural human instinct to become defensive and deny behavior. Reassure them that this event can be reversed and that you are here to help. They are more likely to be honest about what they were trying to do, and you'll be in a better position to help and to recover any exposed or leaked data.
- Educate: In the event of a negligent or accidental incident, it’s important to provide the employee with information about the right way to act in the future. Guidance at the time of the error is highly impactful and more likely to be remembered than, say, an annual training session. You can reinforce the conversation with short one- to three-minute videos about a specific situation.
- Take action: It's important to approach each investigation with empathy, but some insider breaches are truly malicious. In these cases, documentation is important. If it's determined that the employee took risky action deliberately, and it's clear they present an ongoing risk to the organization and its data, then it's time to assemble all key stakeholders from security, HR, and legal to provide a recommended course of action to the executive team.
Approaching insider investigations with empathy helps build a culture of trust, open communication, and respect. It builds and perpetuates a positive security culture—and best of all, it will help keep your organization’s most valuable data safe and secure.
This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.