Rediscover trust in cybersecurity

The world has changed dramatically in a short time, and the world of work has changed with it. The new hybrid remote and in-office work world has ramifications for tech—specifically cybersecurity—and signals that it’s time to acknowledge just how intertwined humans and technology truly are.

Enabling a fast-paced, cloud-powered collaboration culture is critical to rapidly growing companies, positioning them to out-innovate, outperform, and outsmart their competitors. Achieving this level of digital velocity, however, comes with a rapidly growing cybersecurity challenge that is often overlooked or deprioritized: insider risk, when a team member accidentally—or not—shares data or files outside of trusted parties. Ignoring the intrinsic link between employee productivity and insider risk can hurt both an organization’s competitive position and its bottom line.

You can’t treat employees the same way you treat nation-state hackers

Insider risk includes any user-driven data exposure event—security, compliance, or competitive in nature—that jeopardizes the financial, reputational, or operational well-being of a company and its employees, customers, and partners. Thousands of these exposure and exfiltration events occur daily. Many users create insider risk accidentally, simply by making decisions based on time and reward, sharing and collaborating with the goal of increasing their productivity. Others create risk through negligence, and some act with malicious intent, like an employee stealing company data to bring to a competitor.

From a cybersecurity perspective, organizations need to treat insider risk differently than external threats. With threats like hackers, malware, and nation-state threat actors, the intent is clear—it’s malicious. But the intent of employees creating insider risk is not always clear—even if the impact is the same. Employees can leak data by accident or due to negligence. Fully accepting this truth requires a mindset shift for security teams that have historically operated with a bunker mentality—under siege from the outside, holding their cards close to the vest so the enemy doesn’t gain insight into their defenses to use against them. Employees are not the adversaries of a security team or a company—in fact, they should be seen as allies in combating insider risk.

Transparency feeds trust: Building a foundation for training

All companies want to keep their crown jewels—source code, product designs, customer lists—from ending up in the wrong hands. Imagine the financial, reputational, and operational risk that could come from material data being leaked before an IPO, acquisition, or earnings call. Employees play a pivotal role in preventing data leaks, and there are two crucial elements to turning employees into insider risk allies: transparency and training. 

Transparency may feel at odds with cybersecurity. For cybersecurity teams that operate with an adversarial mindset appropriate for external threats, it can be challenging to approach internal threats differently. Transparency is all about building trust on both sides. Employees want to feel that their organization trusts them to use data wisely. Security teams should always start from a place of trust, assuming the majority of employees’ actions have positive intent. But, as the saying goes in cybersecurity, it’s important to “trust, but verify.” 

Monitoring is a critical part of managing insider risk, and organizations should be transparent about this. CCTV cameras are not hidden in public spaces. In fact, they are often accompanied by signs announcing surveillance in the area. Leadership should make it clear to employees that their data movements are being monitored—but that their privacy is still respected. There is a big difference between monitoring data movement and reading all employee emails.
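
As a loose illustration of that difference, a metadata-only monitor records who moved which file where, without ever opening the content. The event fields, trusted destinations, and review rule below are hypothetical placeholders, a minimal sketch rather than any specific product’s behavior:

    from dataclasses import dataclass

    # Destinations the organization considers trusted (placeholder values).
    TRUSTED_DESTINATIONS = {"corp-sharepoint.example.com", "drive.corp.example.com"}

    @dataclass
    class FileEvent:
        user: str
        file_name: str     # metadata only; the file's contents are never read
        destination: str   # e.g., a web domain, USB device ID, or app name

    def needs_review(event: FileEvent) -> bool:
        """Flag movement to any destination outside the trusted set."""
        return event.destination not in TRUSTED_DESTINATIONS

    event = FileEvent("jdoe", "roadmap.pptx", "personal-drive.example.net")
    if needs_review(event):
        print(f"Review: {event.user} moved {event.file_name} to {event.destination}")

Note that nothing in the sketch touches message bodies or file contents: the monitored signal is movement, which is exactly the distinction between watching data flows and reading employee emails.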

Transparency builds trust—and with that foundation, an organization can focus on mitigating risk by changing user behavior through training. At the moment, security education and awareness programs are niche. Phishing training is likely the first thing that comes to mind, given its success in moving the needle and getting employees to think before they click. Outside of phishing, there is little training to help users understand what, exactly, they should and shouldn’t be doing.

For a start, many employees don’t even know where their organizations stand. What applications are they allowed to use? What are the rules of engagement for those apps if they want to use them to share files? What data can they use? Are they entitled to that data? Does the organization even care? Cybersecurity teams deal with a lot of noise made by employees doing things they shouldn’t. What if you could cut down that noise just by answering these questions?
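
One lightweight way to cut that noise is to publish the answers as an explicit policy that both people and tooling can read. The sketch below is purely illustrative; every app name and rule in it is a placeholder, not a recommendation:

    # Hypothetical sharing policy, expressed as plain data so both employees
    # and tooling can consult it. All names and rules here are placeholders.
    SHARING_POLICY = {
        "allowed_apps": {
            "corporate cloud drive": "external sharing only with manager approval",
            "team chat": "no source code or customer lists, internal or external",
        },
        "blocked_apps": ["personal email", "unsanctioned file-transfer sites"],
        "data_rules": {
            "customer lists": "sales and support roles only",
            "source code": "engineering repositories only",
        },
    }

    def rule_for(app: str) -> str:
        """Answer an employee's question: may I share through this app?"""
        if app in SHARING_POLICY["blocked_apps"]:
            return "blocked"
        return SHARING_POLICY["allowed_apps"].get(app, "not yet reviewed; ask security")

    print(rule_for("team chat"))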

Training employees should be both proactive and responsive. Proactively, in order to change employee behavior, organizations should provide both long- and short-form training modules to instruct and remind users of best behaviors. Additionally, organizations should respond with a micro-learning approach using bite-sized videos designed to address highly specific situations. The security team needs to take a page from marketing, focusing on repetitive messages delivered to the right people at the right time. 
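
To make the responsive piece concrete, a minimal sketch might map a detected event type to a bite-sized lesson and deliver it in the moment. The event names and training URLs below are invented for illustration:

    # Sketch of responsive micro-learning: pair a detected event type with a
    # short, situation-specific lesson. Event names and URLs are invented.
    MICRO_LESSONS = {
        "upload_to_personal_cloud": "https://training.example.com/cloud-sharing-90s",
        "copy_to_usb": "https://training.example.com/removable-media-60s",
        "public_link_created": "https://training.example.com/link-permissions-45s",
    }

    def respond(user_email: str, event_type: str) -> None:
        """Send the right bite-sized lesson at the moment the behavior occurs."""
        lesson = MICRO_LESSONS.get(event_type)
        if lesson:
            # A real system would message the user; printing stands in for that.
            print(f"To {user_email}: a quick refresher for you -> {lesson}")

    respond("jdoe@example.com", "upload_to_personal_cloud")

Delivering the lesson at the moment of the risky action is what makes it stick: the right message, to the right person, at the right time.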

Once business leaders understand that insider risk is not just a cybersecurity issue, but one that is intimately intertwined with an organization’s culture and has a significant impact on the business, they will be in a better position to out-innovate, outperform, and outsmart their competitors. In today’s hybrid remote and in-office work world, the human element that exists within technology has never been more significant. That’s why transparency and training are essential to keep data from leaking outside the organization.

This content was produced by Code42. It was not written by MIT Technology Review’s editorial staff.