Ask the Experts Blog Series
The annual cost of detecting and resolving insider threats averaged $8.76 million in the US and is rising, according to an April 2018 Ponemon Institute study. Large organizations often spend more than small ones to mitigate insider incidents, with negligence accounting for about two-thirds of the incidents—careless employees or contractors are typically the worst offenders.
The Ponemon report also found that 23 percent of incidents involved an insider with criminal intent, and another 13 percent involved stolen credentials. Credential theft incidents are the least frequently reported but the most damaging.
The report goes on to explain that the average time to contain an incident is 73 days, with only 16 percent of these incidents being contained within a month. And the cost of containment rises in proportion to the amount of time required to contain an incident.
Ponemon confirms what you likely already know—the frequency and cost of insider threats of all types are rapidly increasing.
Yet such a study exposes only known data about insider incidents. Within the 159 organizations surveyed, how many incidents went undetected? There is no way to know for sure, but cyber threats and incidents frequently go undetected and unreported—sometimes for years. What is known is that your organization likely faces exposure to insider threats—and that exposure is probably larger than you've anticipated.
Defining an insider
Let’s first define insider and insider threat, since there is no definitive agreement about what these terms mean. An insider has been legitimately empowered to access one or more of an organization’s network assets. Examples include employees, contractors, vendors, partners, and ex-employees whose access has not been disabled. The term can also cover developers with backdoor access, and even personnel from mergers or acquisitions whose existing systems are being transitioned into your network.
An insider threat is an accidental or intentional act perpetrated against your organization’s assets, either by an insider or by an outsider impersonating an insider. An insider incident is the loss resulting from one or more such threats.
What are common insider threat indicators?
Let’s focus first on some of the more common insider threats. Here’s a list of some predictable behavioral indicators to watch for:
- Suspicious VPN activity – Connecting at unusual times from unusual locations or IP addresses.
- Employees being notified of a layoff or other major negative event, affecting both those who are directly impacted and those remaining (but now with diminished morale).
- Departing employees who have just given their notice. It’s not uncommon for a person to create new accounts to gain access before their credentials are revoked, often prior to giving notice.
- Someone downloading substantial amounts of data to external drives, or using unauthorized external storage devices.
- A person accessing confidential data that isn’t relevant to their role.
- Emailing sensitive information to a personal account.
- Someone attempting to bypass security controls.
- A person requesting clearance or higher-level access without actual need.
- A former employee maintaining access to sensitive data after termination.
- Network crawling, data hoarding, or copying from internal repositories.
- Deviations in a person’s normal working hours pattern.
As you can see, an insider threat isn’t just one event, but a series of events that ultimately leads to an incident. The key is to proactively identify such risky behaviors and patterns, then alert your security operations center (SOC) so it can monitor for possible insider threats.
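To make the idea of deviation-based indicators concrete, here is a minimal sketch of how a single indicator, such as a login outside a user's normal working hours, could be scored against a per-user baseline. The data and function names are invented for illustration; this is not any product's detection logic.

```python
# Hypothetical sketch: flagging deviations from a user's normal working hours.
# All names and thresholds here are illustrative assumptions.
from statistics import mean, stdev

def is_unusual_hour(baseline_hours, hour, threshold=3.0):
    """Return True when a login hour deviates strongly from the baseline.

    baseline_hours: historical login hours (0-23) observed for this user.
    A login more than `threshold` standard deviations from the user's
    mean login hour is flagged as unusual.
    """
    mu = mean(baseline_hours)
    sigma = stdev(baseline_hours)
    if sigma == 0:
        return hour != mu
    return abs(hour - mu) / sigma > threshold

# A user who normally logs in between 8 and 10 a.m.
baseline = [8, 9, 9, 10, 8, 9, 10, 9]
print(is_unusual_hour(baseline, 9))   # False: within the baseline
print(is_unusual_hour(baseline, 3))   # True: a 3 a.m. login stands out
```

A real system would model far more than hour-of-day (location, device, data volume), but each indicator reduces to the same question: how far is this event from this user's own history?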
Conventional approaches to stop insider threats
Many organizations focus on data loss prevention (DLP) in their attempt to stop unauthorized data transfer (data exfiltration). However, many DLP solutions produce results that are time- and resource-intensive. They typically require analysts to create individual rules for each policy and violation, triggering hundreds or possibly thousands of events every day. This makes uncovering actual threats or incidents very involved—like finding the proverbial needle in a haystack.
Another conventional insider threat detection issue is not being able to understand insider intent. As mentioned, most insider threats are caused by negligence—not malice. If you find this to be true within your organization, you might consider training as a corrective action, rather than taking punitive steps. But in order to take such appropriate action, you need to understand the intent.
Using a behavior-based approach to proactively detect insider threats
Your goal should be to stop insider threats before they become incidents. Unlike external threats, insider threats typically evolve over a long period of time. To discover them, you have to be able to monitor user behavior that isn’t within a normal range, such as copying and staging files for later exfiltration, or the creation of new accounts for future use.
“It is vital that organizations understand normal employee baseline behaviors and also ensure employees understand how they may be used as a conduit for others to obtain information.”
Combating the Insider Threat, US Department of Homeland Security
A necessary tool in identifying insider threat behavior is a user and entity behavior analytics (UEBA) solution that applies data science across all of your existing threat intelligence and indicators of compromise (IOCs). It incorporates the tactics, techniques, and procedures (TTPs) of known previous attacks.
Exabeam UEBA takes a behavior-based approach. It automatically stitches relevant events together to pinpoint malicious indicators more easily and rapidly detect insider threats. Exabeam Threat Hunter lets analysts then search for threats based on the identified behavioral artifacts and user context.
An intellectual property theft example
Let’s review a common scenario in which an employee who is leaving their job soon plans to steal intellectual property (IP). Using an Advanced Analytics dashboard, analysts can search for users who exhibit predictable malicious behavior by simply selecting from the risk reasons dropdown. This is one way to identify flagged users and put them on a watch list—one example of the many different insider threat investigations you can conduct.
Figure 1 – Using Threat Hunter to search for users having an abnormal number of emails sent to personal accounts in the last 90 days
In Figure 2, one user has a risk score of 129 because they’ve sent a large number of emails to their personal account. Beneath the score, the UEBA timeline provides analysts with the related events and malicious indicators that expose insider threats (Figure 3).
Figure 2 – Threat Hunter shows one user with multiple events related to abnormal email size sent to their personal account
By clicking on the user’s timeline, analysts can examine all the events and track down the malicious indicators. Next, let’s examine how a UEBA timeline can help analysts expose insider threats.
In this example, we discover a user with a risk score of 129. We can then look at a timeline to check whether there are other indicators of insider threats. Usually, if you find one indicator, there are other related events leading to a threat—not just one event. The user trend lets analysts track down all the relevant events, stitched together in one timeline. You can also filter by other activities, such as file, web, or network activity.
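The stitching step itself can be illustrated with a short sketch. The event shape and field names below are assumptions for illustration, not Exabeam's data model; the idea is simply to group raw events from different log sources by user and order them in time.

```python
# Illustrative sketch of "stitching" events from different log sources
# into one per-user timeline. Field names and values are invented.
from collections import defaultdict

events = [
    {"user": "jdoe", "time": "2024-05-01T02:14", "source": "email", "action": "send_to_personal"},
    {"user": "jdoe", "time": "2024-05-01T01:50", "source": "vpn", "action": "login_new_location"},
    {"user": "asmith", "time": "2024-05-01T09:00", "source": "web", "action": "normal_browse"},
    {"user": "jdoe", "time": "2024-05-01T02:30", "source": "file", "action": "bulk_copy"},
]

def build_timelines(raw_events):
    """Group events by user and sort each user's events chronologically."""
    timelines = defaultdict(list)
    for event in raw_events:
        timelines[event["user"]].append(event)
    for user_events in timelines.values():
        user_events.sort(key=lambda e: e["time"])  # ISO timestamps sort lexically
    return dict(timelines)

timelines = build_timelines(events)
print([e["action"] for e in timelines["jdoe"]])
# ['login_new_location', 'send_to_personal', 'bulk_copy']
```

Filtering by activity type, as described above, is then just a matter of selecting events whose `source` matches the analyst's choice (file, web, network, and so on).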
Figure 3 – A user card showing user trends with all the malicious activities, with filter options by activity type
As you can see, one indicator alone doesn’t amount to an insider threat. If a user sends an abnormally large email to a personal account, that alone wouldn’t rise to a level of concern. But combined with other abnormal behaviors, the event series is flagged as suspicious and risky. Looking at all the risk reasons for this user, you can see the many behavioral artifacts that together point to an insider threat.
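As a rough illustration of why stacked indicators matter, the sketch below adds per-indicator points into a session risk score and only alerts once a threshold is crossed. The indicator names, point values, and threshold are all invented for this example; a real UEBA system derives its weights statistically rather than from a fixed table.

```python
# Hypothetical additive risk scoring across behavioral indicators.
# Indicator names, point values, and the threshold are assumptions.
INDICATOR_POINTS = {
    "abnormal_email_to_personal_account": 25,
    "first_vpn_from_new_country": 20,
    "large_usb_file_copy": 30,
    "access_outside_role": 25,
}

ALERT_THRESHOLD = 90  # notify the SOC once a user's session exceeds this

def session_risk(observed_indicators):
    """Sum the points for each recognized indicator in a user session."""
    return sum(INDICATOR_POINTS.get(i, 0) for i in observed_indicators)

# One indicator in isolation stays below the alert threshold...
single = session_risk(["abnormal_email_to_personal_account"])
print(single, single >= ALERT_THRESHOLD)   # 25 False

# ...but a series of abnormal behaviors in one session crosses it.
session = [
    "abnormal_email_to_personal_account",
    "large_usb_file_copy",
    "access_outside_role",
    "first_vpn_from_new_country",
]
score = session_risk(session)
print(score, score >= ALERT_THRESHOLD)     # 100 True
```

The design point mirrors the narrative above: no single behavior is conclusive, but the accumulated score over a stitched series of events is what surfaces a user for investigation.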
Figure 4 shows data exfiltration (sending corporate data to personal accounts) coupled with abnormal access to other systems (lateral movement). This exposes a threat pattern in action.
Exabeam provides numerous models to thoroughly profile user behavior, and is very effective at detecting unknown behavior artifacts.
Figure 4 – Various malicious indicators stitched together in a timeline and surfaced to an investigator as an insider threat
Based on the abnormal events surfaced by Exabeam Advanced Analytics (AA), analysts can easily put a watch on the user and revoke access if they exhibit additional anomalous behavior. Advanced Analytics provides ample evidence for confronting suspected users, reducing the risk of false accusations. Most importantly, it is a robust tool that enables investigators and SOC analysts to keep insider threats in check.
More insider threat scenarios are available in the Exabeam practitioner training series: