Dealing with an Insider Threat

By Alex MacLachlan October 11, 2016 compliance, data-breach, forensics, incident-response

Bob just killed the company, or did he?

Many of us spend more time with our co-workers than we do with our immediate families. Because of that, we don’t want to believe that amid the pursuit of common goals, the camaraderie, and the sense of extended family on our team, there could be a bad egg among us. It is unfortunate, but Edward Snowden, the notorious contractor who copied and leaked classified NSA documents, has shown us that an insider threat can be lurking anywhere and can strike at any time.

In most companies, if you want employees to do their jobs, you have to give them access to resources. As an individual’s tenure at a company grows, so does their access to systems and data; in smaller businesses, everyone may have access to everything. Larger businesses may assign users to roles and groups, but unfortunately that does not always work out, since some people need more or less access than others in a pre-defined group or role. Unless some kind of dynamic, peer-group-based access can be achieved, maintaining the right amount of access to resources is a challenge in any organization.
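The role-and-group model described above can be sketched in a few lines. This is a minimal illustration, not any specific product’s access model; the role names and permission strings are invented for the example.

```python
# Minimal sketch of a role-based access check. Roles and
# permissions here are illustrative placeholders.
ROLE_PERMISSIONS = {
    "finance": {"erp:read", "erp:write"},
    "engineering": {"repo:read", "repo:write"},
    "intern": {"repo:read"},
}

def can_access(user_roles, required_permission):
    """Return True if any of the user's roles grants the permission."""
    return any(
        required_permission in ROLE_PERMISSIONS.get(role, set())
        for role in user_roles
    )
```

The weakness the paragraph points out shows up immediately: an intern who legitimately needs `repo:write` for one project either gets over-granted a broader role or blocked, which is why static roles alone rarely fit everyone.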

What if a user who has authorized access to a resource has not used it in quite some time, but then suddenly accesses it today? What is the risk of that access? Is it the type of activity you should be concerned about? What resource was accessed, and what was done with it? As more organizations implement User Behavioral Analytics (UBA), a new dimension is added for security operations teams and incident responders. Monitoring user behavior is clearly a privacy issue in some countries, but with proper de-identification or data masking systems in place, legal challenges can be overcome and organizations can protect their digital assets. But what is the proper incident response process, and what do you do?
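One common de-identification approach is pseudonymization: replacing usernames with a stable keyed hash, so behavioral baselines still track the same (masked) identity while only an authorized key-holder can re-identify a user. A minimal sketch, assuming a keyed hash is an acceptable masking technique for your jurisdiction:

```python
import hashlib
import hmac

def pseudonymize(username: str, secret_key: bytes) -> str:
    """Replace a username with a stable pseudonym (keyed HMAC-SHA256).
    The same user always maps to the same token, so UBA baselines
    keep working, but identity stays masked from analysts who do
    not hold the key."""
    digest = hmac.new(secret_key, username.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]
```

Because the mapping is deterministic per key, anomaly detection on the pseudonym behaves exactly as it would on the real username.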

Activity such as accessing SharePoint and opening 10 documents per day on average may seem normal, but what happens if that same user account is suddenly seen opening 1,000 documents? This is clearly not normal and requires careful review.
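This kind of baseline-versus-spike check is the simplest form of behavioral analytics. A hedged sketch, using a standard-deviation threshold over a user’s historical daily document-open counts (real UBA products use far richer models; the threshold of 3 is an assumption):

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Flag today's document-open count if it exceeds the
    historical mean by more than `threshold` standard deviations.
    `history` is a list of past daily counts (at least 2)."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        # Perfectly flat history: any increase is a deviation.
        return today > mean
    return today > mean + threshold * stdev
```

Against a history averaging around 10 opens per day, a day with 1,000 opens is flagged, while ordinary day-to-day variation is not.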

In triaging a situation like this, the first thing to consider is which resources were accessed and what the overall risk of that access is to the company. For example, if all of those documents were public, there may be no issue. But what if they were key intellectual property, or Personally Identifiable Information (PII) that the law requires you to protect? Then what? On the other hand, this type of infrequent but high-volume access may be normal for a particular team, time of year, or task.
According to Leslie K. Lambert, Chief Security and Strategy Officer for Gurucul, a leading user behavior analytics (UBA) and identity access intelligence (IAI) company, “Having multiple data points to pivot and view quickly to determine what is high risk versus what is the norm can be the difference between a multi-million-dollar breach and 3 minutes of a normal day’s work.”
It’s not just large-volume data access that may threaten a company; access to a single key file or database element can be enough. People can quickly become high-risk users on our networks: a departing user who has put in their two weeks’ notice, a terminated employee, or even a new hire.

Regardless of the user’s status or situation, when an event like this occurs, specific actions need to take place. This could be as simple as calling the user, or the user’s manager, to confirm that the access is known about and properly within the realm of the user’s function. But when a higher-risk resource is accessed by a previously identified high-risk user, it may warrant an immediate block of that access, or even de-provisioning of the user’s credentials until proper authorization is received. In situations like this, it may be better to ask forgiveness than permission, as the right step is to limit the company’s data loss exposure. Further, in extreme cases where access is determined to be unauthorized, or originates from a location the user is known never to have visited, legal, HR, and IT leadership may need to be informed or consulted about the situation and the next steps. Whether automated or manual intervention is required during an incident, being armed with the right data at the right time is critical to an organization’s success in stopping an insider threat.
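The escalation logic above lends itself to a playbook table. This is a simplified sketch of one possible decision mapping, with invented category names and action labels; a real playbook would fold in many more signals.

```python
def triage_action(resource_risk: str, user_flagged: bool) -> str:
    """Map an access event to a response, mirroring the escalation
    described above. Risk categories and actions are illustrative.
    `user_flagged` means the user was previously identified as
    high risk (departing, terminated, newly hired, etc.)."""
    if resource_risk == "high" and user_flagged:
        # Limit exposure first; ask forgiveness rather than permission.
        return "block-and-deprovision"
    if resource_risk == "high":
        return "contact-user-and-manager"
    if user_flagged:
        return "monitor-closely"
    return "log-only"
```

Encoding the playbook as code (or as SOAR automation rules) means the response is consistent at 3 a.m., not dependent on whichever analyst is on call.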

In the case of a single high-risk document, or any documents being accessed or moved by a potentially malicious user, an organization that has implemented a Data Loss Prevention (DLP) system can better understand its exposure, and even block or limit additional losses. Organizations should not rely on DLP alone; more traditional tools such as firewalls, logs, and SIEMs help provide a complete picture of the potential incident.

To be sure you have the right person when performing an investigation, key elements need to be in place. First and foremost, a company needs a policy on user account access and the prevention of account sharing. A key deterrent to insider threat is employees clearly understanding that they are solely responsible for whatever happens with their access credentials. Next, a clear acceptable use policy must be created and adopted by the organization. Having these documents is not enough; being able to quickly confirm that a user has signed and agreed to them is critical as incidents arise.

During the incident, critical information needs to be documented: IP addresses, the file names of documents accessed and the order in which they were accessed, all usernames involved, times of access, and, where possible, the user’s location during the questionable access. Location is critical, as the user could be on the corporate network or connected over a Virtual Private Network (VPN); in either case, a clear answer on location should be established as best as possible. If the user is remote on a VPN, geolocating the source IP address could be the clue that determines whether it is an authorized user or an outsider using stolen credentials. Finally, a clear timeline of not only the incident but also the actions of the analysts and security engineers should be documented, with timestamps for each action or discovery.
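The timeline requirement is easy to under-do in the heat of an incident, so it helps to have even a trivial recorder on hand. A minimal sketch (class and field names are invented for illustration; a real shop would use its case-management or SOAR platform):

```python
from datetime import datetime, timezone

class IncidentTimeline:
    """Record each analyst action or discovery with a UTC timestamp,
    so the investigation itself is documented alongside the incident."""

    def __init__(self):
        self.entries = []

    def record(self, actor: str, event: str) -> None:
        """Append a timestamped entry at the moment it happens."""
        ts = datetime.now(timezone.utc).isoformat()
        self.entries.append((ts, actor, event))

    def report(self) -> str:
        """Render the timeline in chronological order of recording."""
        return "\n".join(f"{ts}  {actor}: {event}"
                         for ts, actor, event in self.entries)
```

Timestamping in UTC avoids ambiguity when analysts, VPN concentrators, and file servers sit in different time zones, which matters if the timeline is ever handed to legal or law enforcement.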

Thinking and talking about insider threat is difficult; we don’t want to imagine being betrayed by our friends, colleagues, and co-workers, but the reality is that it happens. Having the right process in place will help your organization quickly decide whether Bob in accounting just stole company secrets or was simply complying with state tax laws. Don’t wait until it’s a problem. Build your playbook and engage stakeholders across your company now, so you know what to do if that unfortunate day comes.

Click the button below to schedule your one-on-one demo of the D3 Incident Management Platform.

Alex MacLachlan

Alex is the Director of Marketing at D3. He oversees D3's marketing, communications, and digital programs. He enjoys fishing, "checking the analytics", playing golf and watching hockey - in that order.

