The golden rules for Data Loss Prevention (DLP) systems


November 18, 2020 | By Moshe Dadush, Infrastructure Security Manager


As some of you may have experienced, Data Loss Prevention (DLP) systems tend to get noisy, sometimes disrupting incident management practices. We have often seen systems generate thousands of alerts in a single day, making the job of finding the needle in the haystack quite challenging.

Using a DLP solution this way can have a multi-fold impact:

  • a large number of potential false-positive events, which can prevent the policy from ever being set to enforcement mode
  • wasted storage space and unnecessary logs to manage, reducing operating efficiency
  • frustration across the team managing the system


Based on our experience working with and managing DLP systems, we have compiled the following “must have” rules, which in our understanding provide great value to an organization:

  1. Indexing Customer Data: Indexing a database containing the business and customer data allows precise monitoring of sensitive information (as defined by the business). In most cases, assuming the indexed database is of high quality and does not contain “junk” records, this rule goes a long way toward reducing false-positive events.
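As an illustration, indexing for exact data matching can be sketched as follows. This is a minimal, hypothetical Python example: the record fields and sample data are made up, and real DLP engines use far more robust normalization, tokenization, and partial matching.

```python
import hashlib

def normalize(value: str) -> str:
    """Lowercase and strip whitespace so formatting differences don't matter."""
    return "".join(value.lower().split())

def build_index(records: list) -> set:
    """Hash each sensitive field value; only digests are stored, never raw data."""
    index = set()
    for record in records:
        for value in record.values():
            index.add(hashlib.sha256(normalize(value).encode()).hexdigest())
    return index

def scan(text: str, index: set) -> list:
    """Return tokens in outbound text whose hash matches an indexed value."""
    return [tok for tok in text.split()
            if hashlib.sha256(normalize(tok).encode()).hexdigest() in index]

customers = [{"name": "Alice", "card": "4111-1111-1111-1111"}]  # sample record
idx = build_index(customers)
hits = scan("invoice for 4111-1111-1111-1111 attached", idx)  # flags the card number
```

Storing only digests of the indexed values is one reason this approach is attractive: the DLP index itself does not become another copy of the sensitive database.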

  2. Intellectual Property (IP): IP is one of the most crucial and targeted assets in any business. For some organizations it is the “crown jewel,” and any loss or theft of this asset can have a massive impact, both financially and reputationally. IP can be protected via several methods that most DLP products support, including database indexing, file fingerprinting, and machine learning techniques.

  3. Indexing Employee Data: Most organizations hold a large amount of information about their employees – from SSN/ID numbers, health insurance, and health status to marital status, financial data, residential address, and much more. Organizations have a responsibility to protect such sensitive internal information from theft or loss.

Failure to protect this information can also expose the organization to legal action in some jurisdictions. Following the same approach used for customer data, indexing the employee data allows precise monitoring and reduces the number of false-positive events.
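Where indexed employee data is not available, pattern-based rules are a common fallback. The sketch below uses a deliberately simplified US SSN regex as an example; production DLP rules typically combine such patterns with validators and indexed data to keep false positives down.

```python
import re

# Simplified illustrative pattern for a US SSN in the common 3-2-4 format.
# Real rules add checks (e.g. excluding invalid area numbers like 000 or 666).
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_pii(text: str) -> list:
    """Return all SSN-shaped tokens found in the given text."""
    return SSN_RE.findall(text)

find_pii("payroll reference 123-45-6789 attached")  # one match
find_pii("order #12345 shipped")                    # no match
```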

  4. Indexing Highly Sensitive Files: Most DLP products on the market can index/fingerprint files in the organization. With this method, we tune the product to “learn” the sensitive files that exist in the organization (HR, Finance, Legal, etc.) and monitor the identified sensitive data when it leaves the organization.


Because the system cannot always distinguish sensitive from non-sensitive information, it is very important to index only folders that contain sensitive data.
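The fingerprinting idea can be sketched as follows, under simplifying assumptions: whole-file SHA-256 digests are used here for clarity, whereas real DLP products use partial or rolling fingerprints so that excerpts and modified copies are also caught.

```python
import hashlib
from pathlib import Path

def fingerprint(data: bytes) -> str:
    """Whole-file digest; a stand-in for a product's rolling fingerprint."""
    return hashlib.sha256(data).hexdigest()

def index_folder(folder: Path) -> set:
    """Index only folders known to hold sensitive data (HR, Finance, Legal...)."""
    return {fingerprint(p.read_bytes()) for p in folder.rglob("*") if p.is_file()}

def is_sensitive(outbound_data: bytes, index: set) -> bool:
    """Flag outbound content whose fingerprint matches an indexed file."""
    return fingerprint(outbound_data) in index
```

This also makes the point in the paragraph above concrete: whatever lands in the indexed folder is what the system will treat as sensitive, so indexing a folder full of “junk” files directly translates into noise.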

  5. Encrypted Files: Many organizations direct or require their employees to send sensitive files to recipients in encrypted form, to reduce the risk of a file falling into the wrong hands. In our experience, such directives can also motivate internal threat actors to abuse the same practice to exfiltrate data – by encrypting it and sending it outside the organization’s perimeter.

To tackle such instances, we highly recommend blocking encrypted files and allowing encryption only through an approved corporate third-party solution. This lets that same solution inspect the encrypted traffic before the data leaves the organization.
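One common heuristic for spotting encrypted content that the DLP cannot inspect is byte entropy: encrypted (and compressed) data looks near-random, so Shannon entropy close to 8 bits per byte is a frequent, though imperfect, signal. The threshold below is an illustrative assumption, not a standard value.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (0.0 for empty input, max 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """Flag payloads whose byte distribution is near-uniform."""
    return shannon_entropy(data) > threshold

looks_encrypted(b"plain ascii text " * 64)  # low entropy, not flagged
looks_encrypted(bytes(range(256)) * 16)     # uniform bytes, flagged
```

Note that this heuristic also fires on legitimately compressed archives, which is exactly why the recommendation above is to route encryption through an approved solution rather than rely on detection alone.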


Visit our blog or follow us on Facebook for the latest news and insights on cybersecurity.

Stay Safe with TrustNet!