How to select a DLP solution: 9 unusual considerations

Data loss prevention systems have become more complex, and each has evolved differently. Here’s how to select a DLP system based on required functionality and its real-life ability to prevent data leaks.


Data loss prevention (DLP) is an increasingly popular component of enterprise information security toolkits. However, the criteria for choosing a DLP system are often blurred and resemble a checklist for buying an antiseptic. As a result, buyers pick the brand with the most intrusive advertising, the longest list of bells and whistles, and the most appealing price tag.

The caveat is that there is much about these systems that marketing brochures do not tell you, and the cost of a wrong choice here is far higher than when buying household chemicals. Infosec experts are better off scrutinizing the characteristics that lie beyond a formal comparison of available features.

Nowadays, many DLP systems offer different sophistication levels to choose from. An extra feature mentioned in the comparison table is always a lure, but it does not necessarily mean that the perk will meet the customer’s expectations. In the rush to launch their products, some software publishers neglect to ensure proper quality, user experience, and reliability.

The main advantage of a technologically mature enterprise solution is that the vendor can properly maintain it and release new functionality. DLP systems are typically selected based on a three- or five-year planning horizon, so compliance with current requirements is not the only selection criterion. You need to understand where the vendor is heading and predict what tasks and problems your company might face down the road.

The following considerations will point you in the right direction in terms of understanding how good the solution is and whether your investment will get you a product that fully meets fundamental enterprise software standards.

Content blocking

A DLP system is geared toward thwarting data leaks caused by employees’ slip-ups or malware. Terminating a dubious operation proactively is the only way to achieve this goal. However, some companies choose to leverage these solutions in monitoring mode rather than directly tamper with data movements.

At the same time, all intercepted information is stored in archives, and security officers often respond to leaks post factum. As a result, content blocking functions may lose priority because their active use is simply not planned.

By taking the less intrusive monitoring route, infosec professionals may also try to steer clear of false positives or failures of the DLP system. For instance, if the solution overreacts to a suspicious event by disrupting email or another equally critical service, the consequences could outstrip the aftermath of a data leak.

This pitfall is mainly inherent to unsophisticated DLP systems that provide scarce configuration options. A data breach that occurs due to the lack of blocking mode can be tedious and expensive to recover from. Plus, a growing number of international regulations (such as the European Union’s GDPR) require that organizations use content blocking to prevent data leaks, and companies run the risk of paying huge fines for non-compliance.

Some communication channels supervised by DLP tools cannot be blocked for technical reasons. For instance, passive monitoring is the only option with the Telegram and WhatsApp messengers due to the peculiarities of the data encryption techniques they use. Nevertheless, a DLP system is of little use if it cannot block email, printers, USB ports, and HTTP/HTTPS-based web services when anomalous activity is detected.
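The distinction above can be modeled as a per-channel capability map. This is a hypothetical sketch, not any vendor's actual configuration: the channel names and capability flags are illustrative assumptions.

```python
# Illustrative sketch: not every monitored channel supports active blocking.
# Channel names and capability flags are assumptions for demonstration.
CHANNELS = {
    "email":    {"monitor": True, "block": True},
    "printer":  {"monitor": True, "block": True},
    "usb":      {"monitor": True, "block": True},
    "https":    {"monitor": True, "block": True},
    "telegram": {"monitor": True, "block": False},  # encryption: passive only
    "whatsapp": {"monitor": True, "block": False},  # encryption: passive only
}

def enforceable(channel: str) -> bool:
    """Return True if a blocking policy can actually be applied to a channel."""
    return CHANNELS.get(channel, {"block": False})["block"]
```

A policy engine built this way can warn the administrator at configuration time when a blocking rule is assigned to a channel that only supports monitoring.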

Policies and content analysis

Not all present-day DLP systems were originally designed as full-blown protection instruments. Some developers incorporated one security layer from the get-go and complemented it with extra controls to monitor the remaining communication channels later. Since the restrictions and unique features of the original module must be aligned with the subsequent architecture enhancements, the outcome could be at odds with the expected efficiency of the final product.

The peculiarities of how a DLP system implements content analysis can therefore give you clues about the starting point of its evolution. For instance, if an endpoint agent on a monitored device uses the SMTP protocol to submit data to the DLP server for analysis, it is safe to deduce that the solution started out as little more than an email inspection module. In this scenario, the agent might never receive the analysis verdict from the server; if it does not, content blocking is impossible for operations such as printing a file or saving it to a thumb drive.

Another weak link of such implementation is that it hinges on a permanent connection to the server, which means that network congestion can disrupt the process. It is a good idea to check if a DLP solution can prioritize network traffic.

With a well-thought-out architecture, content analysis is carried out where the policies are enforced: at the level of the endpoint agent. This way, there is no need to send large amounts of data over the network, and traffic prioritization drops out of the security equation.
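Agent-side analysis can be as simple as matching the content of an outbound file against local policy patterns before the operation completes. The patterns below are illustrative assumptions, not a vendor's rule set; the point is that the verdict is computed locally, with no round trip to the server.

```python
import re

# Minimal sketch of endpoint-local content inspection: the verdict is
# computed on the agent itself, so a USB copy or print job can be blocked
# even when the DLP server is unreachable. Patterns are illustrative.
POLICY_PATTERNS = [
    re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b"),  # card-number-like
    re.compile(r"(?i)confidential"),                          # marker keyword
]

def verdict(content: str) -> str:
    """Return 'block' if any policy pattern matches, else 'allow'."""
    for pattern in POLICY_PATTERNS:
        if pattern.search(content):
            return "block"
    return "allow"
```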

No connection to the enterprise network

Besides implementing content analysis and applying enterprise policies, the DLP agent needs to send event logs, shadow copies of different files, and quite a bit of other information to the server. These valuable details should not be lost if the server’s data repository is inaccessible. Normally, such information remains on a local disk and is submitted to the server as soon as the connection is resumed.
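The store-and-forward behavior described above can be sketched as a local spool that accumulates events while the server is unreachable and flushes them once connectivity returns. The `send` callable here is a placeholder for whatever transport a real agent uses.

```python
from collections import deque

# Sketch of store-and-forward event delivery: events wait in a local spool
# while the DLP server is down and are flushed when the connection resumes.
class EventSpool:
    def __init__(self):
        self._queue = deque()

    def record(self, event: dict) -> None:
        """Always accept the event locally, regardless of server status."""
        self._queue.append(event)

    def flush(self, send) -> int:
        """Try to deliver spooled events in order; keep any that fail."""
        delivered = 0
        while self._queue:
            if not send(self._queue[0]):
                break  # server still unreachable; retry on next flush
            self._queue.popleft()
            delivered += 1
        return delivered
```

Delivering in order and stopping at the first failure preserves the event timeline in the server archive, which matters for post-incident investigation.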

Different policies should kick in depending on the server connection status: when a direct connection is established, when the endpoint connects via a VPN service, or when the connection is down. This is particularly important when an employee with a company-issued laptop is out of the office, for example, on a business trip or working remotely from home.
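Connection-state-dependent policies can be sketched as a lookup keyed by the detected state. The state names, the per-channel actions, and the strict-by-default fallback are all assumptions for illustration, not a specific product's behavior.

```python
# Hedged sketch: selecting a policy set by connection state. State names
# and actions are illustrative assumptions.
POLICY_SETS = {
    "corporate": {"usb": "allow",   "webmail": "monitor"},
    "vpn":       {"usb": "monitor", "webmail": "monitor"},
    "offline":   {"usb": "block",   "webmail": "block"},
}

def active_policies(state: str) -> dict:
    """Fall back to the strictest set when the state is unrecognized."""
    return POLICY_SETS.get(state, POLICY_SETS["offline"])
```

Failing closed (falling back to the strictest set) is the safer design choice for a roaming laptop whose network environment cannot be verified.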


Usability and administration

Different users have different preferences regarding system usage and administration. Some prefer managing a DLP system from the command line, while others would rather set policies and rules through scripting languages. In many cases, a streamlined and intuitive interface can be read as a hallmark of a quality product whose internal modules are equally well-designed.

When it comes to user experience with the interface, take several nuances into account. First, an all-in-one administrative dashboard is preferable. Web consoles are the most common these days. They are supported across different platforms, do not require any additional software, and are a breeze to use on mobile devices.

If a product provides separate consoles for working with different modules, it means that it was not created as a single, comprehensive system. Chances are that these components were integrated into the solution by different software engineering crews or vendors and then chained with each other during the development cycle.

One more facet of a well-tailored system boils down to what is called “omnichannel” policies. As an illustration, if you need to set up a policy to manage legal contracts, then you can do it once and just specify the channels (email, USB drives, web services, etc.) it should cover.

In a crudely designed DLP system, you must separately create similar policies for every channel. Although at first this appears to be no big deal, it turns into a mess as the number of policies grows. When there are dozens of them, and their rules define multi-pronged conditions including timeframes and user groups, it becomes hugely cumbersome to keep them all synchronized.
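The difference between the two designs can be shown in miniature: an omnichannel policy is defined once and expanded over the channels it covers, instead of hand-maintaining a near-identical copy per channel. The field names below are assumptions for illustration.

```python
# Sketch of the "omnichannel" idea: one policy definition is expanded into
# per-channel enforcement rules, so the rule logic lives in exactly one place.
def expand(policy: dict) -> list:
    """Turn one omnichannel policy into per-channel enforcement rules."""
    return [
        {"name": policy["name"], "channel": ch, "action": policy["action"]}
        for ch in policy["channels"]
    ]

# Example: a single policy for legal contracts covering three channels.
contracts = {
    "name": "legal-contracts",
    "action": "block",
    "channels": ["email", "usb", "web"],
}
```

When the rule changes (say, a new user group is exempted), editing one definition keeps all channels synchronized automatically, which is exactly what becomes unmanageable with dozens of hand-copied per-channel policies.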

Server count requirements

One more telltale sign of an immature DLP design is an excessive number of servers the system requires for proper operation. For instance, if a pilot project for up to 100 employees needs more than one server, such an architecture could use improvements and will most likely require extra resources at the production stage.

A top-notch solution ensures well-balanced scaling in all directions. For large organizations, there should be an option to isolate some system components and allocate separate server resources to each one. However, for smaller computer networks, the number of servers to keep the DLP system running smoothly should not be blown out of proportion.

Flexible implementation

A DLP system worth its salt should provide enough room to maneuver in terms of implementation mechanisms. Not only does this make it easier to intertwine the solution with the existing digital infrastructure, but it also allows you to strike a balance between the functionality and processing load while maintaining control of multiple channels.

Several systems across the DLP spectrum provide the option of controlling all channels at the level of the endpoint agent. This approach benefits the vendor by reducing development time and software engineering expenditures.

This architecture does not quite measure up to enterprise-level standards. It makes more sense to administer network channels at the gateway tier. For large-scale DLP deployments, this could be the only reasonable option. Aside from using the endpoint agent as the source for controlling data movements, an effective DLP tool provides the following alternative implementation vectors:

  • Mail server integration. This tactic additionally allows you to monitor your internal email.
  • The option of receiving mail correspondence from a technical mailbox.
  • Using the Internet Content Adaptation Protocol (ICAP) to integrate with the existing internet gateway.
  • A separate mail transport server.

Major vendors offer their own proxy servers that integrate seamlessly with the DLP system to supervise both HTTP and HTTPS traffic.
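ICAP (RFC 3507) is the protocol a web gateway uses to hand an HTTP message to the DLP server for a verdict. The sketch below only builds the headers of an ICAP REQMOD request; the hostname is a placeholder, and a real client must also send the message over TCP (port 1344 by convention) and parse the ICAP response.

```python
# Minimal sketch of an ICAP REQMOD request (RFC 3507), the mechanism by
# which an internet gateway delegates content inspection to a DLP server.
# "dlp.example.com" is a placeholder; this builds the request bytes only.
def icap_reqmod_request(icap_host: str, http_request: bytes) -> bytes:
    headers = (
        f"REQMOD icap://{icap_host}/reqmod ICAP/1.0\r\n"
        f"Host: {icap_host}\r\n"
        # req-hdr starts at offset 0; null-body marks the end offset when
        # the encapsulated HTTP request carries no body.
        f"Encapsulated: req-hdr=0, null-body={len(http_request)}\r\n"
        "\r\n"
    )
    return headers.encode("ascii") + http_request
```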

Cloud infrastructure

Since numerous companies are switching to remote work and at the same time want to save money on security services, cloud DLP solutions will definitely grow in both quality and quantity. Even today, customers often require that a DLP system's server components be hosted in the cloud. This happens, as a rule, with pilot projects or in small organizations.

The archive of a DLP system contains all corporate secrets, and not many business owners are willing to place it in an uncontrolled environment. However, if the DLP system does not support the option of hosting server modules in the cloud, this may have adverse consequences at some point.

There are also challenges in controlling cloud storage and services. They arise, for example, when an organization uses Google Workspace (the rebranded G Suite) or Office 365 mail services. There are plenty of nuances here: mail can be accessed both from a browser and from a classic client such as Microsoft Outlook, and each option involves different protocols.

Also, when using cloud storage in an organization, it is necessary to ascertain that a DLP system regularly scans all cloud folders to monitor confidential information stored there. To solve such problems, a whole new cluster of solutions called cloud access security brokers (CASB) has splashed onto the scene. In terms of the tasks being solved, these systems are conceptually close to DLPs.
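A periodic sweep of cloud folders can be sketched as follows. This is a deliberate simplification: a real CASB or cloud-aware DLP would use the storage provider's API, whereas this sketch walks a locally synced mirror of the cloud folder and checks text files for a confidentiality marker.

```python
import re
from pathlib import Path

# Hedged sketch: sweeping a locally synced cloud folder for confidential
# content. A production tool would call the cloud provider's API instead.
MARKER = re.compile(r"(?i)confidential")

def contains_marker(text: str) -> bool:
    """Return True if the text carries the confidentiality marker."""
    return MARKER.search(text) is not None

def scan_folder(root: str) -> list:
    """Return paths of .txt files under root that contain the marker."""
    hits = []
    for path in Path(root).rglob("*.txt"):
        try:
            if contains_marker(path.read_text(errors="ignore")):
                hits.append(str(path))
        except OSError:
            continue  # unreadable file; skip and keep scanning
    return hits
```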

Integration with other enterprise security systems

In this context, I do not imply integration with Microsoft Active Directory, as this should be a feature built into any present-day DLP system by default. Instead, I mean integration with the following types of solutions:

  • Security information and event management (SIEM). That is arguably one of the most common compatibility gaps in the corporate security ecosystem. Every infosec specialist wants events from the DLP to be integrated with SIEM. Whereas the vast majority of modern SIEM systems are capable of downloading data from a DLP database, this tandem is much more effective if the DLP solution supports protocols like Syslog and CEF out of the box. If it does, you can configure the DLP system to stay on top of information that ends up in the SIEM database.
  • Enterprise digital rights management (EDRM). These types of systems complement DLP solutions and vice-versa. When combined, the two form a rock-solid protection layer as long as proper configurations are in place. In this scenario, the DLP has no issues interpreting EDRM policies and can use some of their rules in its own policies regarding activities like generating reports or searching the archive. Moreover, the DLP system can fully leverage some EDRM policies according to predefined principles.
  • Data classification systems. The best DLP tools should be able to process the markings and labels in documents embedded by data classification systems such as Titus and Boldon James. This allows you to take a shortcut in terms of data classification if this tedious process has already been completed by an ad hoc service.
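The Syslog/CEF feed mentioned in the SIEM item can be illustrated by serializing a DLP event into CEF, ArcSight's `CEF:0|Vendor|Product|Version|SignatureID|Name|Severity|Extension` line format. The vendor and product strings are placeholders, and a real integration would also ship the line over syslog.

```python
# Sketch of a DLP event serialized in CEF for SIEM ingestion. Vendor and
# product names are placeholders; transport (syslog) is out of scope here.
def to_cef(signature_id: str, name: str, severity: int, ext: dict) -> str:
    def esc(value: str) -> str:
        # Pipes and backslashes must be escaped in CEF prefix fields.
        return value.replace("\\", "\\\\").replace("|", "\\|")

    prefix = "|".join([
        "CEF:0", esc("ExampleVendor"), esc("ExampleDLP"), esc("1.0"),
        esc(signature_id), esc(name), str(severity),
    ])
    extension = " ".join(f"{k}={v}" for k, v in ext.items())
    return f"{prefix}|{extension}"
```

An event emitted this way, e.g. `to_cef("100", "USB copy blocked", 7, {"suser": "jdoe", "act": "block"})`, can be parsed by any CEF-aware SIEM without a custom connector.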

Multiplatform compatibility

A good DLP system should be compatible with different endpoint platforms. The functionality of the most rudimentary tools is restricted to Windows support. This could be insufficient, though. Who knows, maybe someday we will face a massive switch to Linux. The silver lining is that several DLP systems already offer agent modules for Windows, Mac, and Linux.

Mobile devices running iOS and Android also deserve a mention. For technical reasons, it is currently almost impossible to create a fully fledged DLP agent for smartphones and tablets, especially those made by Apple. Under the circumstances, a web console is the optimal way to interact with a DLP system on the go. When implementing a BYOD strategy, you can (and should) complement the DLP with a mobile device management (MDM) solution. It allows you to use the capabilities built into the mobile OS, create privacy policies, and minimize the risk of data leaks.

DLP systems have evolved heterogeneously, and this affects how cohesive they are, how well their components work together, and how thoroughly they cover individual data channels. In the modern world, the price of such a solution and the subsequent maintenance costs make a difference. Yet a reliable DLP is worth the investment, as it tackles a wide range of security problems.

Copyright © 2021 IDG Communications, Inc.
