Blog  |  September 27, 2022

Between a Rock and a Hard Place: Automating Data Loss Prevention

In our previous blog, we looked at the emergence of technology to automate the continually changing requirements for privacy compliance. Automation can also be applied to create a “data harbor” to automate data loss prevention (DLP) within organizations through an approach that involves next-gen data protection as a service.


Next-Gen Data Protection as a Service

Encryption is a great tool for securing data, and it has been around for decades. But it has limitations, as evidenced by the four cybercrime statistics in the initial post in this series. Think the victims of those cybercrimes didn’t have encryption in their solutions? Of course they did! Today’s data protection challenges require a next-level approach.

Next-gen data protection as a service, also known as next-gen DPaaS or NGDP, involves three components for end-to-end data protection:

  • Encrypt: Any good data protection program involves encryption of data – at rest and in motion.
  • Fragment: The second component is to fragment the data into valueless pieces. Each fragment is encrypted with a different key, so there is no single point of failure if an unauthorized user gains access to a single key.
  • Scatter: Take the data fragments and scatter them across multiple separate storage locations, creating a virtual environment known as a “data harbor”. That makes it virtually impossible to locate all the pieces of a data set, much less break the encryption of each fragment.

Imagine taking important documents and storing them in a secure facility that is difficult to break into. Then you shred those documents (in a cross-cut pattern, of course!) and store some of the shreds in other secure facilities. Imagine how difficult it would be to reconstitute those documents: you would have to break into multiple places, find all the pieces across multiple locations, and then figure out which pieces belong to which documents. That’s what an NGDP approach does to protect your data.
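The encrypt-fragment-scatter idea can be sketched in a few lines of Python. This is a minimal illustration only, not a real NGDP implementation: the XOR with a random one-time key stands in for real encryption, and the in-memory dictionary of “sites” stands in for geographically separated storage. All names here (`scatter`, `reassemble`, `site-N`) are invented for the example.

```python
import os

def xor(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption: XOR with a random key of equal length.
    return bytes(b ^ k for b, k in zip(data, key))

def scatter(secret: bytes, num_fragments: int = 3):
    """Encrypt, fragment, and scatter a secret across separate 'locations'."""
    size = -(-len(secret) // num_fragments)  # ceiling division
    fragments = [secret[i:i + size] for i in range(0, len(secret), size)]
    locations, keys = {}, {}
    for i, frag in enumerate(fragments):
        key = os.urandom(len(frag))          # a different key per fragment
        locations[f"site-{i}"] = xor(frag, key)
        keys[f"site-{i}"] = key
    return locations, keys

def reassemble(locations, keys):
    # Only a holder of every fragment AND every per-fragment key can rebuild.
    return b"".join(xor(locations[s], keys[s]) for s in sorted(locations))

locations, keys = scatter(b"customer SSN: 123-45-6789")
assert reassemble(locations, keys) == b"customer SSN: 123-45-6789"
```

Note how an attacker who compromises one “site” gets a single encrypted fragment – valueless on its own, which is the whole point of the approach.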

Data Harbor

A data harbor is a collection of geographically and physically separated storage locations used to store data that has been fragmented into valueless pieces; no single location contains all the fragments needed to reconstruct the data into its original form. Like an actual harbor with a series of slips where different boats are docked when they’re not in open water, a data harbor separates the data into different locations, making it necessary to breach every one of them to even have a chance of getting to your organization’s data.

Four Characteristics of an Effective NGDP Solution

While a data harbor sounds secure, authorized users still need to be able to access that data in a secure manner when they need it – otherwise, it’s not useful. Here are four characteristics that an effective NGDP solution needs to have:

  • Multi-Factor Authentication (MFA): A secure solution requires two or more verification factors to gain access, so MFA is a “must have” for an effective NGDP solution today. According to Microsoft, your accounts are 99.9% less likely to be compromised when using MFA.
  • Transparency: Users of solutions today have enough to worry about without having to worry where their data is stored and how data fragments can be reassembled for use. An effective NGDP solution needs to ensure a transparent user experience with that data.
  • Low Latency: Users shouldn’t have to wait around for data fragments to be reassembled. In an effective NGDP solution, any latency in retrieving data must be negligible and unnoticeable.
  • Zero-Trust Security: In 2010, John Kindervag, then an analyst at Forrester Research, introduced the term “zero trust,” based on the idea that an organization shouldn’t trust any resource, whether inside or outside its network. End-to-end, zero-trust security has become a standard expectation of next-level security that denies access to applications and data by default. In today’s standard remote and hybrid work environments, a zero-trust philosophy helps ensure that security is about the data, not the location from which it’s accessed.
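The deny-by-default idea behind zero trust can be sketched as a simple policy check. This is purely illustrative, assuming an in-memory allow-list (`POLICY`) and a hypothetical `authorize` function; real zero-trust enforcement spans identity, device posture, and network signals on every request.

```python
from typing import NamedTuple

class Request(NamedTuple):
    user: str
    mfa_verified: bool
    resource: str

# Explicit allow-list: any (user, resource) pair NOT listed here is denied.
POLICY = {("alice", "payroll-db"), ("bob", "crm")}

def authorize(req: Request) -> bool:
    if not req.mfa_verified:     # verify identity on every request, every time
        return False
    return (req.user, req.resource) in POLICY   # default deny

assert authorize(Request("alice", True, "payroll-db"))        # allowed + MFA
assert not authorize(Request("alice", False, "payroll-db"))   # no MFA -> deny
assert not authorize(Request("mallory", True, "payroll-db"))  # not listed -> deny
```

The key design choice is that `authorize` can only return `True` via an explicit grant; there is no code path that trusts a request because of where it came from.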

Conclusion

Automating data loss prevention today requires a next-generation approach. An NGDP approach that effectively encrypts, fragments, and scatters your data for protection – while transparently and quickly reassembling that data when it needs to be used – is your best bet to avoid becoming a cybercrime statistic.

Many organizations are stuck between a rock and a hard place when it comes to protecting data and meeting their data protection obligations. An approach that combines best practices and programs with technology automation for privacy compliance and data loss prevention – protecting your organization’s (and your clients’) sensitive data – is your best bet to get “unstuck” from that position. Best practices + technology = data protection freedom!

For more regarding our automated data loss prevention (DLP) solution offered in conjunction with our partner Calumu, click here.

Read the full blog series here: Part 1    Part 2    Part 3    Part 4    Part 5    Part 6
