The news release is fairly straightforward: "RSA Helps Global Corporations Collaborate Securely With New Release Of RSA® Data Loss Prevention Suite".
As press releases go, it's sort of ordinary-looking material -- what could possibly be exciting here?
And that's exactly what I wanted to share with you ...
Theme 1 -- Collaboration Is Becoming Business Critical
Given that I talk to a wide swath of the IT industry on a regular basis, I get to see how things become relatively more important over time. And collaboration environments that support newer forms of knowledge worker engagement -- well, that's definitely a rising theme in many industries.
That in itself appears to be the result of two related trends: an increasing preference for knowledge worker business models that are based on unique intellectual capital (rather than processing efficiency), and the need to collaborate using a variety of models, both across the enterprise, and -- more frequently -- outside of the enterprise.
The current thinking around email is just a harbinger of things to come. Add in all the content platforms, social platforms, portals, communities, instant messaging, microblogging, etc. -- you can see a rising tide coming from multiple directions.
As the degree of collaboration increases, associated risks can rise exponentially. The nightmare of the wrong information in the wrong place at the wrong time becomes a virtual certainty: people make mistakes, platforms fail -- having a really bad day becomes inevitable.
In this context, DLP technology can be strategic: creating the ability to collaborate with confidence.
If your organizational willingness to collaborate is hampered by the amount of risk you're willing to assume, DLP moves the boundary much farther out -- creating the ability to use extended collaboration modes to leverage human capital in new ways.

Here at EMC, we want to collaborate in more meaningful ways with people outside the company. But collaborating around the really cool stuff requires the assumption of additional risk. We look at DLP technology as a way to substantially reduce the risks of extended collaboration, which in turn enables entirely new collaboration propositions.
Theme 2 -- Enterprise Collaboration Is Headed To The Cloud
Consider this scenario: two or more companies want to collaborate on something. Where will the collaboration platform be located? Most likely, they'll want to do it on an external service -- a "cloud".
Or consider this: when enterprise IT types start listing the different environments for which they would consider an external service, "collaboration" usually makes the top 5. It seems to be clearly headed outside the firewall in some form.
In many cases, that's already happened. Here at EMC, almost all of our external collaboration platforms are done with outside service providers, although internal collaboration is still done on IT-owned assets.
And that's before we throw in all the Twitter, blogging, Facebook, etc.
Theme 3 -- Clouds Have To Be Better Than The Environments They Replace
I mentioned this before in a previous post. I see real traction in the market for cloud-based offerings that are better than what the IT people can typically do themselves. Yes, they're cost-effective, but that's just part of the appeal. They're "better IT".
Security (including DLP) doesn't have to be "good enough" -- it has to be better than what could be done using a traditional approach.
And here's where DLP shines in two important ways.
First, it can provide a level of information protection for cloud-based services that is very difficult to match in traditional environments burdened with a preponderance of legacy to deal with. Conveniently, most service provider clouds are remarkably legacy free :-)
Second, the "back end" of any DLP service (the monitoring, correlation, auditing, workflow, etc.) can be a heavy burden on a traditional IT organization, making it very attractive to out-task to a managed service provider.
Assembling The Pieces
So you can see the picture that's starting to emerge.
- A strong business mandate for a new generation of collaborative business processes, both inside and outside the firewall.
- A corresponding desire to manage new information risks associated with these new forms of collaboration.
- A strong architectural preference to consume these collaboration capabilities as an external service, rather than as a traditional IT implementation.
- A pronounced advantage for service providers to do better in this regard, simply because they (a) have a legacy-free environment easing the implementation of DLP (think everything in a VM), and (b) have the ability to better amortize the back-end over multiple tenants.
The cloud wins because it's better, not because it's cheaper.
If I were to make a personal list of "top ten cool technologies for enterprise IT", DLP would certainly make the grade. And, at first blush, it looks pretty simple: scan for unwanted data, and drive a workflow.
Well, enterprises are complex places, and nothing so simple is practical for any meaningful set of use cases. RSA has offered up an expanded way of thinking about desirable DLP attributes.
First, the simplistic notion of "look for unwanted data" has to be replaced by a more robust set of concepts around policy and classification of risk. These, in turn, can be very industry-specific, which leads to robust policies being developed by knowledge engineers.
To my way of thinking, when one acquires a DLP environment, most of the value comes in this very detailed and specific IP around specialized areas of managing information risk. Indeed, most of the engineers in our DLP group are best categorized as "knowledge engineers" who bring this very specific linguistic, vertical and regulatory knowledge into the product.
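To make the idea concrete, here's a rough sketch of what policy-driven classification looks like in principle. This is purely illustrative -- the policy names, patterns, and structure are my own invention, not RSA's actual (proprietary) policy format:

```python
import re

# Hypothetical policy templates: each pairs a risk category with the
# content patterns a knowledge engineer would encode for it.
POLICIES = [
    {"name": "US-SSN", "severity": "high",
     "pattern": re.compile(r"\b\d{3}-\d{2}-\d{4}\b")},
    {"name": "Credit-Card", "severity": "high",
     "pattern": re.compile(r"\b(?:\d[ -]?){13,16}\b")},
    {"name": "Internal-Project", "severity": "medium",
     "pattern": re.compile(r"\bProject\s+Nightingale\b", re.IGNORECASE)},
]

def classify(text):
    """Return the names of the policies a piece of content triggers."""
    return [p["name"] for p in POLICIES if p["pattern"].search(text)]

print(classify("Applicant SSN is 123-45-6789"))  # ['US-SSN']
```

The real value, as noted above, isn't in the matching mechanics -- it's in the hundreds of carefully tuned, industry-specific policies that populate a structure like this.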
Risks are directly correlated to the identities of the people involved, so without some robust mechanism to identify people -- their roles and associated risk profiles -- you end up with an ineffective solution.
Once risks are identified, the challenge moves to workflow -- not only presenting administrators with real risks and associated contexts, but automating as much as possible so that administrators end up dealing with interesting exceptions rather than floods of mundane alerts.
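The triage idea can be sketched in a few lines. Again, this is a simplified illustration of the concept, with made-up severity and confidence fields, not the actual product workflow:

```python
# Sketch of alert triage: auto-remediate the routine, well-understood
# matches so administrators only see the interesting exceptions.
def triage(alerts):
    auto_handled, for_review = [], []
    for alert in alerts:
        # Low-severity, high-confidence matches get an automated response
        # (block, quarantine, notify the user) with no human in the loop.
        if alert["severity"] == "low" and alert["confidence"] >= 0.9:
            auto_handled.append(alert)
        else:
            for_review.append(alert)
    return auto_handled, for_review

alerts = [
    {"id": 1, "severity": "low",  "confidence": 0.95},
    {"id": 2, "severity": "high", "confidence": 0.99},
    {"id": 3, "severity": "low",  "confidence": 0.40},
]
handled, review = triage(alerts)
print([a["id"] for a in review])  # [2, 3]
```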
Scalability also comes into play, and in multiple dimensions. Face it, you're scanning each and every piece of data in your environment. Done poorly, you'll build a new data center just for that purpose. Done right, the resource requirements can be relatively modest. There's also domain scalability -- being able to look at all forms of information, wherever they might be.
And, finally, it's important to drive as much commonality as possible in terms of policy definition, interpretation and execution. The RSA ecosystem includes not only its own products and adaptors, but those of the broader EMC as well. Add in VMware, Microsoft and Cisco, and I'd argue that we've got a "critical mass" in this regard.
Knowledge Engineering And Policy Templates
I thought this would be a great slide to share showing the relationship between these pre-packaged use case templates (over 150 as of this release) that not only cover broad horizontal information domains (e.g. HR, finance, etc.) but specialized vertical requirements.

And, of course, we've got an example of one of our knowledge engineers who keeps this policy library at the cutting edge of risk management. These skill sets are hard to find -- hence my earlier comment around the underlying differentiated value of DLP being in these embodied policies.
Many business leaders focused on risk mitigation also appreciate that these policies (and their supporting frameworks) are best-in-class solutions, so they can't be accused of not taking all reasonable measures to mitigate information leakage risks.
No simplistic pattern-matching algorithm will do here. To find information of interest, the RSA DLP suite has to employ a very wide range of techniques. Most of the enterprise focus is keeping good people from having a bad information day. All of us are fully capable of pushing the wrong information at the wrong time.
But there's also the possibility of bad intent. It turns out that human beings can be very skilled at hiding information when they're so motivated. So DLP has to be able to sniff out situations where information is being intentionally obscured or masked: compression, encryption, substitution of letters and words, and more. Intellectually, this is fascinating stuff.
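One publicly well-known heuristic for spotting compressed or encrypted payloads -- offered here as an illustration of the general idea, not as RSA's actual technique -- is byte-level entropy: ordinary prose is statistically predictable, while encrypted or compressed data looks close to random:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: near 8.0 for random/encrypted data, much lower for text."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

plain = b"quarterly revenue forecast " * 40
random_like = os.urandom(1024)  # stands in for an encrypted blob

print(shannon_entropy(plain) < 5.0)        # True: ordinary prose
print(shannon_entropy(random_like) > 7.0)  # True: likely encrypted or compressed
```

A file claiming to be a harmless document but exhibiting near-maximal entropy is exactly the kind of "intentionally obscured" situation worth a closer look.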
Too much information is also a bad thing. Generate too many false positives, and you'll easily overload the workflow bandwidth, raising the risk that important exceptions aren't fully acted upon. Indeed, tuning the environment to generate the appropriate level of back-end workflow is part of the "secret sauce" of any DLP implementation.
It's one thing for the CFO to be handling sensitive financial information. It's another thing entirely for a temp worker to be doing the same thing. The best source for real-time identity information in most corporate environments is Active Directory -- a useful starting point for building a more robust profile linked to what's found in AD.
AD also gives clues as to how people roll up in the organization. A surprising number of DLP notifications go to someone's immediate supervisor or manager, although serious stuff might involve legal, HR, etc.
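Here's a toy sketch of that routing logic. The in-memory directory below is a stand-in for Active Directory (a real deployment would query AD over LDAP), and the roles, field names, and escalation rules are my own simplifications:

```python
# Toy stand-in for an Active Directory lookup, keyed by user ID.
DIRECTORY = {
    "cfo":   {"role": "executive",  "manager": None,  "risk_profile": "trusted"},
    "temp1": {"role": "contractor", "manager": "cfo", "risk_profile": "restricted"},
}

def notify_target(user, severity):
    """Route a DLP notification: routine issues go to the user's
    immediate manager, serious ones escalate to legal and HR."""
    if severity == "high":
        return "legal-and-hr"
    manager = DIRECTORY[user]["manager"]
    return manager if manager else user

print(notify_target("temp1", "low"))   # 'cfo'
print(notify_target("temp1", "high"))  # 'legal-and-hr'
```

The same identity data also feeds risk scoring: a "restricted" profile handling CFO-grade material is precisely the mismatch a DLP system should surface.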
I know, this all sounds so Big Brother, but information is pretty important stuff, just like money. If someone tried to inappropriately liberate a big pile of money from an organization, we'd expect a stern response.
Well, information isn't that much different. In any organization, there are clear policies around who is allowed to handle various forms of money. What you're seeing is the same control frameworks being applied to the more liquid concept of "information".
The Magic Of Workflow
Not only do individual events need to be managed; overall patterns need to be monitored as well, as is true in other aspects of security. A large number of small events can be just as concerning as a single large event.
However, DLP events can end up being part of an even bigger picture. For example, enVision logs and correlates more traditional security events -- authentication failures, for example. Correlating "soft" security events and "hard" security events creates an even more compelling picture.
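The "many small events" idea can be sketched simply: aggregate each user's events over a time window, and flag when the cumulative exposure crosses what a single large leak would. This is an illustrative sketch with made-up thresholds, not how enVision actually correlates events:

```python
from collections import defaultdict

WINDOW_SECONDS = 3600      # correlate events within an hour
THRESHOLD_BYTES = 10_000   # cumulative exposure that triggers an incident

def correlate(events):
    """events: iterable of (user, timestamp, bytes_leaked). Returns the
    set of users whose windowed total crosses the threshold."""
    flagged = set()
    by_user = defaultdict(list)
    for user, ts, size in sorted(events, key=lambda e: e[1]):
        by_user[user].append((ts, size))
        # Keep only events still inside the sliding window.
        window = [(t, s) for t, s in by_user[user] if ts - t <= WINDOW_SECONDS]
        by_user[user] = window
        if sum(s for _, s in window) >= THRESHOLD_BYTES:
            flagged.add(user)
    return flagged

# Eight small leaks, five minutes apart: none alarming alone,
# but together they cross the threshold.
events = [("alice", t * 300, 1_500) for t in range(8)]
print(correlate(events))  # {'alice'}
```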
And, with the addition of the Archer GRC framework, the idea of an even higher-level risk assessment and management framework becomes very compelling.
Back To The Cloud
Most enterprise IT organizations know they need this sort of capability, but find it difficult to make the up-front investment in people, process and technology.
But imagine your cloud provider had this capability, and could provide it either as an add-on to hosted workloads (think collab apps running at the provider), or as a back-end service for traditional enterprise IT deployment models.
Would you find that an interesting proposition worth exploring?
I know I would ...