Data Genomes and Persistent Security – Protecting Information at its Inception

Opinion
Jan 28, 2010 | 3 mins
Data and Information Security | Identity Management Solutions | IT Leadership

Below is an excerpt from a paper I wrote back in May of 2008. The actual paper is linked below. DARPA, here I come: https://www.theregister.co.uk/2010/01/26/cyber_genome_project/

Data is flowing everywhere, in multiple formats and on multiple devices: replicated, backed up, modified, and stolen. Much like the line from Jeff Goldblum’s character in Jurassic Park, “life will find a way,” data will find a way to escape. Data wants to be free and unencumbered. Once released from the confines of the database, the data loses all semblance of control over its use and format. Anyone who receives it can clone it, divide it, and modify it until it loses its true essence and meaning. Regardless of its sensitivity, data takes on a life of its own, sometimes becoming the source for a brand-new structured database. This often occurs without validation of the data or verification of its authenticity and accuracy. It follows the garbage-in, garbage-out (GIGO) model, but there is one flaw in this age-old computing term: in practice it is garbage-in, gospel-out, since anything that comes from a structured entity is considered to be both accurate and true. The integrity of the data is not brought into question because its source was a structured database.

There is a belief among some optimistic souls that, in time, unstructured data will acquire structure of one form or another. This may come in the form of metadata (data about other data) or the ability to provide highly secure views into the data. Technology companies are devising and selling software that crawls the local area network, searching for sensitive information based upon pre-defined policies. The concept is to automatically classify the data once it is found, moving it to pre-defined storage locations according to rules based on its risk and criticality to the organization. The criticality may be regulatory driven or business driven. Regardless of the driver, organizations are looking at this method as a lifesaver. Combine this method with tools that define who has access to which folders, and you have a pretty good idea of who has access to the sensitive information. The problem with this method is that it does not get to the heart of the issue: it addresses security after the fact. It improves the situation, but it is nothing more than an influenza vaccine that may or may not work and, of course, must be renewed periodically as the threat to the host changes.
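The crawl-classify-move workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation: the detection patterns, classification labels, and destination folders are all hypothetical placeholders standing in for an organization's pre-defined policies.

```python
import re
import shutil
from pathlib import Path

# Hypothetical policies: a detection pattern mapped to a classification
# label and a quarantine destination. Real products ship far richer rules.
POLICIES = {
    r"\b\d{3}-\d{2}-\d{4}\b": ("restricted", "secure_store/restricted"),
    r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b": ("confidential", "secure_store/confidential"),
}

def classify(text):
    """Return (label, destination) for the first matching policy, else None."""
    for pattern, (label, dest) in POLICIES.items():
        if re.search(pattern, text):
            return label, dest
    return None

def sweep(root):
    """Crawl a directory tree, classify files, and move matches to storage."""
    moved = []
    # Materialize the listing first so moving files does not disturb iteration.
    for path in list(Path(root).rglob("*.txt")):
        result = classify(path.read_text(errors="ignore"))
        if result:
            label, dest = result
            Path(dest).mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(Path(dest) / path.name))
            moved.append((path.name, label))
    return moved
```

The sketch also makes the article's criticism concrete: the sweep only ever sees data that has already been written to disk, so any protection it applies arrives after the fact.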

New approaches must be developed that protect sensitive data at its inception, at its birth, and remain with it throughout its life. If we accept the premise that there are no longer perimeters in our corporate environments, then where should we spend our time and focus in protecting what is of value to us?
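One way to picture protection that attaches at a record's birth is to bind provenance metadata and an integrity tag to the data the moment it is created, so the "genome" travels with it. The sketch below is purely illustrative of that idea, not the paper's design: the key, field names, and JSON wrapping are all assumptions, and real key management is out of scope.

```python
import hashlib
import hmac
import json
import time

# Assumption for the sketch only; a real system would use managed keys.
SECRET_KEY = b"demo-key-replace-in-practice"

def create_record(payload, origin):
    """Wrap a payload with provenance metadata and an integrity tag at birth."""
    genome = {"origin": origin, "created": time.time()}
    body = json.dumps({"payload": payload, "genome": genome}, sort_keys=True)
    tag = hmac.new(SECRET_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def verify_record(record):
    """Check the record still matches the tag attached at its inception."""
    expected = hmac.new(SECRET_KEY, record["body"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["tag"])
```

Because the tag is computed over both the payload and its provenance, any later clone that alters either one fails verification, regardless of where the data has travelled.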

Read more here: Data Genomes and Persistent Security – Protecting Information at its Inception – May 2008 (posted January 2010)