SullivanStrickler’s tagline is “Providing access to the world’s legacy data”. Over the course of three interviews, we cover the growing problem of legacy data – not just the growing volumes, but the increasing expectation that organisations know what data they have and can find what matters – and the solutions offered by SullivanStrickler.
I open by summarising the problem: data is kept for good reasons (such as pending litigation or regulatory requirements) and for bad ones (no-one is responsible for managing it, and the lawyers have said “Keep everything” without thought as to the implications). Old formats and redundant systems make it near-impossible to comply with obligations whose breach increasingly brings financial penalties and corporate embarrassment. How can you say you have found everything relevant when you don’t know what you have? If you can’t assess the risk, how do you know what resources to apply to it?
The theme of the interview is set by Brendan Sullivan’s opening remarks describing the evolution of tape backup from its beginnings to the model established by SullivanStrickler.
The core technology, he said, is the ability to decipher how the original software stored the data during the backup process, and to render the resulting data in a format useful not only for legal and regulatory purposes but also within the enterprise itself. Requirements to manage information have surged, and while attention commonly focuses on the “reactive” drivers – legal and regulatory factors – companies are also seeking proactive solutions to the rapid growth in the amount of data stored.
The off-site storage model, Brendan Sullivan says, began as a short-term scheme of rotational backups which were moved off-site for security. The live data still existed and there was no particular need to have intelligence about the data or to apply to it any data analytics or business analytics.
That purpose was left behind as companies kept everything, to the point where they cannot make decisions about their data because they do not know what they have. Most of it was not wanted, was not useful or was, indeed, a liability. SullivanStrickler saw room for a new type of storage which provides analytics, remediation and the ability to produce data without the cost and delay of moving it back on site. Its analytics, at the level of metadata and backup sessions, are sufficient to enable decision-making without delving into the raw content.
The rest of the interview elaborates on this. Fred Moore addresses the question whether tape can be an alternative to the cloud as an archive medium (answer, yes), and describes the process used by SullivanStrickler as “reawakening the archive”.
Brendan Sullivan describes the first stage of the process as “selective migration”, reducing the number of tapes to manageable proportions. The result is available in review style in a web browser. He says:
“in one multi-stage process we’re migrating, we’re refreshing, we’re compressing, shrinking, we’re remediating or defensibly deleting, and we’re porting and creating data level intelligence to that unstructured archive at the same time”.
To me, the interesting thing about this interview is that it puts flesh on the (often glib) assertion that organisations must get rid of stuff they don’t need so that they can find what they do need. This is easy to say, but one rarely hears about the practicalities. What actually happens? Where is my remediated data, and how do I search it? How will it help me face the increasing demands, not just from courts and regulators, but from customers, employees, and others who are entitled to know what is held about them and then receive production of it?
The remaining interviews in this series elaborate on some of these points. Together they aid understanding not just of the problems faced by data-rich organisations, but of the potential solutions.