In the age of digitalization, the technical possibilities for collecting, storing, analyzing, and sharing information have taken on new dimensions. However, digitalization also drives rapid growth in data volume and produces an increasing amount of inactive or obsolete data. Separating active from inactive data therefore becomes a necessary step in any content management strategy, and the need for a solid data archiving solution is as relevant as ever.
For this we talked to Antje Dwehus, Principal Consultant and archiving expert at fme group since 2006.
Hello Antje. According to the experience gained during your decommissioning and archiving projects at fme group, what are common triggers for retiring an existing software application?
Antje Dwehus: To illustrate the need and the reasoning behind the triggers, I often draw a comparison to driving an old car. Of course, at first a well-maintained car does the job; it gets you from A to B. But when you look under the hood, there are disadvantages and high risks involved.
The same applies to software. Outdated software often incurs high maintenance costs, and it is difficult to find experts when problems arise. Current security standards and regulations cannot be met, which leads to high risks. Performance also often becomes a problem, since the software was not designed for today's data volumes. In addition, innovations cannot be integrated, which severely impacts competitiveness in the digital world. Overall, the older the software gets, the more unpredictable, expensive, and risky it becomes.
Thus, a well-thought-through archiving strategy is paramount, not only to meet corporate and regulatory compliance requirements and e-discovery use cases, but also to keep costs and risks under control.
These are convincing reasons to rethink one’s archiving strategy. If a company finally makes the decision to move data under retention to an enterprise archive, what makes a sophisticated archiving strategy in your opinion?
Antje Dwehus: In my experience, the most important factor is to analyze the data before archiving it. Don't assume that a simple 1:1 migration to an archive will bring benefits. It will only shift the problem to another system.
When dealing with a complex system, we sometimes need to dig deep: define the archiving units (in other words, the object types and data structures) and document the applicable regulations. Next, we determine what is needed and what can be disposed of. Besides the actual data, roles and processes must also be considered. A sophisticated archiving strategy includes, for example, the role of a retention manager who is responsible for managing retention policies and legal holds and for approving purge lists.
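To make the retention manager's decision concrete, here is a minimal sketch of a purge-eligibility check. All names, object types, and retention periods are illustrative assumptions, not part of any specific product or regulation:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch: a record may be purged only when its retention
# period has expired AND no legal hold applies. Types and periods below
# are assumed examples.

@dataclass
class ArchivedRecord:
    record_id: str
    record_type: str          # the "archiving unit" / object type
    archived_on: date
    legal_hold: bool = False  # set and released by the retention manager

# Assumed retention periods per object type, in years
RETENTION_YEARS = {"invoice": 10, "hr_file": 6, "correspondence": 3}

def is_purge_eligible(rec: ArchivedRecord, today: date) -> bool:
    """Check retention expiry and legal holds before purging."""
    years = RETENTION_YEARS.get(rec.record_type)
    if years is None:
        return False  # unknown object types stay in the archive
    expiry = rec.archived_on.replace(year=rec.archived_on.year + years)
    return today >= expiry and not rec.legal_hold

records = [
    ArchivedRecord("r1", "invoice", date(2010, 5, 1)),
    ArchivedRecord("r2", "invoice", date(2010, 5, 1), legal_hold=True),
]
# Candidate purge list for the retention manager to approve
purge_list = [r.record_id for r in records if is_purge_eligible(r, date(2025, 1, 1))]
print(purge_list)  # ['r1'] -- r2 is past retention but blocked by a legal hold
```

The point of the sketch is the separation of duties it implies: policies and holds are data that the retention manager controls, while the purge job only evaluates them and produces a list for approval.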
That sounds great but also complex, considering the amount of data within large organizations. The process of transferring data into an archiving solution becomes a lot easier when using a smart ETL tool. Why do you think a good ETL tool is key to successful decommissioning projects?
Antje Dwehus: Without an ETL tool, the risk of errors during the migration is high. A project-specific custom implementation is always error-prone and should be avoided in migration projects. Especially when handling data under retention, data integrity, proof of completeness, and the chain of custody are essential. Good migration software includes these features, as well as the ability to enrich metadata and to generate reports and documentation of the migration process.
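The integrity and completeness checks mentioned above can be sketched in a few lines. This is an assumed, simplified illustration of the idea, not the API of any particular migration tool:

```python
import hashlib

# Illustrative sketch: fingerprint every object before export and after
# import, then compare. An empty problem list is the "proof" that the
# migration is complete and nothing was altered in transit.

def checksum(data: bytes) -> str:
    """SHA-256 fingerprint of an object's payload."""
    return hashlib.sha256(data).hexdigest()

def verify_migration(source: dict[str, bytes], target: dict[str, bytes]) -> list[str]:
    """Return a list of problems; empty means complete and intact."""
    problems = []
    for obj_id, payload in source.items():
        if obj_id not in target:
            problems.append(f"missing in target: {obj_id}")   # completeness
        elif checksum(payload) != checksum(target[obj_id]):
            problems.append(f"checksum mismatch: {obj_id}")   # integrity
    return problems

source = {"doc-1": b"invoice data", "doc-2": b"contract data"}
target = {"doc-1": b"invoice data", "doc-2": b"contract data"}
print(verify_migration(source, target))  # [] -> complete and intact
```

In a real project the same comparison report, stored alongside timestamps and the identity of whoever ran each step, is what provides the documented chain of custody.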
Having talked about triggers of decommissioning projects and the best approach for such challenges, what is a popular archiving solution out there and how is it suited to different decommissioning goals?
Antje Dwehus: We believe OpenText InfoArchive is one of the best decommissioning solutions on the market because it is a scalable enterprise archive designed in accordance with open standards to enable effective lifecycle management of large volumes of static data and content generated by any business application. It offers a comprehensive set of compliance features that enables customers to manage information in compliance with even the strictest regulations, including retention policy management, legal holds, purging, encryption, data masking, and full auditing of user activity. In addition, it provides intuitive access to data and content.