Migration Matters – Do the opportunities outweigh the risks of content migration?


migration-center team

June 16, 2021

Content migrations between ECM systems, file shares, databases, and other legacy applications are usually highly complex undertakings in a business context. Today we want to explore the challenges, but also the opportunities, that content migrations offer organizations.

For this we talked to Florian Piaszyk-Hensen. Florian has over 16 years of experience with content migration and, since 2005, has been in charge of the product department in which migration-center is developed and distributed.


Florian Piaszyk-Hensen – Director fme Products

Hello Florian, as we just mentioned, content migrations are complex projects. They can be executed manually or with the help of automated tools. Please tell us more about the different ways to approach a migration project.

Florian Piaszyk-Hensen: Depending on the individual objectives and requirements of the migration project, there are basically three possible approaches: start afresh, migrate some of the data, or migrate all of it. Organizations with limited legacy data often start afresh on a new ECM platform. Other organizations run an old platform side by side with a new solution. Those customers typically decide to migrate only the active parts of their legacy data required for current business operations to the new system and keep the inactive data in the old platform for compliance reasons. However, it is highly recommended to consider the future costs and impacts of this decision: all three options can become cost drivers and have the potential to endanger the introduction and acceptance of the new system.

Once customers have decided to kick off a migration project, they must plan how to migrate the data. Often, their service partner offers them a customized solution based on frameworks or open-source components. Unfortunately, customized solutions are mostly too static to be reusable, not flexible enough to react quickly to change requests, and not robust enough to guarantee a painless migration. Since they frequently suffer from poor performance and an inability to meet industry-specific regulations, such individual approaches entail tremendous risks and additional costs for the migration project.

Could you elaborate a bit on the content migration challenges you just mentioned? We would like to understand the difficulties organizations face when tackling such projects.

Florian Piaszyk-Hensen: To guarantee a painless migration project, a lot of challenges need to be addressed. To keep this answer short, let me focus on the main ones. The first challenge concerns the possible impact on daily business operations. Generally, customers cannot shut down their business for a few weeks just to migrate data to a new system. End users can only work properly if all involved systems keep running throughout the entire project.

Another critical aspect is the volume, complexity, homogeneity, and accuracy of the legacy data – the major drivers of the actual project duration. Especially with complex data structures or business applications, the pre-analysis phase and the coordination with the application owners are important but time-consuming elements of a migration project.

Finally, another difficulty is deciding on the right migration approach. Typically, there are three options: big bang, step-by-step, or delta/wave migration. To find the right approach, it is important to take all relevant aspects into account, because it is nearly impossible to change the approach during a running project without negatively affecting cost and duration.

That sounds like a lot of work, high expenses, and a lot of unpredictability. Depending on the industry, there are additional regulatory requirements that apply to content migrations. What do clients have to consider when planning a migration project in highly regulated environments?

Florian Piaszyk-Hensen: Yes, computer system validation (CSV) is especially crucial in highly regulated industries like life sciences, since the respective products impact public health and safety. To ensure that computerized systems and software do exactly what they are designed to do in a consistent and reproducible manner, both the European Medicines Agency (EMA) and the Food & Drug Administration (FDA) have published guidelines which affect Good Manufacturing Practices (GMP), Good Laboratory Practices (GLP), and Good Clinical Practices (GCP) and carry the force of law.

If data and records are migrated to another system, the migration should be planned, conducted, and verified accordingly to ensure the data set remains complete, consistent, and accurate over its entire lifecycle. The migration procedure should therefore be tested or confirmed before the data is transferred out of the system, and it should be validated that the data is not altered in value and/or meaning during the migration process (see Good Automated Manufacturing Practice (GAMP 5) and the EU GMP Guideline, Annex 11 for Computerized Systems).
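In practice, the "not altered in value" requirement is often verified technically with checksums: hash every document before export and compare the hashes after import. Here is a minimal, hypothetical sketch in Python – the directory layout and function names are illustrative assumptions, not part of any specific product or guideline:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large documents never sit fully in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_migration(source_dir: Path, target_dir: Path) -> list[str]:
    """Return the relative paths of files whose content changed (or vanished)
    between the source export and the target import."""
    mismatches = []
    for src in sorted(source_dir.rglob("*")):
        if src.is_file():
            dst = target_dir / src.relative_to(source_dir)
            if not dst.is_file() or sha256_of(src) != sha256_of(dst):
                mismatches.append(str(src.relative_to(source_dir)))
    return mismatches
```

An empty result list is then part of the documented validation evidence; any mismatch must be investigated before the legacy system is decommissioned.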

Data migration is not only about copying data from system A to system B; proper data quality must also be ensured. What is migration-center's approach to data quality enhancement and cleansing?

Florian Piaszyk-Hensen: Metadata are often incomplete, incorrect, or simply missing in the source system. Enhancing the quality of metadata during the migration process takes considerably more effort, because metadata must be added manually or external data sources are required to fill in the missing information. To protect the investment made in the ECM application and to sustain the integrity of the application's data, it is advisable to go the extra mile and enrich the content with meaningful metadata.

migration-center provides a large set of capabilities for improving data and information quality during a migration project. With individual metadata transformation and mapping capabilities, the integration of external information sources, and artificial-intelligence-based auto-classification modules, our product enables clients to clean up and improve the quality of their metadata. This leads to faster retrieval of information and improved compliance.
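To make the idea of metadata transformation and mapping concrete, here is a minimal Python sketch of declarative mapping rules combined with an external lookup table. It only illustrates the concept; it is not migration-center's actual rule syntax, and all attribute names and codes are invented:

```python
from datetime import datetime

# Hypothetical external lookup table, e.g. exported from a database or SAP.
DEPARTMENT_MAP = {"D01": "Quality Assurance", "D02": "Manufacturing"}

# Each rule maps one target attribute to a function of the source metadata.
# This mimics the idea of declarative transformation rules.
RULES = {
    "title":      lambda src: (src.get("object_name") or "untitled").strip(),
    "department": lambda src: DEPARTMENT_MAP.get(src.get("dept_code"), "Unknown"),
    "year":       lambda src: datetime.strptime(src["creation_date"], "%Y-%m-%d").year,
}

def transform(source_meta: dict) -> dict:
    """Apply every mapping rule to one source object's metadata."""
    return {attr: rule(source_meta) for attr, rule in RULES.items()}
```

Because the rules are data rather than hard-coded logic, they can be reviewed with the application owners and adjusted without touching the migration engine – one reason declarative mapping is preferred over one-off scripts.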

Now we have already talked about the numerous challenges of complex migration projects. Are there also positive aspects to data migration?

Florian Piaszyk-Hensen: Whether it is about implementing new business applications, replacing existing solutions, or decommissioning data, a migration project offers the opportunity to save costs, increase productivity, or simply modernize the IT infrastructure.

Many companies, for example, have run ECM applications for decades, and over all those years users typically create terabytes of »manually« classified documents in the existing ECM systems. A migration offers the opportunity to clean, harmonize, and enrich metadata, enhance security, and restructure the data according to current business needs.

According to Gartner, 60–80 % of the data in operational applications is inactive – and this also applies to documents in ECM systems. A migration project is therefore a good occasion to reduce the existing data volume by archiving inactive, non-business-relevant content in an enterprise archive.
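Such a data-reduction step usually boils down to a simple triage rule, for example "anything not modified for N years goes to the archive". A hypothetical Python sketch of that split (the three-year default is an assumption for illustration, not a recommendation):

```python
from datetime import datetime, timedelta

def partition_by_activity(documents, years_inactive=3, now=None):
    """Split documents into (migrate, archive) lists by last-modified age.

    `documents` is an iterable of dicts, each with a 'last_modified' datetime.
    Anything older than the cutoff is routed to the enterprise archive.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=365 * years_inactive)
    migrate, archive = [], []
    for doc in documents:
        (migrate if doc["last_modified"] >= cutoff else archive).append(doc)
    return migrate, archive
```

Real projects refine this with last-access dates, retention schedules, and record classes, but the basic shape – one pass that routes each object to the new system or the archive – stays the same.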


Data reduction: Migrating active content to a new ECM solution while archiving inactive content to an enterprise archive like OpenText InfoArchive

That sounds quite promising. What do companies need to pay attention to in order to benefit from the opportunities?

Florian Piaszyk-Hensen: First, a detailed analysis of the legacy application and data is essential. The project team needs a good understanding of the data structure and quality before it can make well-founded decisions for the upcoming migration project.

The second important step is to define the new classification and mapping strategy. In many cases, the mapping requires additional work steps to restructure the legacy data compared to a simple 1:1 mapping. If the quality of the source data is not as good as expected, data quality enhancement is required. This can be achieved by transforming the existing data, by including data from external sources (e.g. SAP, databases, or other applications), or by using modern classification technologies (e.g. artificial-intelligence classification services like AWS Comprehend).
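The enrichment step described here can be sketched as a two-stage lookup: consult an authoritative external source first, then fall back to content-based classification. In the hypothetical Python sketch below, a naive keyword matcher stands in for a real AI service such as AWS Comprehend, and all keys and labels are invented:

```python
# Hypothetical external source (e.g. an export from SAP or a database)
# mapping a business key to authoritative metadata.
EXTERNAL_INDEX = {"INV-2020-001": {"doc_type": "invoice", "customer": "ACME"}}

# Naive keyword rules standing in for an AI classification service;
# a real project would call the service's API instead.
KEYWORDS = {"invoice": "invoice", "contract": "contract", "report": "report"}

def classify(text: str) -> str:
    """Return the first matching label, or 'unclassified'."""
    text = text.lower()
    for keyword, label in KEYWORDS.items():
        if keyword in text:
            return label
    return "unclassified"

def enrich(meta: dict, content: str) -> dict:
    """Fill gaps in metadata: external index first, then classification."""
    enriched = dict(meta)
    external = EXTERNAL_INDEX.get(meta.get("business_key"), {})
    for attr, value in external.items():
        enriched.setdefault(attr, value)
    if "doc_type" not in enriched:
        enriched["doc_type"] = classify(content)
    return enriched
```

The ordering matters: authoritative sources should always win over probabilistic classification, which is only the fallback for documents no external system knows about.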

Customers planning to migrate legacy on-prem ECM applications to the cloud should carefully analyze whether all legacy data will still be relevant in the new system. Inactive data from old systems is better transferred to an enterprise archive than allowed to overload the new application.

Finally, customers should double-check right from the beginning which industry-specific rules and regulations they need to fulfill before executing a migration.

In my opinion, if customers manage to pay special attention to all these aspects during conception and project planning, the opportunities outweigh the risks of content migrations. My additional recommendation is to use professional out-of-the-box migration products like migration-center to reduce costs, manage risks, increase productivity, and speed up the go-live of new applications. Custom solutions, which tend to result in long-running projects or even project failure, usually increase the expense and complexity of a migration project unnecessarily.