6 major pitfalls to avoid in your content migration

Part I/II

Author
Thomas Berger
Project Leader @ fme AG

July 31, 2023

A content migration project is often more complex than it initially seems, and that is in the nature of things: a centrally used document management system involves so many key people and users that virtually every department has at least one person who must, or at least should, take on a role in the project.

An illusory project starting point

  • The management team and the (key) users in each department decide which new content platform is the best fit to replace the current one. Everyone agrees.
  • The users of every involved department are aware of their content and how it can be transferred to the new system. They know the requirements and what is important in terms of usage and metadata.
  • The IT department is familiar with the technical requirements, such as systems, storage space, and security, and can accommodate them: They are in control of the source system, set up the target system, and ensure a secure connection between the two for the actual content migration. In addition, they explain to the users what will change, point out the limitations of the new system, and teach them how to work with it properly.
  • The quality of the data is excellent in every aspect. Each document is structured, well organized, and has all the relevant metadata required for the business processes in question.

You know what I am getting at: in our experience, this ideal scenario never actually occurs in a document migration project. Deviations are the rule, and they will most likely lead to one of the following pitfalls.

Common problems with content migrations

1.) Underestimating project complexity

A common mistake, and not only in the migration context, is to underestimate the complexity of a project. You must have a thorough understanding of the business processes, the capabilities of the content management system, and ideally the technical details as well.

The solution here is to bring all stakeholders to the same table and conduct an in-depth analysis of all existing key points.

For this blogpost, we asked our most experienced migration experts what they have learned about estimating project complexity. Here are the most frequently mentioned misconceptions:

  • Customers underestimate the complexity of the migration and view it as simply “copying files from A to B”, rather than seeing the migration as an opportunity to rethink the way they manage content and improve the user experience.

  • Clients overestimate their own capabilities and resources, both of which are essential to carry out the migration project.

  • Often, the migration effort is mistakenly estimated based on the number of files and the data volume rather than on the complexity of the business processes.

  • Source and target systems have different functions and capabilities that must be aligned during the migration. Particularly complex source systems like OpenText Documentum require a lot of thought, design, and possibly coding on the target side, as sketched below.
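
To make this capability mismatch concrete, here is a minimal Python sketch of such a mapping layer. The attribute names are loosely modeled on Documentum conventions, but the mapping table, lifecycle states, and values are hypothetical placeholders, not a real API.

```python
# Minimal sketch: translate source metadata into the fields a flatter
# target system supports. All names and values are hypothetical.

SOURCE_TO_TARGET = {
    "object_name":     "Title",
    "r_creation_date": "Created",
    "department_code": "Department",
    "lifecycle_state": "Status",  # the target has no lifecycle concept
}

# Capability mismatch: collapse fine-grained lifecycle states into the
# simple status values the target system offers.
STATE_TO_STATUS = {
    "Draft": "In Progress",
    "In Review": "In Progress",
    "Effective": "Released",
    "Obsolete": "Archived",
}

def map_document(source_attrs: dict) -> dict:
    """Translate one source document's metadata into target fields."""
    target = {}
    for src_key, tgt_field in SOURCE_TO_TARGET.items():
        value = source_attrs.get(src_key)
        if src_key == "lifecycle_state":
            value = STATE_TO_STATUS.get(value, "Released")
        target[tgt_field] = value
    return target

if __name__ == "__main__":
    doc = {
        "object_name": "SOP-0042 Cleaning Procedure",
        "r_creation_date": "2019-03-12",
        "department_code": "QA",
        "lifecycle_state": "Effective",
    }
    print(map_document(doc))
```

In a real project, every entry in such a mapping table is a design decision that someone from the business side has to confirm – which is exactly where the hidden complexity lives.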

2.) Unrealistic expectations

This is a broad area, but it is vital to set expectations at a realistic level from the very beginning. These include expectations regarding the new content system, the migration process, the metadata conversion, and the migration strategy.

In order to maintain realistic expectations, continuous communication is key. Introduce the appropriate platforms early, highlight any limitations or issues up front, and report on as many steps in the migration process as possible.

What our experts say:

  • It is recommended to show the customer’s business users, at an early stage, examples of how the migrated data will look and behave in the target system based on the defined requirements and mapping specifications. A lightweight dry-run script like the one sketched below can support this.
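
One lightweight way to produce such examples, assuming the source metadata can be exported to a CSV file, is a dry-run preview script that prints a handful of records before and after mapping. The file name, field names, and mapping rules below are purely illustrative.

```python
# Hypothetical dry-run preview: show a few source records next to
# their mapped target metadata so business users can validate the
# mapping specification early.

import csv

MAPPING_SPEC = {
    "doc_title": "Title",       # source field -> target field
    "dept":      "Department",
    "doc_type":  "Category",
}

def preview_mapping(source_rows, spec, sample_size=5):
    """Print source vs. target metadata for the first few records."""
    for row in source_rows[:sample_size]:
        mapped = {tgt: row.get(src, "<missing>") for src, tgt in spec.items()}
        print("SOURCE:", row)
        print("TARGET:", mapped)
        print("-" * 60)

if __name__ == "__main__":
    with open("source_export.csv", newline="", encoding="utf-8") as f:
        preview_mapping(list(csv.DictReader(f)), MAPPING_SPEC)
```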

3.) Insufficient focus on data quality

Data is the bread and butter of our business, and that’s exactly why we have document management systems. However, the best DMS is only as good as its data quality. Usually, the quality of (meta)data deteriorates over time, but in everyday work this often goes unnoticed and does not pose a problem. It becomes very visible, though, during a data migration. Especially if the source is a file system or a similarly flexible system, the metadata structure often follows no scheme at all – or each department has its own.

To tackle this pitfall, a substantial pre-analysis of the data quality must be performed.

All too often, this step is overlooked, or it has been done before but not thoroughly enough. It is therefore imperative to re-examine the results and make appropriate decisions. The good news is that data quality issues can be overcome more or less easily with the right tools and techniques – provided, of course, that the analyses are done well and the subsequent procedures are well planned.

Since business processes as well as transformation rules and metadata mappings all depend on good data, this step is critically important.
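
As a rough illustration of such a pre-analysis, the following sketch assumes the source metadata has been exported to a CSV file. It counts empty values per column and lists the distinct spellings of one chosen field – often enough to surface the worst gaps. File and column names are hypothetical.

```python
# Minimal data quality report over an exported metadata CSV.

import csv
from collections import Counter

def analyze_metadata(path, check_field="department"):
    missing = Counter()    # empty values per column
    spellings = Counter()  # distinct values of the chosen field
    total = 0

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total += 1
            for col, val in row.items():
                if not (val or "").strip():
                    missing[col] += 1
            spellings[(row.get(check_field) or "").strip() or "<empty>"] += 1

    print(f"{total} records analyzed")
    for col, count in missing.most_common():
        print(f"  {col}: {count} empty ({count / total:.0%})")
    print(f"\nDistinct values of '{check_field}':")
    for val, count in spellings.most_common():
        print(f"  {val!r}: {count} record(s)")

if __name__ == "__main__":
    analyze_metadata("metadata_export.csv")
```

Even a report this simple tends to spark the right discussions: a department field with a dozen spellings for the same three departments is a mapping decision waiting to be made.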

Lessons learned from our experts:

  • Customers sometimes overestimate their own data quality.

  • Two of the biggest traps are insufficient quality of legacy data and missing metadata. Both can only be resolved by a detailed analysis of all source data, and the time, effort, and cost this analysis requires are usually either not estimated at all or underestimated.

  • Data quality is highly dependent on the data source. If you have a well-structured EDMS with thorough processes, the data quality is usually very good. In most other scenarios, the quality varies between acceptable and very poor, forcing the customer to decide whether to clean the data or to continue on the “garbage in, garbage out” principle.

Was this blogpost helpful to you? Stay tuned, the second part is coming soon!
It will give you practical insights into troubleshooting metadata mapping, IT, privacy and security, and scheduling challenges.