Successfully migrating content within the life science industry: two exemplary projects

Author
Kevin Wehe
Associate Consultant @ fme AG

June 26, 2019

In the recent past, we had the opportunity to work on two parallel migration projects at one of the top five global pharmaceutical companies: “Dallas” and “Annapolis”.

Our approach – PoC

Our approach for every migration is to start with a PoC – Proof of Concept – to demonstrate that the project can be delivered within the estimated time and budget. Most importantly, we can examine the source and target repositories and the metadata itself to identify possible obstacles before the main project. Following this approach, it is also possible to offer fixed-price projects if your company requires them.

During the PoC, we set up a development system for our tool of choice – migration-center – to scan the data. migration-center performs no data manipulation within the source system, and the scan can run while the system is still up and running. This allows users to remain productive until shortly before the actual migration date, when a system freeze becomes necessary to prevent data loss.
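To get a feeling for the scope early on, a simple DQL (Documentum Query Language) query against the source repository is often enough; the cabinet path below is purely illustrative:

  SELECT count(*) AS doc_count
  FROM dm_document
  WHERE FOLDER('/Quality Documents', DESCEND)

The same kind of count also helps later to verify that the numbers of documents in source and target match after the migration.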

Facing the main project and its challenges

After the PoC was completed, we started the main project, in which we could reuse the rules and mapping definitions already implemented during the PoC. We scanned the full scope of documents and analysed the metadata for potential problems.
For the Annapolis project, the scope of the migration was 2.6 million documents and approximately 11 million corresponding audit trails. The source repository was, in this case, a little different from what one would expect: we had to scan two different repositories and consolidate the data into one single Documentum D2 LSQM (Life Science Quality Management) repository. Another challenge was the geographic distance between the source and target repositories. During the PoC, we had determined that scanning over the network was not feasible due to high latency. To solve this problem, we placed a migration-center instance in the same network segment as the source system. After this initial scan, the server was transferred to the target location. From there, we implemented our rules and ran the first import iterations. For the quality and production environments, we utilized the delta capabilities of migration-center together with its D2 connector.
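The delta mechanism picks up only objects that have changed since the previous scan. Conceptually, this corresponds to filtering on the modification date in DQL; the path and cutoff date below are again only illustrative:

  SELECT r_object_id, object_name, r_modify_date
  FROM dm_document
  WHERE FOLDER('/Quality Documents', DESCEND)
    AND r_modify_date > DATE('01/03/2019 00:00:00')

This way, the bulk of the content can be moved ahead of time, and only a small delta has to be migrated during the final cutover window.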

In contrast, the Dallas migration was smaller in total document count; however, there were challenges due to some delicate details in the business requirements. We migrated around 110k documents and 2.5 million audit trails from an older D2 version into the new D2 4.7 version. To speed up the import, we chose to go through the OpenText Documentum layer instead of the D2 layer, where we could have used its business logic.

This approach has clear performance benefits, but it needs to be considered carefully because, as mentioned, the implemented business logic is bypassed: we therefore have to ensure ourselves that the documents end up with the appropriate attribute values.
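A simple way to spot-check this is a DQL query over the imported scope that looks for attributes the D2 rules would normally have filled. The status attribute and folder path below are just an example of such a check:

  SELECT r_object_id, object_name
  FROM dm_document
  WHERE FOLDER('/Imported Documents', DESCEND)
    AND a_status IS NULLSTRING

An empty result set indicates that the import rules have set the value everywhere; any hits point to documents that need to be corrected before validation.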

Audit trails

I would briefly like to talk about the audit trail import during the projects, because you can choose between two different approaches to import these records. The first and most commonly used approach is to import the audit trails together with the documents. The audit trails then end up in the dm_audittrail table and are available through the system overview. The second approach is to extract the audit trails from migration-center and import them via a DQL script into a legacy audit trail table. The benefit is that this table separates the old audit trails from those created in the new system, which in turn noticeably improves the performance of audit trail searches. In most ECM system customizations, the legacy audit trail table can be exposed so that auditors from the authorities can review the old history of a document.
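As a minimal sketch of the second approach: assuming a registered table – here hypothetically named legacy_audittrail with four columns – the extracted records can be loaded and queried with plain DQL:

  INSERT INTO dm_dbo.legacy_audittrail
    (object_id, event_name, user_name, time_stamp)
  VALUES
    ('0900000180001abc', 'dm_checkin', 'jdoe', DATE('01/15/2015 10:30:00'))

  SELECT event_name, user_name, time_stamp
  FROM dm_dbo.legacy_audittrail
  WHERE object_id = '0900000180001abc'
  ORDER BY time_stamp

The table name, columns, and values are illustrative; the real layout depends on the customer's audit requirements and customizations.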

Making sure your QA manager is happy

For the validation of computerized systems, the GAMP 5 (Good Automated Manufacturing Practice) guideline is applied to ensure that the system works as intended. During the migration project, we therefore develop IQs (Installation Qualifications) and OQs (Operational Qualifications) to ensure that the subsequent PQ (Performance Qualification), sometimes referred to as UAT (User Acceptance Test), passes without any errors.

For both the ‘Annapolis’ and ‘Dallas’ projects, this documentation was provided, and the migrations in the quality and production environments were executed following these documents. As a result, no migration-related errors occurred, and we finished both migrations successfully.

Both projects, which were executed in parallel by our migration experts, started on November 1st, 2018 and finished on the estimated date, March 15th, 2019. Both systems are live and used globally by users across the customer's sites.

If you need to decommission a legacy application – be it due to high license costs, a change in corporate strategy regarding document management systems, or old infrastructure where an upgrade is no longer feasible – contact us now and let our migration experts solve your data migration problems.