The disruption to workplaces and to teamwork during the peak of the pandemic spurred many organizations to propel their IT transformation projects higher up the business agenda.
As a result, we have seen initiatives to adopt more modern, cloud-based applications, content management repositories and collaboration platforms increase sharply over the last two years.
Microsoft SharePoint Online, accessed securely via the cloud, has seen a particular surge in popularity, offering dispersed teams a means of sharing and collaborating on data and documents fluidly from any location.
The move to Microsoft SharePoint also represents a chance to reduce dependence on legacy repositories and systems – those that have become unwieldy over the years and a headache to maintain.
In seeing cloud migration, and specifically SharePoint Online adoption, as the path to simplification and new agility, business functions often underestimate what will be involved in retaining (and ideally improving) the searchability and usability of content, and in preserving the richness of capability that had been built into their outgoing legacy systems over many years.
With all of this in mind, we have distilled some important advice for companies looking to embrace SharePoint Online to modernize the way teams work.
- Be aware that moving content between two very different systems can be challenging, and build cross-functional teams.
In my previous blog, I noted that content migration is rarely a straightforward ‘copy across’ exercise. It’s important, then, to establish what will be involved so this can be planned and budgeted for properly. A legacy-to-SharePoint migration is really about moving an entire business process.
As well as input from the immediate business function, specialist technical expertise will be required. That means people au fait with the old system; the new system; the network in between; the tools to perform the migration; and critical prerequisites such as data analysis, data preparation, and so on. It also demands the involvement of those responsible for IT strategy, and of those who will maintain the new SharePoint system.
From day one, it’s advisable to have in place a cross-functional team – distinct from business-as-usual operations and from the main IT team – which owns and is responsible for the entire migration, including its initial scoping.
- Analyze your source data and allow for fundamental differences between mature, legacy systems and the fresh new SharePoint environment.
The systems being retired may be large, inflexible and expensive to maintain, but their beauty up to now has been their honed ability to do a given task well. As soon as you try to move their sophisticated content management structures across to SharePoint Online, some immediate challenges may emerge. SharePoint won’t exactly match the capabilities (Relations or Renditions for example) of the outgoing system, it won’t look the same, and it won’t behave in the same way.
To determine how much work this will create during preparation and migration, a prerequisite is to scan and analyze the existing content. It will soon become apparent that the two systems also impose different restrictions – on path-name lengths, file naming and so on – and that the legacy system contains a significant amount of poor-quality data (content that is incomplete, filed incorrectly and/or named inconsistently, plus duplicated or defunct material).
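To make this concrete, a pre-migration scan of this kind can be sketched in a few lines of Python. This is a minimal illustration, not a migration product: the path-length limit and invalid-character set below are placeholders (check the current SharePoint Online restrictions for the values that actually apply), and `analyze_source` is a hypothetical helper name.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

# Illustrative limits only -- verify against current SharePoint Online
# documentation before relying on them.
MAX_PATH_LENGTH = 400
INVALID_CHARS = set('"*:<>?/\\|')

def analyze_source(root: str) -> dict:
    """Scan a legacy content export and flag items needing preparation."""
    report = {"too_long": [], "bad_chars": [], "duplicates": []}
    seen = defaultdict(list)  # checksum -> list of paths with that content
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        rel = str(path.relative_to(root))
        if len(rel) > MAX_PATH_LENGTH:
            report["too_long"].append(rel)
        if INVALID_CHARS & set(path.name):
            report["bad_chars"].append(rel)
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        seen[digest].append(rel)
    # Any checksum seen more than once indicates duplicated content.
    report["duplicates"] = [p for p in seen.values() if len(p) > 1]
    return report
```

A report like this gives an early, quantified view of the clean-up effort before any migration tooling is chosen.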
- Understand the target use cases and therefore the data model required in SharePoint, as the basis for defining the technical migration rules.
Once content has been assessed for its complexity, it becomes possible to determine what will be required in the form of a data model for the target SharePoint system, which in turn will inform the technical migration rules that will need to be specified.
Even within common functions such as Regulatory, Quality and Safety, use cases can vary enormously. It may be that SharePoint is going to be used primarily as a collaboration and document management platform, where business logic is fairly basic and sheer volume of incoming content is the main challenge. Or the target application(s) may involve fewer documents, but a lot of metadata, business logic, and inter-data relationships.
Each scenario will have a bearing on the scope and complexity of the migration, including the preparatory work that will be involved. In both cases, it will be important to seek expert help to (a) design the data model; and (b) establish and set up the best approach to the migration.
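As a simple illustration of what such technical migration rules might look like, the sketch below maps legacy metadata fields onto SharePoint columns. The field names, date format and transforms are assumptions invented for the example, not a real data model:

```python
from datetime import datetime

# Illustrative mapping from legacy metadata fields to SharePoint columns.
# Each rule: (legacy field, target column, transform function).
FIELD_RULES = [
    ("doc_title",  "Title",      str.strip),
    ("dept_code",  "Department", lambda v: v.upper()),
    # Assumed legacy date format DD.MM.YYYY, normalized to ISO 8601.
    ("created_on", "Created",
     lambda v: datetime.strptime(v, "%d.%m.%Y").isoformat()),
]

def apply_rules(legacy_item: dict) -> dict:
    """Transform one legacy record into the target SharePoint data model."""
    target = {}
    for src, dst, transform in FIELD_RULES:
        if src in legacy_item:
            target[dst] = transform(legacy_item[src])
    return target
```

Keeping the rules declarative like this makes them easy to review with the business function and to extend as new use cases surface.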
- Plan & package your migration carefully, based on data gathered in a pilot, accepting that the work is unlikely to fit neatly across a weekend.
In an ideal world, migrations would take place over a weekend or extended public holiday, so that users switch off on their last day of work and come back to a new, functional system on their first day back. Large migrations of legacy systems rarely fall into that category, which means a plan needs to be developed that fits around business-as-usual operations.
A high-volume migration is likely to involve hundreds of incremental stages across a significant period of time. The issue then becomes how to cater for content/data which continues to change in the meantime. The solution to this is typically a delta migration. This usually takes the form of a big-bang transition, in which most of the content is migrated, then a smaller delta catering for any data that has changed, followed by a final delta perhaps the day before the go-live.
The point is that all of this needs to be thought through from a business impact perspective, ahead of time. The delta will also have a bearing on the chosen migration technology. Certainly, it will be important to fully test the system performance and validate the migration approach and execution before finalizing the plan.
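The delta logic itself can be reduced to a minimal sketch: after the big-bang pass, each subsequent run selects only content modified since the previous cut-off, so every delta is smaller than the last. The item structure and timestamps here are assumed purely for illustration:

```python
from datetime import datetime

def plan_delta(items: list[dict], last_run: datetime) -> list[dict]:
    """Select only the content changed since the previous migration pass.

    Assumes each item carries a 'modified' timestamp from the source system.
    """
    return [item for item in items if item["modified"] > last_run]
```

Running this selection repeatedly – ending with a final delta just before go-live – is what keeps the moving target of live content under control.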
- Understand the limitations of the migration tools & interfaces available, and plan around them.
Although Microsoft provides some decent technical interfaces/APIs for migrating content to its cloud infrastructure (Azure), these come with some practical limitations.
For instance, its interfaces for large-scale migrations – which are designed to handle very high data volumes through optimally packaged content and to work around throttling – do not allow for the delta approach (later updates to content that has already been migrated). To ensure a robust migration, then, it makes sense to lean on experienced data migration experts who can bridge such gaps.
For projects with SharePoint Online as the target, fme delivers an entire migration service wrapped around Microsoft’s own APIs. This encompasses all of the early analysis, planning and specifications, then execution drawing on our honed best practices.
Our own rich toolset plays a vital role in all of this, in comparing legacy system content with the set-up in SharePoint, and is able to transform that data to fit the new target data model – using extensive rules that we have helped to describe, specify, set up and configure. Our highly sophisticated rules engine, meanwhile, expedites the set-up of complex rules, and the enrichment of content before it is migrated (using AI, as appropriate).
Crucially, we can run these rules as a simulation without any impact on the source or target system.
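As an illustration of the principle (not our actual tooling), a dry run of this kind boils down to applying the transformation rules to the source content and collecting results and failures, without writing anything to either system:

```python
def simulate_migration(items: list[dict], rules_fn) -> tuple[list, list]:
    """Dry run: apply transformation rules without touching source or target.

    Returns (transformed items, failures), where each failure pairs the
    offending item with the exception it raised.
    """
    transformed, failures = [], []
    for item in items:
        try:
            transformed.append(rules_fn(item))
        except Exception as exc:  # collect, don't abort: we want a full report
            failures.append((item, exc))
    return transformed, failures
```

A simulation report like this lets problem content be fixed – or the rules refined – before a single document is moved.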
This leads us neatly to a final, but critical, point: even with the most extensive and rigorous planning, migrations can still go wrong. That’s why it’s crucial to cement the execution plan with a series of small pilots (e.g. representing different teams/use cases) before going all in. As long as no steps are skipped, a simplified, SharePoint-enabled future lies just over the horizon.
To discuss an upcoming Microsoft SharePoint content migration project with our experts, just reach out to us.