What's one of the most often-cited aspects of digital transformation? The need to migrate from legacy technology... And it's not difficult to see why. Legacy systems are a threat to organisations because they constitute technical debt – they are hard to maintain, often present a security risk, and cannot flex to changing business needs. In this context, digitally native competitors have a huge advantage.
But what exactly does migrating legacy tech mean? Is it necessarily a move to the cloud? What are the risks involved?
Here we examine the benefits and challenges of legacy migration, and how, practically, organisations can take on this challenge.
Cloud vs legacy migration
According to TPXimpact CTO Stuart Arthur, when we talk about cloud migration and legacy migration, we're often talking about the same thing.
“These days anything that's not hosted in the cloud creates risk and holds you back,” he says. “If they don't make use of the cloud, organisations lose out on the benefits of resilience, scalability, and speed of development. So cloud and legacy migrations are intrinsically linked for the most part. There are businesses that don't adopt cloud for good reason, but they're the exception rather than the rule.”
Two different strategies and a middle ground
Stuart outlines two different migration strategies for organisations upgrading their legacy systems, depending on their particular set of circumstances and aims.
The first and most basic is a “lift and shift” approach.
“With lift and shift, the application that's on-premise doesn't get changed much,” he says. “The migration process involves reviewing that application and seeing if any changes are needed to get it working on a cloud platform like Azure, Amazon Web Services, or Google Cloud Platform. You would then typically set up virtual machines (software emulations of physical servers) that mimic the structure of the on-premise system.”
This route suits organisations that need to transition to the cloud as quickly as possible, so that they no longer have to manage their legacy estate, or that have a commercial driver such as a licence renewal. Taking the same application and making it work on the cloud provides some benefits, even though the system still runs on virtual machines.
The second option is more comprehensive: it involves scrapping the on-premise application entirely and building a new, cloud-native solution in its place.
“We'd rewrite a new application on the cloud, using the native functions and capabilities of the cloud platform,” Stuart says. “It does all the same stuff that the on-premise application used to do, but now it's utilising things like serverless functions — it's completely modern technology that can be adapted more easily.”
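To make “serverless” concrete: in a cloud-native rewrite, an endpoint that once lived on an on-premise server can become a small function the platform runs on demand. Below is a minimal sketch in the AWS Lambda style — the event shape, field names, and handler are illustrative assumptions, not code from any project described here:

```python
# Hypothetical sketch of a serverless function replacing an on-premise
# endpoint. The platform runs it on demand and scales it automatically;
# there are no servers for the team to patch or manage.
import json


def handler(event, context):
    """Read a customer id from the (assumed) request event and return a
    JSON response in the shape an API gateway would expect."""
    customer_id = event.get("pathParameters", {}).get("id", "unknown")
    return {
        "statusCode": 200,
        "body": json.dumps({"customer": customer_id, "source": "cloud-native"}),
    }
```

Because functions like this are small and independent, they can be adapted or replaced individually — one of the flexibility benefits Stuart describes.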
This is usually the most effective solution, both in terms of cost and in creating a robust, flexible technology system that is easy for technical teams to work with. This approach also presents an opportunity to review the application against user needs and provide a better overall experience.
If there is a middle ground, Stuart suggests that it lies in transitioning to the cloud over a longer period of time, which is particularly relevant when migrating larger monolithic platforms or applications.
“You could gradually rewrite certain parts of an on-premise platform and get that working on the cloud, continuing to do this over time. It's a bit more of a hybrid approach,” he says. “You might choose this option if you need to get new digital products live that depend on a larger legacy platform, or you want to start realising the benefits of the cloud as early as possible.”
Navigating the situation
One of the main difficulties in transitioning from legacy technology is the need to keep critical systems running while the migration takes place. A risky, “big bang” approach is usually out of the question — teams should instead proceed in a much more iterative way, building smaller API-based services and realising value as early as possible.
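This iterative, API-based approach is often called the “strangler fig” pattern: a thin routing facade sends already-migrated routes to the new cloud services, while everything else still reaches the legacy platform, so critical systems keep running throughout. A minimal Python sketch — the route names and backend URLs are illustrative assumptions:

```python
# Hypothetical strangler-fig routing facade: migrated routes go to the
# new cloud services, everything else falls through to the legacy app.
MIGRATED_ROUTES = {"/invoices", "/customers"}  # already rewritten in the cloud


def route(path: str) -> str:
    """Return the backend URL that should serve this request path."""
    prefix = "/" + path.lstrip("/").split("/", 1)[0]
    if prefix in MIGRATED_ROUTES:
        return "https://api.example.cloud" + path  # new cloud service
    return "https://legacy.internal" + path        # untouched legacy platform
```

As each part of the legacy system is rewritten, its route moves into the migrated set, until nothing points at the old platform any more.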
Another challenge is the huge variety of legacy systems out there, each with unique structures, constraints and workarounds accumulated over the years. This means there is no single route for legacy migration, and it's often hard to understand the business logic when the application code is poorly structured.
What happens in practice?
Although the details are highly context dependent, Stuart shares a recent legacy migration project as an example of what the journey might look like.
“First, we did a review of the platform,” he says. “It was on-premise, needed a lot of servers to run it, and because it was proprietary and out-of-date it needed a specialist team to work with it, which created a bottleneck in the transformation programme – with any changes having to go through one party. In our review, we pointed out the risks and the opportunities, including things like cost of change, ways of working constraints, and security issues.”
“We then did a Discovery phase to evaluate the different options. One option was to migrate the platform as it was to the cloud. The problem with that approach was that it would have needed even more server resources than before – at great cost – but the code would still be hard to work with and require specialist teams behind it.”
“The second option was to write the integration layer from scratch on the cloud using native cloud capabilities. This proved to be much more cost effective — it didn't require as much server resource, there were no licence costs, and any team could work on it.”
“We did a quick proof of concept of this method, took the client through the data that proved it was really cost efficient and value adding, and helped them to build the business case to go ahead with the project.”
A thoroughly modern approach
By migrating legacy technology to the cloud, organisations benefit from the powerful, specialist inbuilt functions of cloud platforms. This is particularly important when it comes to security, with cloud providers offering services such as secure API gateways and firewalls to protect systems and data.
It also gives organisations all the advantages of writing code and building systems in modern ways.
“With cloud native applications you can hook code up to continuous integration and delivery pipelines, and more easily test it for security, which gives a business the assurance that any change is being quality checked,” says Stuart. “It's fully auditable and automated, unlike legacy tech, where quality is very hit and miss: platforms are very tightly coupled, so if you change one thing it breaks something else.”
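As an illustrative sketch of the kind of pipeline Stuart describes (not any specific project's configuration), a minimal continuous integration workflow in the GitHub Actions style might test and security-scan every change automatically; the job names and `make` targets are assumptions:

```yaml
# Hypothetical CI pipeline: every pushed change is built, tested and
# security-scanned automatically, producing an auditable quality gate.
name: ci
on: [push, pull_request]
jobs:
  quality-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run unit tests
        run: make test    # assumed project test entry point
      - name: Dependency security scan
        run: make audit   # assumed wrapper around an audit tool
```

Because the pipeline runs on every change, the audit trail Stuart mentions comes for free: each commit carries a record of the checks it passed.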
“With a modern, modular approach you can also have multiple teams working on different parts of the integration layer, so the system is easier to maintain and adapt," he adds. "There really are so many benefits to be found in migrating from legacy systems.”