Unusual Use Cases for Serverless Part 2: Serverless to the Rescue: Revitalise Your Outdated Software (Without a Rewrite)
Greetings from Legacy Land
If you've ever wanted to scream when someone tells you, “oh, we can't change that because it's always been done this way,” then thinking about your systems in a different way to introduce new functionality might just be your solution. Serverless offers a way to inject modern functionality into your outdated software without the headache of a complete overhaul. In this second part of the Unusual Use Cases for Serverless series, we find out how to take advantage of this burgeoning tech to uplift systems to meet the modern requirements of businesses.
Introduction
Any organisation that has been around for a substantial length of time will tell you that they've developed software barnacles (going with a ship and nautical theme for this post), relying on legacy software which brings with it the usual thieves of time and performance: a lack of modern features, an inability to connect with newer systems, and high maintenance costs. However, outdated doesn't need to mean useless. Look at any enterprise and you'll see all sorts of systems churning away: a mainframe somewhere in a datacenter, or a crusty old server sitting in a room with a sticky note on it that says “do not turn off”. Finding anyone who can actually tell you how these systems work, let alone what they do, can be a nigh-on impossible task. Many people think of serverless as just another way to build REST APIs, but there are so many other great examples of using serverless tech to build successful systems. In this second part of the series, Unusual Use Cases for Serverless, I want to discuss how we can modernise legacy software and build a bridge between the old and new without a rewrite.
Use case scenarios
Let's jump into some scenarios, with examples, where you might find legacy systems that are ripe for modernisation.
Inventory & Ordering: Legacy inventory system gets real-time updates and triggers re-orders via serverless functions when stock is low.
Customer Communication: Customer data from a legacy CRM is fed into a serverless function to generate personalised email campaigns.
Scheduled ETL: Serverless function moves data daily between the legacy system and a cloud data warehouse for historical analysis.
I'm throwing ideas around here, but I hope these suggestions are starting to show how we could revitalise some of these old systems. Let's dive into the scenarios in a bit more detail. Since we covered customer communication in the previous post, let's focus on the other two for now.
Inventory & Ordering
The problem
Inventory updates in the legacy system are manual, stock-outs are discovered after the fact, and re-ordering is a slow process involving multiple systems or even paperwork.
How can we modernise?
A serverless function could be created which queries the inventory data in the system. Depending on the database in use, this could be invoked by changes to the relevant database tables or on a schedule. In Google Cloud, you can use time-based triggers with Cloud Scheduler to invoke a function on a cron schedule to query the data set; alternatively, if the database supports emitting events when a change happens, the function could respond to those changes directly.
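As a minimal sketch of that scheduled check (assuming hypothetical field names like sku, on_hand, and reorder_point, since a real legacy schema will differ), the core of the function is just a filter over the queried rows:

```python
# Hypothetical low-stock check: the core of a scheduled serverless function.
# Field names (sku, on_hand, reorder_point) are illustrative, not from any real schema.

def find_low_stock(rows):
    """Return items whose on-hand quantity has fallen to or below the reorder point."""
    return [r for r in rows if r["on_hand"] <= r["reorder_point"]]

# In a real deployment this would be the entry point Cloud Scheduler invokes,
# and `rows` would come from a query against the legacy database.
inventory = [
    {"sku": "TSHIRT-M", "on_hand": 3, "reorder_point": 10},
    {"sku": "JACKET-L", "on_hand": 40, "reorder_point": 15},
]
low = find_low_stock(inventory)
```

Everything around this (the database driver, connection handling, retries) depends on the legacy system in question; the point is that the serverless part can stay this small.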
Depending on the desired architecture of the system, you could have the function write to a messaging topic or a datastore, which could then trigger another function to place orders. Adhering to the single-purpose approach of serverless code, the downstream function listening to changes on that topic or datastore could generate orders to restock low levels, send alerts to teams that stock is low, or update another system, such as marking the item as low in stock on a website.
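A sketch of what that downstream, single-purpose function might look like, assuming a simple JSON message shape and an illustrative rule (always raise a reorder, and additionally alert when completely sold out); this is one possible design, not the only one:

```python
import json

# Illustrative downstream handler for a low-stock message pulled off a topic.
# The message shape and the reorder/alert rule are assumptions for this sketch.

def handle_low_stock(message_data):
    """Decide what to do with one low-stock event."""
    event = json.loads(message_data)
    actions = [{"action": "reorder", "sku": event["sku"]}]  # always restock
    if event["on_hand"] == 0:
        # Completely sold out: also alert the team (and perhaps flag the website).
        actions.append({"action": "alert", "sku": event["sku"]})
    return actions

sold_out = handle_low_stock(json.dumps({"sku": "TSHIRT-M", "on_hand": 0}))
```

Keeping the decision logic pure like this also makes it trivial to test outside the cloud environment.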
Benefits
In this example, using serverless functions to hook into these legacy systems offers a number of benefits to revitalise the workflows and processes, without the need for a rewrite! We get proactive stock management: by ordering automatically, or at least alerting on low stock, we can help reduce lost sales and improve the customer experience. We also reduce the workload on staff by automating a check or report they would otherwise run by hand, and we improve visibility into inventory levels across the org. It doesn't have to end there, though. Let's talk about the next example, Scheduled ETL, where we further improve this system.
Scheduled ETL
The problem
Valuable business insights are locked in the legacy database. Manual data extraction is time-consuming, prone to error, and limits the frequency of analysis.
How can we modernise?
Taking a similar approach to inventory and ordering, we could make use of scheduled functions which run on a cadence that makes sense for the business, be that nightly, weekly, etc., to query the legacy system and extract the specific data sets needed for analysis. It's worth running these queries at a time of day with low usage so as to minimise the impact, as some legacy systems do not cope well under load. This might be, for example, querying the past week's sales to understand trends.
The function could also perform some basic cleaning or transformation of the data, or, if a bigger transform is needed, the data could be piped to Cloud Pub/Sub where another service could run it through a data pipeline of some description. Finally, the data can land in a data warehouse such as Snowflake or BigQuery, where it can be analysed away from the business-critical system.
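Here's a rough sketch of the extract-and-transform steps such a nightly function might run; the row layout, the seven-day window, and the cleaning rules are all illustrative assumptions:

```python
from datetime import date, timedelta

# Minimal ETL sketch: the extract window and basic cleaning a nightly
# serverless function might perform. Row fields are hypothetical.

def last_week_window(today):
    """Return the [start, end) date range covering the previous seven days."""
    end = today
    start = end - timedelta(days=7)
    return start, end

def transform(rows):
    """Basic cleaning: drop rows with no amount, normalise SKU casing."""
    return [
        {"sku": r["sku"].upper(), "amount": float(r["amount"]), "sold_on": r["sold_on"]}
        for r in rows
        if r.get("amount") is not None
    ]

start, end = last_week_window(date(2024, 5, 8))
clean = transform([
    {"sku": "tshirt-m", "amount": "19.99", "sold_on": "2024-05-03"},
    {"sku": "jacket-l", "amount": None, "sold_on": "2024-05-04"},
])
# In production, the cleaned rows would then be loaded into BigQuery or
# Snowflake, or published to Pub/Sub for a heavier pipeline to process.
```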
Benefits
This approach of extracting, transforming, and then loading the data into a data warehouse enables teams to run reports quickly and easily, supporting up-to-date decision making. The automated process reduces the toil of doing this manually at odd times of the day, as well as the time spent actually doing the task. Additional benefits include enabling historical trend analysis which may not have been possible in the legacy system due to data extraction size limits or query limitations. This also opens up possibilities for more advanced analytics and machine learning down the track, enabling the organisation to take advantage of ML workloads and build AI capabilities around its organisational data.
This is all very nice, but why?
So we've discussed some of the benefits of implementing these use cases, but what else is there? Sell it to me! Ok, well, there are a number of further benefits we can discuss. With the approaches above, we can now start to do some interesting things with the newly available data, such as inventory availability and sales reports.
We can build some new serverless APIs, or really any kind of API, to query this data and surface trends to the users of our system. Here are some generic bullet points, so you can think about how they might apply to your system.
Extend Functionality: Add new features without touching legacy code.
Improved UX: Build modern interfaces on top of the old system.
Enhanced Integration: Connect legacy systems to the web, SaaS tools, etc.
Cost-Effective: Avoid expensive rewrites or replacements.
Gradual Modernisation: Incrementally modernise on your own timeline.
My favourite examples in these dot points are extending functionality and introducing new features without having to touch the existing system, enhancing the legacy system while gradually modernising it.
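For instance, a tiny handler behind one of those new serverless APIs might rank warehouse data and return trends; the data shape and the top-N-by-units ranking here are purely illustrative:

```python
import json

# Sketch of a small serverless API handler surfacing trends from the
# warehouse. Sample data and the ranking rule are illustrative assumptions.

def trending_items(sales, limit=3):
    """Rank SKUs by units sold and return the top `limit` as a JSON string."""
    ranked = sorted(sales.items(), key=lambda kv: kv[1], reverse=True)
    return json.dumps([sku for sku, _ in ranked[:limit]])

# In a deployed function, `sales` would come from a warehouse query and this
# string would be the HTTP response body.
body = trending_items({"TSHIRT-M": 120, "JACKET-L": 45, "SOCKS-S": 300}, limit=2)
```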
Let's use an example. Suppose the legacy system is a shopfront selling clothes. Previously, the application or website only allowed users to find items, buy them, and check out. That's fine, right? But it's being held back where more modern systems might add further functionality. Now that we've started improving our system with the serverless ETL pipelines and automatic ordering from upstream suppliers, we can offer new functionality for our internal and external customers! Some ideas include:
Implementing a “What's hot” section, showing trending items of clothing.
Clearing out winter stock with high inventory levels by putting certain items on sale.
Showing customers when stock levels are running low, with a label such as “Last few”, or even disabling selection of certain sizes when stock has run out.
Importantly, measuring the effect of these marketing efforts by viewing sales trends, now available in the data warehouse: trends can be analysed and compared for effectiveness using data that was previously locked away.
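The “Last few” and sold-out behaviours above boil down to a small piece of presentation logic; the threshold value here is an assumption for illustration:

```python
# Illustrative front-end helper: translate a stock level (surfaced via the
# new APIs) into the label and availability state described above.

def stock_badge(on_hand, low_threshold=5):
    """Map an on-hand count to a display label and whether the size is selectable."""
    if on_hand == 0:
        return {"label": "Out of stock", "selectable": False}
    if on_hand <= low_threshold:
        return {"label": "Last few", "selectable": True}
    return {"label": None, "selectable": True}
```

A website consuming the inventory API could call this per size to decide what to render, with `low_threshold` tuned per product line.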
Conclusion
While completely overhauling legacy systems might be what your developers are telling you to do, it's not always feasible from a time and cost perspective. Serverless offers a pragmatic solution, bridging the gap between old and new while delivering tangible business value. Legacy software can seem “too hard”, but I say don't let it hold you back: inject modern functionality in small, targeted ways. Legacy systems shouldn't be a ball and chain, limiting your creativity and locking you into a mindset of “well, it's always worked this way, why change it?”. Serverless offers a way to experiment and unlock the hidden potential within your existing infrastructure, in an extremely cost-effective way. Incremental improvements can be immensely valuable, but also consider how serverless can allow you to build entirely new experiences on top of your well-established foundations.