Moving legacy application architectures to the public cloud brings many challenges. In many cases, it's a question of finding a problem for the solution public cloud offers. On the surface, legacy application architectures don't appear complementary to public cloud design and consumption models. However, there are use cases where legacy applications fit naturally into the public cloud.
The initial challenge with public cloud and legacy applications is the consumption model. Public cloud presents an elastic consumption model: customers pay for the resources they consume. For cloud-native applications that auto-provision and release resources as demand fluctuates, the pricing model makes sense. Nearly all legacy applications, however, rely on static infrastructure. Applications such as Oracle databases and SAP don't have DevOps hooks into the infrastructure. If an Oracle database shrinks in size, the underlying storage doesn't return to the pool of resources. There isn't a model to reclaim the cost of unused resources.
Ephemeral use cases
However, there are elastic use cases around these applications. Test, development (Dev), break/fix, and disaster recovery (DR) are all examples of ephemeral environments. I served as the infrastructure architect for a Fortune 200 company's SAP infrastructure. It could take up to nine months to fulfill an application team's request for a new development environment. The effort required to clone data, provision hardware, and perform integration testing and security audits can't be overstated.
All of the effort and expense needed to provision and release these resources is a legitimate challenge. While the application's architecture isn't cloud-ready, the process calls for cloud-like agility. If an organization can figure out a method to automate the provisioning of infrastructure, data, and application testing, it benefits directly from that agility.
DevOps all the things
The practice of DevOps lays out the steps for leveraging public cloud services for ephemeral legacy application environments. The first step is to document the requirements and steps for provisioning these environments. A typical set of steps may look as follows:
- Step 1: Provision compute & network infrastructure
- Step 2: Clone Data
- Step 3: Install application
- Step 4: Configuration testing
- Step 5: End-user acceptance
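The steps above can be sketched as a simple orchestration pipeline. This is a hypothetical illustration: the function names, the `EnvironmentRequest` fields, and the pipeline structure are my own assumptions, not any specific tool's API. In practice each function would call out to cloud provider APIs, data-cloning tooling, and test harnesses.

```python
# Hypothetical sketch of the five provisioning steps as an automated pipeline.
from dataclasses import dataclass, field

@dataclass
class EnvironmentRequest:
    name: str                     # e.g. "dev-02"
    app: str                      # e.g. "SAP ECC"
    source_system: str            # system to clone data from
    completed_steps: list = field(default_factory=list)

def provision_infrastructure(req):
    # Step 1: create compute and network via the cloud provider's API
    req.completed_steps.append("infrastructure")

def clone_data(req):
    # Step 2: restore/clone data from the source system
    req.completed_steps.append("data")

def install_application(req):
    # Step 3: unattended application install
    req.completed_steps.append("application")

def configuration_testing(req):
    # Step 4: automated smoke and integration tests
    req.completed_steps.append("testing")

def user_acceptance(req):
    # Step 5: hand off to the application team for sign-off
    req.completed_steps.append("acceptance")

PIPELINE = [provision_infrastructure, clone_data, install_application,
            configuration_testing, user_acceptance]

def build_environment(req):
    # Run each documented step in order; a real pipeline would add
    # error handling, retries, and teardown on failure.
    for step in PIPELINE:
        step(req)
    return req

env = build_environment(EnvironmentRequest("dev-02", "SAP ECC", "prod-erp"))
print(env.completed_steps)
# ['infrastructure', 'data', 'application', 'testing', 'acceptance']
```

The value of writing the steps down this way is that each function becomes a unit you can automate, test, and reuse independently.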
Each step may break down into a dozen or more activities. Retrofitting these applications for automation is a far more difficult task than building automation into a new application. For example, if you leverage a specific naming convention for SAP data volumes (SAPDATA), chances are you've created these volumes manually in the past.
Cloud engineers must create a way to collect the variable information for the volumes and pass it to the automation tool that provisions them. The same challenges present themselves when automating test and integration.
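As a rough illustration of collecting that variable information, the sketch below derives volume specifications from a SAPDATA-style naming convention so an automation tool can provision them instead of an admin creating each volume by hand. The field names, mount-point layout, and sizes are assumptions for illustration, not a real tool's schema.

```python
# Illustrative: turn a naming convention into machine-readable volume
# specs that can be handed to a provisioning tool. All names/paths here
# are hypothetical.
def volume_specs(sid, count, size_gb):
    """Build a list of volume definitions for an SAP system ID (SID)."""
    return [
        {
            "name": f"SAPDATA{i}",
            "mount": f"/oracle/{sid}/sapdata{i}",
            "size_gb": size_gb,
        }
        for i in range(1, count + 1)
    ]

specs = volume_specs("DEV", count=4, size_gb=256)
for s in specs:
    print(s["name"], s["mount"], s["size_gb"])
```

The point isn't the specific format; it's that the convention an admin used to carry in their head now lives in code the automation tool can consume.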
Like all transformations, it's a journey, not a sprint to the finish line. Every organization needs a place to start, and in my opinion, abstraction of data is the best place to begin.
Cloning data from one environment to the next is one of the most difficult and time-consuming challenges of provisioning these ephemeral workloads. A typical process may include performing a database restore into the new environment and cleansing it of sensitive data.
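To make the cleansing step concrete, here is a minimal masking sketch, assuming rows come from a restored copy of production. Deterministic hashing keeps referential integrity (the same input always masks to the same token) while removing the real values. The field names are hypothetical, and this is a generic technique, not any vendor's implementation.

```python
# Minimal sketch: mask sensitive fields in cloned data with a
# deterministic token derived from a SHA-256 hash.
import hashlib

SENSITIVE_FIELDS = {"ssn", "email", "phone"}

def mask_value(value):
    # Same input -> same token, so joins across tables still line up.
    digest = hashlib.sha256(value.encode()).hexdigest()
    return f"MASKED-{digest[:12]}"

def mask_row(row):
    # Replace only the fields flagged as sensitive; pass the rest through.
    return {k: mask_value(v) if k in SENSITIVE_FIELDS else v
            for k, v in row.items()}

row = {"customer_id": "1001", "email": "jane@example.com", "ssn": "123-45-6789"}
masked = mask_row(row)
print(masked["customer_id"])   # unchanged: 1001
```

Production-grade masking tools go much further (format-preserving values, catalogs of sensitive columns, policy enforcement), but the core idea is the same: the clone never contains the real data.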
Gestalt IT hosted me at Cloud Field Day 3 (CFD3), where Delphix discussed how its solution automates the masking of sensitive data as part of the abstraction process. The platform also provides orchestration that assists in cloning data from one environment to another.
Delphix isn't alone in this market. Gestalt IT also hosted me at Tech Field Day 3, where Actifio presented a data abstraction solution aimed at legacy workloads.
These solutions offer a starting point for operationalizing public cloud for Dev, Test, and DR. During CFD3, Delphix customer Sai Adivi of Dentegra shared his company's journey to leveraging public cloud for a similar application.
All things equal
If your organization operates under a lift-and-shift-to-cloud mandate, these techniques may blunt some of the associated cost. While operating production applications in the cloud may cost more, taking a cloud-native approach to these non-production use cases may help keep operational costs under control.