
Considering Cloud Repatriation? Don’t Forget Your Data!


Buyer’s remorse is a heck of a thing that we’ve all probably experienced at some point in our lives. You go into a store, see the latest shiny object and know that you just have to have it. You take it home, wear or use it for a few days, and then realize that it’s not everything you dreamed it would be. Before you know it, you’re bringing your purchase back to the store.

That’s fairly easy to do when it comes to returning a laptop or new pair of shoes, but it becomes a bit more complicated when pulling applications out of the cloud and back into an on-premises datacenter. And yet this process, known as cloud repatriation, is gaining steam as more organizations realize the limitations of the public cloud. In fact, IDC reports that 80 percent of companies are undergoing cloud repatriation activities.

Why cloud repatriation?


There are a few valid reasons why cloud repatriation has become so prevalent. Sometimes users end up with limited, sporadic, or impeded access to applications and data that could be more readily accessible in a private cloud environment. Enterprises managing big data workloads may discover that it’s easier to analyze that data in-house. They may also find that the public cloud, despite being convenient, is not necessarily the most cost-effective long-term solution.

In some cases, repatriation is a necessity. Certain pieces of data must be stored on-premises for better governance and control. For instance, organizations will likely want to keep their most highly sensitive data in-house.

The trick is finding the right balance between public and private clouds. For many organizations, that means keeping their feet in both.

The benefits of the hybrid cloud

As the pendulum swings back in favor of on-premises environments, many companies don’t want to return everything to the store. They know that some applications work better on-premises (for example, latency-sensitive workloads with SLAs that require constant connectivity, or applications requiring VPN connectivity), while others can safely exist in the public cloud without any issues. They want to enjoy the efficiencies of the public cloud while maintaining control over their data and ensuring that their applications are always available.

Thus, hybrid cloud deployments–where some applications are kept on-premises while others are hosted in the public cloud–are gaining in popularity. According to research firm MarketsandMarkets, the hybrid cloud market will be worth more than $97 billion by 2023, a CAGR of 17% over a five-year period. Hybrid cloud environments offer the best of both worlds by providing organizations with the right balance of applications and data in just the right places.

Managing a hybrid cloud requires an enormous amount of flexibility. Enterprises must be able to freely move their applications and data between their public and private clouds, and data must always be accessible to those who need it. This environment calls for a scalable storage infrastructure that supports seamless data migration between external and internal clouds.

Object storage as a support structure

Object storage may be the ideal support structure for hybrid cloud environments. It breaks data down into discrete units (“objects”) kept in a single flat repository, rather than in files or blocks, which provides virtually unlimited scalability.

Object storage also readily supports different deployment models with the same consistency and reliability. Both of these factors make it ideal for hybrid cloud deployments, though it should be noted that certain types of object storage solutions work better for public cloud workloads. For instance, solutions based on the S3 protocol can closely replicate the on-premises experience and simplify tasks that are performed on applications and data residing in the public cloud.

Moving data between clouds

Of course, even when data is broken down into smaller objects, moving information back and forth between clouds is often easier said than done. First, data can be difficult to move without adversely impacting applications or end users. Second, the manual transfer of data from one cloud to another can be a labor-intensive, time-consuming, and error-prone process.

Organizations should consider complementing their object storage initiatives with an abstraction layer that combines storage from multiple clouds into a single virtual storage unit. Enterprises shouldn’t migrate data unless absolutely necessary. An abstraction layer can make it easier to manage data wherever it resides.
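The abstraction-layer idea can be sketched in a few lines. This is an illustrative toy (all class and method names are hypothetical, and the in-memory backend stands in for real cloud and on-premises stores): a thin layer tracks which cloud holds each object, so applications read data wherever it resides instead of migrating it.

```python
# Illustrative sketch of a virtual storage layer spanning multiple clouds.
# All names are hypothetical; real implementations add replication,
# authentication, caching, and failure handling.

class VirtualObjectStore:
    """Presents several object stores as one virtual namespace."""

    def __init__(self):
        self._backends = {}   # backend name -> store with get/put methods
        self._locations = {}  # object key -> name of the backend holding it

    def register(self, name, backend):
        self._backends[name] = backend

    def put(self, key, data, backend_name):
        # Write to a chosen cloud and remember where the object lives.
        self._backends[backend_name].put(key, data)
        self._locations[key] = backend_name

    def get(self, key):
        # Read transparently; the caller never knows which cloud holds it.
        return self._backends[self._locations[key]].get(key)


class DictBackend:
    """In-memory stand-in for a real public-cloud or on-premises store."""

    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]
```

In use, an application registers a public and a private backend, writes sensitive data on-premises and the rest to the public cloud, and reads everything through one interface; no bulk migration is required when an object's home changes, only an update to the location map.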

The end result is an IT strategy that reduces or eliminates discontinuity between different cloud platforms. Enterprises can choose to use the public cloud based on their unique business needs, not their technical bandwidth. Or, they can opt to use a combination of public and private clouds. Either way, with the appropriate storage infrastructure, they can get rid of the remorse and rest assured that their data will always be available.

Peter Brey
Peter Brey is a principal product marketing manager for Red Hat Storage.