
Are You Wasting Money in the Cloud? Public cloud repatriation starting off in 2023


databloom - January 4, 2023

Public cloud displacement is a subject we don’t discuss as often as we ought to. Many see relocating data and apps back from a public cloud provider to an enterprise data center as an admission that the initial decision to move the workloads to the cloud was a grave error. In my opinion, it is less a failure than a change of hosting platform driven by the current state of the economy. People frequently return to more conventional platforms because of the high expense of cloud computing. You surely remember the article about Dropbox (1), explaining why it left its public cloud infrastructure in favor of a private cloud approach.

The transfer of workloads from public clouds to local infrastructure environments, usually via a private or hybrid cloud deployment strategy, is referred to as cloud repatriation.

It’s just business. It is unavoidable that some workloads need to be returned to a conventional data center or private server hosting farm for financial reasons.

First, businesses are running up against a zoo of everything, which results in an ultra-complex pick-and-try scenario. Because of the sheer number of services they have enrolled in, they are unable to operationalize them with the funds and resources they currently have. According to recent surveys, “multicloud usage is getting increasingly cumbersome, and prices are challenging to control across hybrid systems.” I won’t go into too much detail because I’ve already discussed these issues extensively.

Second, recurring cloud expenditures are startling the majority of businesses. The costs are far higher than anticipated, despite just 20% to 30% of enterprise workloads being on public clouds. One cause is a lack of planning, which leads to suboptimized cloud solutions. Another is that there is often no accountability and no financial restraint, so FinOps is effectively nonexistent.
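To make “accountability” concrete, here is a minimal FinOps-style sketch. It assumes you can export billing line items with an owner tag; the record layout and field names below are hypothetical examples, not any provider’s actual billing format.

```python
# Minimal FinOps-style roll-up: attribute each billing line item to an
# owning team via tags, and surface spend that nobody owns.
# The record layout ("service", "cost", "tags") is a hypothetical example,
# not a real provider's billing export schema.
from collections import defaultdict

line_items = [
    {"service": "object-storage", "cost": 1200.0, "tags": {"team": "analytics"}},
    {"service": "compute",        "cost": 5400.0, "tags": {"team": "web"}},
    {"service": "managed-db",     "cost": 2300.0, "tags": {}},  # untagged spend
]

spend_by_team = defaultdict(float)
for item in line_items:
    owner = item["tags"].get("team", "UNALLOCATED")
    spend_by_team[owner] += item["cost"]

for team, cost in sorted(spend_by_team.items(), key=lambda kv: -kv[1]):
    print(f"{team:12s} ${cost:10,.2f}")
```

Even a roll-up this simple gives every recurring dollar an owner and makes the “UNALLOCATED” bucket visible, which is the first step toward financial restraint.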

2023 may be a more cost-effective year to run applications and data stores inside a standard enterprise data center, so this may be the year we start doing so (2). The purpose of this is not to disparage cloud computing. Cloud computing, like any other technology, has its advantages and disadvantages. Businesses will once more adapt as that “reality” changes and develops over time. There is no shame in that. The main objective is finding the optimal architecture to support your business. Occasionally that is a public cloud, but not always, or not yet. Given my history with Hadoop, Cloudera and the open source world, I have long since learned not to fall head over heels for any technology, especially cloud computing and data processing.

The foolishness of “lift-and-shift”, i.e., migrating data and apps to cloud platforms without restructuring them, is simple to point out. In reality, refactoring is time-consuming and expensive, and the pandemic compressed many businesses’ cloud migration deadlines. For businesses that migrated without re-architecting their systems, reorganizing those workloads now is not particularly cost-effective. Even with the complexity and expense of running your own systems in your own data center, repatriation is frequently a more affordable alternative for these businesses.

In a fortunate coincidence, while cloud computing prices have stayed about the same or risen slightly over the past ten years, the costs of hard drive storage, networking hardware, compute hardware, power supplies, and other tech equipment have all decreased.

Repatriation makes the most sense for workloads and data storage that do simple, repetitive tasks, such as merely holding data for extended periods without any additional processing (e.g., no advanced artificial intelligence or business intelligence). These workloads can frequently return to owned hardware and deliver a positive return on investment. Even with the additional expenditure involved in taking over and internalizing operations, the organization still saves money, sometimes a lot of money, compared to hosting the same services in a public cloud.
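That return-on-investment claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below compares holding cold data in public cloud object storage against owned drives; every price in it (roughly $0.021 per GB-month for cloud storage, $15 per TB for drives, a flat monthly overhead for power, space, and operations) is an illustrative assumption, not a quote from any provider.

```python
# Back-of-the-envelope comparison: keep cold data in public cloud object
# storage vs. repatriate it onto owned hardware.
# All prices are illustrative assumptions, not vendor quotes.

TB = 1024  # GB per TB

def cloud_cost(tb_stored: float, months: int,
               price_per_gb_month: float = 0.021) -> float:
    """Recurring cost of simply holding data in cloud object storage."""
    return tb_stored * TB * price_per_gb_month * months

def on_prem_cost(tb_stored: float, months: int,
                 drive_cost_per_tb: float = 15.0,
                 redundancy_factor: float = 2.0,
                 monthly_overhead_per_tb: float = 1.0) -> float:
    """One-off hardware spend (with redundancy) plus monthly power,
    space, and operations overhead for the same data."""
    hardware = tb_stored * redundancy_factor * drive_cost_per_tb
    overhead = tb_stored * monthly_overhead_per_tb * months
    return hardware + overhead

if __name__ == "__main__":
    tb, months = 500, 36  # half a petabyte, held for three years
    print(f"cloud   : ${cloud_cost(tb, months):>12,.0f}")
    print(f"on-prem : ${on_prem_cost(tb, months):>12,.0f}")
```

Under these assumptions the recurring cloud bill comes out roughly an order of magnitude higher over three years. One-time egress fees for moving the data out and your real staffing costs will narrow that gap, so rerun the numbers with your own figures before deciding.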

Remember, though, that many workloads rely on specialized cloud-based services. Because there are usually no affordable analogs that can run on conventional platforms, those workloads cannot usually be repatriated. Public clouds are often more affordable when advanced IT services (AI, deep analytics, enormous scalability, quantum computing, etc.) are involved. To avoid incurring additional fees, many businesses at the time chose to run lifted-and-shifted apps on public clouds. Now, based on the current business environment and economics, many will make the straightforward decision to bring some of those workloads back into their data center.

We are moving into the Internet of Things, where data is generated everywhere (3). However, using that data is the difficult part, and it is inherently costly due to the required data transformation. Here comes the data mesh, an approach aimed at reducing overspending by operating on data at its point of origin, thus eliminating data transportation, ETL, and central data stores (which are legally complex). Databloom, founded by the creators of Rheem and Apache Wayang, is a pioneer in decentralized computing and one of the startups aiming to change the big data landscape dramatically. Talk to our sales team now to get a brief introduction and build a plan for enabling data-driven opportunities for your business.
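The cost argument behind processing at the point of origin is easy to sketch: instead of copying every raw record into a central store and transforming it there, each source computes its share of the answer locally and only the small intermediate results cross the network. The snippet below is a conceptual illustration of that pattern with made-up site names; it is not Apache Wayang’s actual API.

```python
# Conceptual sketch of "process data where it lives": each source runs a
# local aggregation, and only tiny partial results are shipped and merged.
# Illustrates the data-mesh idea only; this is not Wayang's API.
from collections import Counter

# Pretend these datasets live in three different locations / clouds / devices.
site_logs = {
    "factory-berlin":   ["ok", "ok", "error", "ok"],
    "factory-valletta": ["error", "ok", "error"],
    "fleet-sensors":    ["ok", "ok", "ok", "ok", "error"],
}

def local_aggregate(records):
    """Runs at the data's point of origin; raw records never leave the site."""
    return Counter(records)

# Only the small Counters travel to the coordinator and get merged.
partials = [local_aggregate(records) for records in site_logs.values()]
global_counts = sum(partials, Counter())
print(global_counts)  # Counter({'ok': 8, 'error': 4})
```

The raw logs stay where they were generated; what moves is a handful of counters, which is where the savings on transport, ETL, and central storage come from.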

(1) https://www.geekwire.com/2018/dropbox-saved-almost-75-million-two-years-building-tech-infrastructure/

(2) https://a16z.com/2021/05/27/cost-of-cloud-paradox-market-cap-cloud-lifecycle-scale-growth-repatriation-optimization/

(3) https://link.springer.com/book/10.1007/978-3-030-85559-8


Disclaimer
Tech.mt disclaims all liability for the quality or reliability of offerings or delivery of any products/services advertised or pitched from a sales point of view in any of the articles submitted.
