
Hybrid Cloud: A Binary Choice or a Hybrid Decision?

The data repatriation race is on. Recently, TechTarget suggested that “many organisations are doing the math and discovering that the ROI for cloud isn't as good as with on-premises storage…”. As enterprises plan for the pivot and look to their channel partners for support, it’s crucial that the right platform choices are made at this strategic “inflection point” to avoid negative, long-term repercussions.

The pendulum is coming to rest 

We often talk of technology lifecycles, but from where we sit in the data storage camp we liken it more to a swinging pendulum, with on-premises data infrastructure at full throttle on one side and the all-out rush to the cloud on the other. For some time there, if you weren’t moving all your workloads and data to AWS, Azure, or Google Cloud, your peers might have looked at you like you were crazy. And why wouldn’t you? The glittering promise of switching Capex to Opex, elasticity as your demand changed, and a theoretically low management overhead was alluring.

More recently, though, we’re seeing that pendulum settle into a more rational state of equilibrium: a hybrid approach that unifies the best of both worlds. In fact, IDC reports that “80% of organisations are now undertaking some level of cloud data repatriation activities.”

It’s at these “inflection points” in technology that strategic decisions need to be made, and the quality of those choices will have a lasting impact on your organisation, for better or worse. And to be sure, Amazon, Microsoft, and Google have not missed that fact either.

Three drivers underpinning data repatriation

The way we see it, there are three core reasons driving such a big shift back from an “all-in” cloud strategy, and, of course, their relative priority will differ from one organisation to the next.

1. Cost

Moving workloads and data into the cloud may seem attractive at first; however, for many businesses the egress costs (the cost of moving the data back out again) can add up quickly and sometimes unexpectedly. Lyft infamously made headlines in 2019 when its IPO filing revealed a commitment to spend at least $300m with Amazon Web Services (AWS) for cloud services over the following three years, ouch! If you can identify workloads where the data needs to be accessed locally on a regular basis, why pay the cost of moving it to and from the cloud? In addition, the much-lauded ability to deploy a multi-cloud strategy has, for many, not proved achievable at a workload level in practice, making it harder to switch dynamically between the big three vendors to keep their pricing keen.
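To put some rough numbers on that, here is a minimal back-of-envelope sketch in Python, assuming a flat, illustrative per-GB egress rate; the rate and the volume are placeholders for illustration, not any provider’s actual price list.

    # Back-of-envelope egress cost sketch.
    # The per-GB rate is an assumption for illustration only;
    # real pricing varies by provider, region, and volume tier.
    ASSUMED_EGRESS_RATE_PER_GB = 0.09  # USD, hypothetical flat rate

    def monthly_egress_cost(tb_moved_per_month, rate_per_gb=ASSUMED_EGRESS_RATE_PER_GB):
        """Cost of pulling data back out of the cloud each month."""
        return tb_moved_per_month * 1024 * rate_per_gb

    # A workload that pulls 50 TB back on-premises every month:
    monthly = monthly_egress_cost(50)
    print(f"~${monthly:,.0f} per month, ~${monthly * 12:,.0f} per year")
    # -> ~$4,608 per month, ~$55,296 per year

Even at an assumed rate of pennies per gigabyte, a workload that regularly reads its data back on-premises turns egress into a meaningful, recurring line item, which is exactly the dynamic driving repatriation decisions.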

2. Agility and Performance

The public cloud is naturally fantastic for managing elastic workloads, but not so great where latency is an issue. This performance challenge has been magnified of late by the explosion of new workload types such as AI and IoT, which have in turn also driven the growth in edge computing, adding compute and storage closer to where the data is created.

3. Data Governance and Security

As data breaches continue unabated and regulators grapple to respond, the third factor influencing data repatriation is the need for greater governance and security controls over data, and for transparency into how those controls have been implemented.

If you can’t beat ‘em...

As mentioned, the big guys aren’t giving up without a fight; they don’t want to lose your data from their cloud ecosystem. But if hybrid is a growing reality, how do they respond? Enter AWS Outposts, Microsoft Azure Stack, and Google Anthos: three slightly different approaches, each designed to keep your data in their cloud, even if it’s physically residing in your own data center. We still question the value. If there were three reasons to move away from the public cloud, how do any of these approaches measure up against the drivers we just identified?

The Choices Just Don’t Stack Up

To start with, if you’re still in the same proprietary ecosystem, then you’re still locked into your chosen cloud vendor’s pricing plan. From an agility perspective, your ability to deploy a multi-cloud strategy is also hindered, giving you very little room to negotiate. You might see improvements in performance by placing workloads back in your own data center, but if Edge Computing is in your future, will you be able, and willing, to roll out that same architecture to every edge environment? And will the hardware those vendors prescribe actually perform efficiently in all the edge locations you need?

And lastly, have you really addressed your all-important security and governance concerns simply by placing a cloud vendor’s hardware and/or software in your location? What real controls can you place on that deployment, and how will those controls stand up to regulatory scrutiny?

The solution: task-specific hardware, built for open source 

In our view, the right strategic choice as you consider hybrid cloud, or any future shift toward Edge Computing, is to use this opportunity to break ties with the public cloud vendors and move to an open source-based architecture that can integrate with them on your terms.

The software stack is already well understood and widely deployed, consulting and support are readily available, and your hardware choices are broad, giving you the bargaining chips you need to drive a good deal and to select the right platform for your use case.

That said, in our opinion, you make the best strategic choice when you couple that shift to open source with a move to task-specific hardware on which it will run: appliances designed with a single purpose in mind. That gives you the best of all worlds: blistering performance and incredible efficiency, manageability, and scalability, underpinned by transparent design and manufacture that gives you secure provenance for your hybrid cloud platform. And if you’re selling generic hardware into open-source data center deployments, you’re missing a unique opportunity: the chance to differentiate by delivering superior performance and efficiency, as well as a solution that’s easier (and therefore more profitable) to deploy and support, yet still delivers all the independence of open source.

Remember, though: under the hood it’s still open source, eliminating vendor lock-in for your own infrastructure while also facilitating a multi-cloud strategy should you wish to implement one.

SoftIron just announced that their HyperDrive Storage appliance has been selected by Enterprise Management Associates (EMA), a leading IT and data management research and consulting firm, to receive a Top 3 Award in their “EMA Top 3 Enterprise Decision Guide 2020” report. HyperDrive was selected as a leading storage solution in the “Hybrid Cloud Management – Enterprise Data Services” category for the awards. The report is available here.
