We Must Find A Way To Defy Data Gravity In The Cloud
Source: forbes.com
The IT industry is in the midst of a transformational era in terms of how we treat data. At Moor Insights & Strategy we have discussed countless times the forces that are driving the need for a data strategy, how that need is deeply impacted by real-time analytics, and how data has escaped the data center and is now spread from edge to cloud.
The world of IT today is one of hybrid and multi-clouds. IT deploys workloads to the public cloud because it delivers a compelling value proposition across a number of realms. Deploying resources dynamically, as needed, and terminating them when the project is over saves countless CapEx dollars. Resources that can be deployed dynamically give IT and application owners an almost infinite amount of flexibility.
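To make that lifecycle concrete, here is a minimal sketch of the deploy-then-terminate pattern, using AWS EC2 through boto3 as one example. The region, AMI ID, and instance type are placeholders; real values depend on your account and region.

```python
# Minimal sketch: provision a compute instance for a short-lived project,
# then terminate it so it stops accruing cost. The AMI ID and instance
# type below are placeholders, not real values.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Spin up capacity only when the project needs it...
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance = instances[0]
instance.wait_until_running()
print(f"Running: {instance.id}")

# ...and tear it down when the project ends, turning what would have
# been a CapEx purchase into pay-for-what-you-use OpEx.
instance.terminate()
instance.wait_until_terminated()
print(f"Terminated: {instance.id}")
```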
Adopting public cloud technologies does, however, present challenges in providing a consistent and timely view of data across an IT infrastructure that blends on-premises and cloud resources. Public cloud flexibility comes at a cost, sometimes 2-3X that of a well-utilized on-premises cloud solution, and some applications are latency-challenged. This is why IT today has embraced hybrid and multi-clouds.
We believe that modern businesses of every size need 24/7 real-time access to all of their data and applications, regardless of where the data lives. This data should move freely between on-premises infrastructure, private cloud, the edge, and public cloud. If there were a bill of rights for cloud computing, this would be the first tenet.
Overcoming limitations
Thinking about data holistically and strategically is critical for any IT architect. What is often neglected in high-level conversations about data architecture is the nagging reality of physics. Data has gravity and momentum, both of which can be hampered by the infrastructure and systems that move that data around.
I've been thinking about this quite a bit over the past few months. It started when Pure Storage published a manifesto back in September eloquently describing the need for a new storage architecture to support the changing nature of data. Pure called it a data hub, and we published our thoughts about it then.
The arguments that Pure Storage made about collapsing physical data silos to ensure global access to that data really resonated with me. They also lead to an obvious question: is the industry looking at these same challenges as they relate to bridging on-site and off-site solutions?
It's a hard problem to solve, no question. Every storage vendor has some flavor of data migration available from on-site to cloud, but migration isn't enough. Data migration between locations helps with the availability problem but does nothing to address the timeliness of that availability, or to help with cross-site consistency.
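To see why timeliness is the harder half of the problem, consider a rough sketch that estimates how stale an off-site copy is. It assumes, purely for illustration, that both sites expose an S3-compatible API; the endpoint URLs, bucket, and object key are hypothetical.

```python
# Sketch: migration makes data *available* off-site, but says nothing about
# how *stale* the copy is. Assuming both sites speak an S3-compatible API
# (endpoints, bucket, and key below are hypothetical), we can estimate
# replication lag by comparing last-modified timestamps of the same object.
import boto3

def last_modified(endpoint_url: str, bucket: str, key: str):
    s3 = boto3.client("s3", endpoint_url=endpoint_url)
    return s3.head_object(Bucket=bucket, Key=key)["LastModified"]

on_prem = last_modified("https://s3.onprem.example.com",
                        "analytics", "events/latest.parquet")
cloud = last_modified("https://s3.us-east-1.amazonaws.com",
                      "analytics", "events/latest.parquet")

# Positive lag means the cloud copy is behind the on-premises source.
lag = on_prem - cloud
print(f"Cross-site replication lag: {lag.total_seconds():.0f}s")
```

A copy that is minutes or hours behind may be fine for archival, but it is useless for the real-time analytics discussed above, which is exactly the gap migration alone leaves open.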
We believe that a user should not have to compromise between on-premises and public cloud. It is possible to have an infrastructure that enables hybrid applications, with their associated long tail of data, to move seamlessly between different cloud environments without fear of lock-in or limitation.
It starts and ends with storage
Ensuring that data is available where you need it, when you need it, is not an application-level problem. It is a problem that begins and ends with the storage system. Just as storage is tiered for application availability, it should be equally tiered for cloud-level availability.
This challenge is best solved by the storage technology innovators who can deliver purpose-built solutions that provide consistent data availability in tiers from flash to cloud: solutions optimized for running the most mission-critical workloads, regardless of physical location.
This means no compromise in resiliency, data protection, compliance, or performance. Purpose-built storage that bridges on-premises and cloud environments to provide best-in-class data reduction with a unified management interface will reduce the cost of managing applications. Open APIs on such a solution will drive innovation and allow for application-specific customization.
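As a sketch of what such an open API could make possible, the snippet below sets tiering and replication policy programmatically. The endpoint, payload shape, and policy fields are hypothetical and do not reflect any real vendor's API; the point is simply that placement, protection, and replication become programmable per application.

```python
# Hypothetical example of application-specific customization via an open
# storage API. The management endpoint and policy schema are invented for
# illustration only; no vendor's actual API is implied.
import requests

ARRAY_API = "https://array.example.com/api/v1"  # hypothetical endpoint
TOKEN = "..."  # auth token, elided

policy = {
    "volume": "erp-prod",
    "tiering": {"hot": "nvme-flash", "cold": "cloud-object"},
    "replication": {"target": "aws-us-east-1", "rpo_seconds": 60},
    "data_reduction": True,
}

resp = requests.post(
    f"{ARRAY_API}/policies",
    json=policy,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Policy applied:", resp.json())
```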
This is not a vision that a cloud-provider can deliver, or that can be solved purely in hardware. Just as we architect storage systems for optimal on-site performance, those same storage systems should be further optimized for the hybrid world.
Dell EMC and Hewlett Packard Enterprise are approaching this space from the perspective of converged computing. Converged computing, whether delivered as converged infrastructure, composable infrastructure, or active data fabrics, will absolutely provide benefits for IT organizations that deploy those architectures. Convergence abstracts away the physical limitations and presents a consistent management model. Converged infrastructures are being extended to bridge into the cloud, but that still doesn't necessarily solve the underlying storage problem.
Storage problems are best solved by storage vendors. It's your storage that has a view of data from ingest to consumption, wherever those points may lie.
Concluding thoughts
We love the hybrid cloud model. Integrating the cloud and the data center to deliver IT services to organizations of every size is a positive move. The benefits, both economic and practical, far outweigh the challenges. I am looking forward to the day when companies actively deliver on this, because today it remains a bottleneck.
We as an industry must also keep in mind that these are very early days in understanding the right approaches.
While software technology providers such as VMware, Nutanix, and others work with public cloud providers Amazon (AWS), Microsoft (Azure), and Google (Google Cloud) to smooth out the control plane, it's critical that the storage world similarly engage and innovate.
There is a huge opportunity for innovators in the storage world to provide cross-site capabilities that deliver a consistent experience, regardless of where data lives. The door is open for innovators like Pure Storage, as one of the last pure-play storage providers, to extend their data hub architecture to the cloud. IBM could also take the work it is doing to address this problem for IBM Cloud customers and extend it into the broader public cloud sphere.
Regardless of where the solution comes from, we firmly believe that it will come from the storage industry. There's an opportunity to change the game, and somebody will take it.