Red Hat’s strategy for achieving hybrid cloud dominance

Source: crn.com.au

Red Hat might no longer be an independent company, but the IBM subsidiary’s ambitious strategy to ascend to the top of the hybrid cloud market centers on an infrastructure stack that is open, comprehensive, integrated across clouds, and entirely independent of provider or environment.

That approach, as the company sees it, offers customers a unique value proposition in the cloud era—a portfolio of integrated solutions that eases life for developers who simply want to write code and want little, if anything, to do with selecting, provisioning and managing infrastructure.

Red Hat changed its annual Summit to a virtual affair this year because of the coronavirus crisis. After Tuesday’s keynotes at the Red Hat Summit 2020 Virtual Experience, executives and product leaders — including newly minted CEO Paul Cormier — participated in an online panel discussion, elaborating on the company’s vision for conquering the hybrid and multi-cloud market with its entirely open-source technology stack.

Here’s what CRN USA learned about Red Hat’s strategy for achieving hybrid cloud dominance.

Linux is the foundation

Red Hat made Red Hat Enterprise Linux 8.2 generally available during this week’s Summit, an upgrade to its flagship product that brings greater monitoring functionality and container support.

The industry’s leading operating system is the springboard thrusting Red Hat to the center of digital transformation discussions.

As new CEO Cormier said during a panel following his Summit keynote, “Linux started an innovation cycle that continues in full force today” into the virtualization, cloud, container, Kubernetes and edge computing markets.

“Linux provides the foundation upon which all this innovation is based,” said Stephanie Chiras, vice president and general manager for RHEL.

That foundation supports OpenShift, Red Hat’s container platform, said Tim Cramer, Red Hat’s vice president for software engineering, and “OpenShift is putting a good amount of pressure on our [RHEL] road map.”

RHEL has distinguished itself in the market by providing the operational consistency enabling workloads to be deployed across a diverse array of environments, said Gunnar Hellekson, senior director of product management and strategy.

“The good news is because of the way we constructed the Red Hat portfolio, all of those benefits accrue to the higher-order products,” Hellekson said.

OpenShift is the future

Red Hat built its brand on Linux, but the company sees its OpenShift container platform as the product giving it the upper hand in the cloud era.

At Summit, Red Hat introduced OpenShift 4.4, with new capabilities aiming to advance the platform’s strong position in the crucial Kubernetes market.

The latest release looks to mature OpenShift to better support emerging use cases like artificial intelligence, machine learning and big data, and more generally, to free developers working on those types of applications from having to think about infrastructure at all.

The release implements Kubernetes 1.17, and delivers some evolutionary progress: Advanced Cluster Management eases management of extensive Kubernetes footprints, and new policy-based management is helpful for dynamic environments that have to rapidly scale.

But the “revolution” is OpenShift Virtualization, said Matt Hicks, who took over Cormier’s previous role leading product development as Red Hat’s executive vice president for products and technologies.

By tightly integrating the KubeVirt open-source project, Red Hat has built a bridge between containers and virtualization. For the first time, VMs can be imported and run natively in OpenShift, side by side with containerized workloads, Hicks said.

“Bringing virtualization forward into the infrastructure foundation of the future” reduces licensing costs while enabling new use cases for customers that need to maintain legacy systems, like telecoms preparing 5G deployments, Hicks said.
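To make that concrete, here is a minimal sketch of what running a VM as a native OpenShift object involves, expressed with the standard Kubernetes Python client; the VM name, namespace and demo disk image are placeholders, and the apiVersion comes from the upstream KubeVirt project, so treat this as an illustration rather than Red Hat’s documented workflow.

```python
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context

# A VirtualMachine custom resource, as defined by the upstream KubeVirt project.
vm = {
    "apiVersion": "kubevirt.io/v1",  # assumed; some cluster versions use kubevirt.io/v1alpha3
    "kind": "VirtualMachine",
    "metadata": {"name": "legacy-billing-vm", "namespace": "demo"},  # hypothetical names
    "spec": {
        "running": True,
        "template": {
            "spec": {
                "domain": {
                    "devices": {"disks": [{"name": "rootdisk", "disk": {"bus": "virtio"}}]},
                    "resources": {"requests": {"memory": "2Gi"}},
                },
                "volumes": [{
                    "name": "rootdisk",
                    # A demo cloud image shipped as a container disk; an imported VM disk would go here in practice.
                    "containerDisk": {"image": "quay.io/kubevirt/fedora-cloud-container-disk-demo"},
                }],
            }
        },
    },
}

# The same API call used for any custom resource: the VM is scheduled alongside containers.
client.CustomObjectsApi().create_namespaced_custom_object(
    group="kubevirt.io", version="v1", namespace="demo",
    plural="virtualmachines", body=vm,
)
```

Once created, the VM shows up and is managed like any other workload on the cluster, which is the point Hicks was making about containers and virtualization sharing one platform.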

Open cloud is hybrid key

Hybrid cloud has evolved rapidly in recent years, with infrastructure extending from data centers to public clouds and out to the edge of networks, said Red Hat CTO Chris Wright.

Red Hat’s combined portfolio, with its open-source pedigree, gives the company a leg up in the crucial hybrid cloud market, Wright said, as hybrid environments implement a set of core capabilities that Red Hat has been developing for years, from Linux to virtualization to containerization.

But a “critical part of the future” will be making it easier to operate and develop applications running in those environments, Wright said.

New paradigms, like serverless computing and event-driven environments, will advance the goal of making developers more efficient by allowing them to focus on building applications rather than catering to the hybrid infrastructure needed to support those applications.
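As a rough sketch of that developer experience, the handler below does nothing but receive a CloudEvent over HTTP, the delivery model used by Knative Eventing (the upstream basis of OpenShift Serverless); scaling, routing and event delivery are the platform’s job. The Flask app, port and event attributes are illustrative assumptions, not details from the article.

```python
# A minimal event-driven handler: the developer writes the function body,
# the serverless platform handles scaling it and routing events to it.
from flask import Flask, request
from cloudevents.http import from_http  # CloudEvents Python SDK

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle():
    # Parse the incoming HTTP request into a CloudEvent.
    event = from_http(request.headers, request.get_data())
    print(f"Received {event['type']} from {event['source']}: {event.data}")
    return "", 204  # acknowledge the event with no response body

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```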

Red Hat’s portfolio bridges traditional data center technologies and the hybrid cloud world, said Cormier.

“This is what open, hybrid cloud is,” he said.

Innovations around Kubernetes, especially, have nurtured the industry’s largest hybrid cloud ecosystem, and made Red Hat the hybrid leader, Cormier added.

Hybrid cloud tends toward multi-cloud

Most Red Hat customers have already adopted or are on a path toward hybrid infrastructure, said Joe Fernandes, Red Hat’s vice president and general manager for core cloud platforms.

But rather than stopping there, they “then tend to move to a multi-cloud strategy,” Fernandes added.

While hybrid cloud and multi-cloud are not one and the same, “they’re normally used together,” added Mike Evans, Red Hat’s vice president for technical business development.

It’s an “amalgamation of hybrid cloud, multi-cloud and cloud-to-cloud happening, and I expect to see more of that happening this year, giving customers more flexibility and choice in being able to run things the way they want to,” he added.

In recent years, Red Hat has been preparing for the multi-cloud world by nurturing close partnerships with public cloud giants like Amazon Web Services, Microsoft, Google, and parent company IBM.

And those hyperscalers, in a surprising way, are now validating the multi-cloud approach.

One of the most interesting trends of late, Evans said, is the providers offering services that span their competitors’ clouds, such as Google’s Anthos or Microsoft’s Azure Stack.

In effect, they’re building out “a hybrid cloud of their own,” Evans said. “You’re going to see more of the public cloud providers targeting their services on other public clouds as well.”

Hybrid cloud extends to the edge

“Edge itself is kind of this nebulous concept,” said CTO Wright. But it’s increasingly a vital component of hybrid cloud.

That concept, in a broad sense, involves “putting computing closer to users and the data they need to use,” said Cormier.

Edge applies to many different types of workloads, and spans a half-dozen layers, said Joe Fitzgerald, vice president and general manager of Red Hat’s management business unit.

Red Hat’s focus is on the “end-user’s premises,” Fitzgerald said, where customers place servers, gateways and other edge infrastructure.

That aim is increasingly vital for large telecommunications companies, like Verizon, whose efforts to build out 5G networking were showcased during Cormier’s Summit keynote.

Such projects create a complex distributed computing problem—one that’s important for service providers to solve if they are to deliver 5G and other cutting-edge technologies at scale, Fitzgerald said.

To support those telecom customers and others implementing IoT workloads, Red Hat is working to develop high-speed, continuous, policy-based management of distributed edge systems.

“It’s all about speed and automation,” Fitzgerald said.

That involves integrating real-time capabilities across the RHEL portfolio—and especially into OpenShift—that help federate and centrally manage networked systems from a platform consistent across all environments.
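As a hedged illustration of what policy-based, centralized management can look like, the snippet below assembles a minimal Advanced Cluster Management-style policy in Python and prints it as YAML; the group/version, field names and example namespaces are assumptions drawn from the upstream open-cluster-management project, not details given here.

```python
import yaml  # PyYAML

# A policy that merely *informs* when managed clusters are missing a required namespace;
# switching remediationAction to "enforce" would have the hub fix the drift automatically.
policy = {
    "apiVersion": "policy.open-cluster-management.io/v1",  # assumed group/version
    "kind": "Policy",
    "metadata": {"name": "require-prod-namespace", "namespace": "acm-policies"},  # hypothetical
    "spec": {
        "remediationAction": "inform",
        "disabled": False,
        "policy-templates": [{
            "objectDefinition": {
                "apiVersion": "policy.open-cluster-management.io/v1",
                "kind": "ConfigurationPolicy",
                "metadata": {"name": "require-prod-namespace-config"},
                "spec": {
                    "remediationAction": "inform",
                    "severity": "low",
                    "object-templates": [{
                        "complianceType": "musthave",
                        "objectDefinition": {
                            "apiVersion": "v1",
                            "kind": "Namespace",
                            "metadata": {"name": "payments-prod"},  # hypothetical namespace
                        },
                    }],
                },
            },
        }],
    },
}

# Write the manifest out; it would be applied to the hub cluster with `oc apply -f`.
print(yaml.safe_dump(policy, sort_keys=False))
```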

Next-gen Kubernetes

Central to Red Hat’s ambitious cloud strategy is Kubernetes.

Five years after Google open-sourced the container orchestration technology, it has matured beyond the sole realm of the infrastructure administrator and moved closer to the developer.

A massive open-source community is now fostering a “third generation of customer use cases,” said Brian Gracely, Red Hat’s senior director for product strategy, OpenShift.

Kubernetes has overcome most of the early learning curves and big technical challenges, he said.

With those stumbling blocks eliminated, new use cases like AI, machine learning, big data and edge “are the kinds of challenges I’m seeing,” added Clayton Coleman, a Red Hat solutions architect for containerized application infrastructure.

The first generation of startups to deploy Kubernetes consisted of “some brave, brave operations teams,” Coleman said. They had no support and had to rely on their own ability to master the technology.

But the current adoption phase is about empowering people who don’t particularly care about the nuances of infrastructure, Coleman said.

That means the goal should be “maximum empathy for administrators,” Coleman said.

The challenge for the coming years will be delivering Kubernetes in a way that’s “rock solid for them,” he added, with clear guardrails, operators to simplify use, and an ecosystem that puts those customers “on a path to success.”

“We try to make it so Kubernetes is the easiest place to build and run applications,” Coleman said.

VMs, containers and a competitor

Ashesh Badani, Red Hat’s senior vice president for cloud platforms, said he’s regularly asked to compare Red Hat’s container strategy to what competitor VMware is doing in the market.

While both companies agree that Kubernetes is the future, Badani said, “one company has been shipping Kubernetes for five years with over 1,700 customers, the other’s just starting.”

Badani then proceeded to draw more pointed distinctions with the rival enterprise software giant.

Where Red Hat’s portfolio is based on a Cloud Native Computing Foundation standardized Kubernetes stack, the “other” is still trying to figure out which of its three Kubernetes stacks, some of which are proprietary, to offer customers.

VMware’s hybrid cloud strategy incorporates proprietary infrastructure for each public cloud, and some of those offerings won’t even be widely available until next year, Badani went on.

And where Red Hat offers a broad set of integrated application services certified with hundreds of ISVs, VMware only offers a “rebranded application strategy from a previously failed model.”

As to converging VMs and containers, Red Hat is taking a modern, cloud-native approach; VMware is just wrapping proprietary APIs around its virtualization platform, Badani said.

Finally, Red Hat has been collaborating in open-source communities for 20 years, Badani said. “The other is still trying to figure it out, yet considers it its birthright to win.”

The developer’s infrastructure

Red Hat’s hybrid cloud strategy prioritizes infrastructure that, through its consistency and simplicity, allows application developers not to think about it at all.

“Developers should be shielded from where those applications are running,” said Fernandes.

Red Hat’s portfolio delivers the ability to build an application once, package it as a container, and then have it deployed in any environment an administrator determines is best to host it. That could be in an on-premises data center powered by vSphere or OpenStack, or on AWS, Microsoft Azure or Google Cloud.
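A minimal sketch of that portability, assuming a hypothetical image and kubeconfig contexts: the same Deployment spec is submitted through the Kubernetes Python client, and only the context the administrator selects decides whether the workload lands on vSphere, OpenStack, AWS, Azure or Google Cloud.

```python
from kubernetes import client, config

# The administrator picks the target cluster; the spec below never changes.
config.load_kube_config(context="onprem-openshift")  # could equally be "aws-rosa" or "azure-aro"

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="storefront"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "storefront"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "storefront"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="storefront",
                    image="quay.io/example/storefront:1.0",  # hypothetical image, built once
                ),
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="demo", body=deployment)
```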

With Red Hat OpenShift, “we try to meet individual developers where they’re at,” he said.

The end goal should be infrastructure so abstracted from developers that they really don’t care where their applications land in the production stage. That’s especially important for cutting-edge workloads, such as those in the field of data science, that require their own realms of expertise and focus.

“Machine learning is typically a hardware accelerated environment, enabled by the operating system, and then made accessible to data scientists and application developers through OpenShift,” said Wright.
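As a small, hedged example of what “made accessible through OpenShift” typically means on a Kubernetes-based platform, the pod below requests a single GPU using the standard nvidia.com/gpu extended resource name; the image, names and namespace are hypothetical, and driver setup is assumed to be handled by the cluster’s GPU operator rather than the data scientist.

```python
from kubernetes import client, config

config.load_kube_config()

# A training pod that asks the platform for one GPU; scheduling onto a GPU node
# and exposing the device is the platform's problem, not the data scientist's.
pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="training-job"),  # hypothetical name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[client.V1Container(
            name="trainer",
            image="quay.io/example/pytorch-train:latest",  # hypothetical image
            resources=client.V1ResourceRequirements(limits={"nvidia.com/gpu": "1"}),
        )],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="data-science", body=pod)
```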

“A unique beast” for cloud partners

Over the past few years, as customers increasingly demanded consistent management of workloads spanning their on-premises data centers and public clouds, Red Hat focused on “not just maturing our technology, but maturing our partnerships” with cloud providers, said Mike Farris, vice president for strategy, products and technologies.

While the hyperscalers all offer their own container services, they recognize in OpenShift “a little bit of a unique beast”— a Kubernetes platform that joint customers often prefer for migrating applications to the cloud, or developing new ones there, said Mike Evans, Red Hat’s vice president for technical business development.

Customers are putting pressure on the hyperscalers to offer jointly managed, supported and in some cases jointly engineered solutions for running mission-critical OpenShift workloads in their clouds, Evans said.

And the cloud giants also recognize “significant value in the enterprise relationships Red Hat has, and IBM as well,” said Farris.

That has those providers “rationalizing their own Kubernetes platforms,” Farris said, as is demonstrated by what Microsoft has already done with Azure Red Hat OpenShift.

“I would expect you would see deeper partnerships coming … this year with the cloud providers,” Evans added.

The current pandemic

The coronavirus pandemic isn’t having a huge impact on Red Hat operations, said Cormier. Red Hat employees are used to working remotely, especially product developers and engineers, a quarter of whom have always worked from home.

So as far as developing products, “we’re going full force on the road map and product line we planned from the beginning,” Cormier said. “I don’t think we’ve lost a beat, especially on the product side.”

Adjusting go-to-market operations that require customer interactions has been trickier, he said, but “we’re helping our customers on a daily basis keep their business running.”

To that end, Red Hat has implemented coronavirus measures, including extending product life cycles so customers won’t be forced into upgrades with all the distractions and challenges they now face, said Hicks.

Red Hat has also cut by half the cost of Technical Account Management services for new customers, and it is offering free training programs for furloughed workers and job seekers looking to develop in-demand skills.
