Bringing Choice, Consistency and Protection to Hybrid Cloud Strategies
Hybrid cloud has quickly grown in popularity, and with that growth it’s more important than ever for organizations to develop strategies that address cost control, IT complexity, cybersecurity threats and data corruption.
When developing a hybrid cloud strategy, notes Scott Baker, VP, Storage Product Marketing for IBM, it’s important for organizations to define what hybrid cloud means to them. “Some organizations define hybrid cloud as the way they operate their IT infrastructure, similar to private cloud. For others, hybrid cloud builds on that private cloud foundation with integrations to traditional public cloud offerings like IBM Cloud, AWS or Microsoft Azure.”
Regardless of what hybrid cloud means to an organization, or what underlying purposes hybrid cloud strategies serve, cloud technologies bring key benefits. “Everything we do at IBM from the storage perspective through the higher technology stack is focused on delivering a consistent set of data and operational services that allow businesses to easily move workloads and data where they need to be, regardless of where the users are,” notes Baker. “That’s what’s critical about hybrid cloud technologies.”
Extending to Microsoft Azure
IBM’s dedication to enhancing hybrid cloud strategy shines through in its recently announced capabilities and integrations designed to mitigate hybrid cloud pain points—helping organizations reduce IT complexity, deploy cost-effective solutions and improve cyber resilience across hybrid cloud environments.
One key announcement includes extending hybrid cloud storage simplicity (delivered through the IBM Spectrum Virtualize operating environment) to Microsoft Azure, in addition to IBM Cloud and Amazon Web Services (AWS).
“At this point, if customers aren’t already hybrid cloud or multicloud, they will be at some point,” reflects Baker. “It’s important for us to give customers options—whether that’s the IBM Cloud, AWS or Microsoft Azure.” For IBM, it was a natural progression to extend the IBM Spectrum Virtualize operating environment—the same one that runs on IBM storage products—into the public cloud so that customers using Microsoft Azure are on equal footing with those using IBM Cloud or AWS.
Delivering Automation With Turbonomic
IBM is also expediting Turbonomic integration for automated operations. This is important for several reasons. For one, application performance management has become incredibly complex because teams tend to be siloed. Data sources are also diverse, and there’s constant demand placed on IT—making it difficult to maintain application response times.
In this case, manual approaches are not scalable—and that’s where Turbonomic comes in. Turbonomic’s analysis engine combines FlashSystem data, virtualization data and application data to continuously automate non-disruptive actions and ensure applications get the storage performance they require—eliminating unnecessary over-provisioning and safely increasing density without sacrificing performance.
“The notion that Turbonomic now gives organizations this application-down-to-storage view is critical to alleviating that problem—offering better visibility and orchestration and delivering the best possible application experience,” adds Baker. In this way, automation gives time back to IT teams to tackle bigger challenges, while the system continuously learns and moves workloads where they need to be.
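To make the idea concrete, the short sketch below shows the general shape of this kind of automation: combining application, virtualization and storage telemetry to choose a non-disruptive action. It is an illustration only; the metric names, thresholds and actions are assumptions made for the example, not Turbonomic’s actual engine or API.

```python
# Illustrative sketch only: not Turbonomic's actual API or decision logic.
# Combines application, virtualization and storage metrics to pick an action.
from dataclasses import dataclass


@dataclass
class WorkloadMetrics:
    app_response_ms: float      # application-level response time
    vm_cpu_utilization: float   # virtualization-layer CPU use (0.0 to 1.0)
    storage_latency_ms: float   # storage-layer latency (e.g., FlashSystem volume)


def recommend_action(m: WorkloadMetrics,
                     response_target_ms: float = 50.0,
                     storage_latency_target_ms: float = 2.0) -> str:
    """Return a placement recommendation based on combined telemetry."""
    if m.app_response_ms <= response_target_ms:
        # Application is healthy; this workload is a candidate for higher density.
        return "no action (candidate for safely increasing density)"
    if m.storage_latency_ms > storage_latency_target_ms:
        # Storage looks like the bottleneck: move the volume to a faster pool.
        return "migrate volume to a higher-performance storage pool"
    if m.vm_cpu_utilization > 0.85:
        # Compute looks like the bottleneck: move the VM to a less-loaded host.
        return "move workload to a less congested host"
    return "resize workload resources"


if __name__ == "__main__":
    sample = WorkloadMetrics(app_response_ms=120.0,
                             vm_cpu_utilization=0.4,
                             storage_latency_ms=5.5)
    print(recommend_action(sample))
```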
IBM Spectrum Enhancements: Expanding Cyber Resilience
IBM continues to enhance data and cyber resilience capabilities with enhancements to IBM Spectrum Protect Plus, IBM Spectrum Protect, IBM Spectrum Scale and IBM Elastic Storage System 3200.
“Spectrum Scale is a product used to handle distributed file and object storage. We’re extending native support for cloud protocols like S3—whereas before, we would have to offload that connection responsibility to one of our other storage products,” says Baker. IBM has also broken through what Baker deems “the industry ceiling of capacity” with a 38-terabyte storage module, which lets organizations store twice as much data in the same physical space as before while lowering power and space requirements.
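For readers unfamiliar with what S3-protocol access looks like in practice, the snippet below is a minimal sketch that talks to an S3-compatible endpoint with the generic boto3 client; the endpoint URL, bucket name and credentials are placeholders for illustration, not values documented for Spectrum Scale.

```python
# Minimal sketch: using a generic S3 client against an S3-compatible object
# endpoint. The endpoint, bucket and credentials below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://scale-s3.example.com",  # hypothetical S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Write an object, then list the bucket to confirm it landed.
s3.put_object(Bucket="analytics-data", Key="reports/q3.csv",
              Body=b"region,revenue\nemea,100\n")
for obj in s3.list_objects_v2(Bucket="analytics-data").get("Contents", []):
    print(obj["Key"], obj["Size"])
```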
The Spectrum Protect and Spectrum Protect Plus enhancements address two common customer requests: multisite protection, and backup policies that set retention and let customers target cloud as a backup destination.
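As a simple illustration of what such a policy captures, here is a brief sketch; the field names and values are hypothetical and are not Spectrum Protect’s actual policy syntax.

```python
# Purely illustrative: not Spectrum Protect / Protect Plus policy syntax.
# Captures the two ideas from the announcement: retention and a cloud target.
from dataclasses import dataclass


@dataclass
class BackupPolicy:
    name: str
    frequency_hours: int   # how often backups run
    retention_days: int    # how long copies are kept
    target: str            # an on-prem repository or a cloud destination


offsite_policy = BackupPolicy(
    name="db-daily-offsite",
    frequency_hours=24,
    retention_days=90,
    target="s3://offsite-backups/db",  # hypothetical cloud destination
)
print(offsite_policy)
```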
“On the Spectrum Protect Plus offering,” adds Baker, “the big news is being able to protect critical data sets and applications that are running in Kubernetes and OpenShift—so not only the container and the data, but the cluster as well.”
Looking Forward
Of course, part of IBM’s dedication to addressing customer hybrid cloud pain points means evaluating current ones and looking ahead to future trends. “One of the biggest pain points for hybrid cloud environments is creating awareness and control of hybrid cloud infrastructure, and managing the underlying infrastructure—whatever those services might be,” says Baker.
This might be a “blue-sky” problem to solve—but the key here is addressing the future of how workloads get deployed, implementing appropriate integration points, increasing consistency across data and operational services, and giving organizations a choice of underlying protection. This is what IBM innovations are all about—addressing immediate needs, anticipating future ones and enabling customers when they need support.
As for Baker—these innovations are always exciting, but there are a few stand-outs. “Everything that we’re doing is equally important,” he notes. “But I’m really excited about our top line here—delivering consistency on-prem and in the cloud, with protection—not a sacrifice of one for the other.”