IBM Infrastructure Plays a Critical Role in Cloud, AI and Security
Cloud, artificial intelligence (AI) and security all have underpinnings in infrastructure—the servers, storage and software that power an organization. A purpose-built architecture is critical to achieving the performance needed for current and future business requirements. “Technology alone can’t transform a company, but it’s a very essential ingredient in business transformation,” says Tom Rosamilia, senior vice president, IBM Systems.
We sat down with Rosamilia to learn more about the role of infrastructure in organizations’ digital transformations.
Q: Where are IBM clients in terms of digital transformation?
Tom Rosamilia (TR): Some IBM clients are very advanced. Others are just beginning their journey. I think the journey will take time. Many of our clients have an established infrastructure and their digital transformation requires them to think about how they may change it or modernize it.
IBM is helping clients get to a point where they can create a set of services and combine those into other applications to leverage the investments they’ve already made. We’re also helping clients move workloads to hybrid cloud. I think the journey to the cloud, whether public, private or hybrid, is just beginning. IBM estimates that enterprises are about 20 percent of the way there. And going to the cloud doesn’t mean moving everything off premises. It means that clients want to be in the most flexible environment possible, and the right answer for them is a hybrid and/or multicloud environment.
Q: What business challenges are clients looking to IBM infrastructure to solve?
TR: Companies will be working with five, 10, 15 different cloud platforms. Some of these may be off-the-shelf applications. A level of interconnection and integration between them is needed. The good news is that this is something IBM is very, very good at—being the glue among all of these different platforms. Some of these will be on premises; they will be running on IBM servers with IBM storage. Some will be hyperscale data centers, hopefully with IBM servers, storage and software. Some may be hosted by service providers. The role of CIOs and IT providers is to enable that multicloud world. To do that, they will need to provide clients with a level of choice, portability and freedom.
Q: Cloud is becoming important because of all of the data that people are using to make important business decisions. Why are IBM infrastructure solutions well suited to handle vast quantities of data?
TR: IBM architected the Power Systems platform for the era of analytics, which leads very nicely into the era of AI, machine learning and inference. The Power Systems architecture was built to efficiently handle massive amounts of data. It’s not just the processor that we deliver, but the accelerators that plug into the POWER9 architecture—field-programmable gate arrays (FPGAs), GPUs or flash. Accelerators enable clients to take advantage of system-level performance, not just processor-level performance.
IBM architected memory bandwidth, networking bandwidth, and speed between GPU and CPU that lead the industry. It’s not just the performance per core. We have that. It’s not just the memory per socket. We have that. It’s also the bandwidth between processor and accelerator. We have lots and lots of that.
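As a rough illustration of the processor-plus-accelerator pattern Rosamilia describes, here is a minimal Python sketch, assuming PyTorch as the framework (our choice, not one named in the interview). It runs the same matrix multiplication on the CPU and, when one is available, on an attached GPU, which is the kind of work that benefits from high bandwidth between processor and accelerator.

```python
# Illustrative sketch only: PyTorch is our own choice of framework, not one
# named in the interview. It shows the general pattern of offloading dense
# math from the CPU to an attached GPU accelerator when one is present.
import time

import torch


def timed_matmul(device: torch.device, size: int = 4096) -> float:
    """Run one large matrix multiplication on the given device; return seconds."""
    a = torch.randn(size, size, device=device)
    b = torch.randn(size, size, device=device)
    start = time.perf_counter()
    _ = a @ b
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for the GPU kernel to finish before timing
    return time.perf_counter() - start


print(f"CPU: {timed_matmul(torch.device('cpu')):.3f} s")
if torch.cuda.is_available():
    print(f"GPU: {timed_matmul(torch.device('cuda')):.3f} s")
else:
    print("No GPU detected; CPU result only.")
```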
The IBM Spectrum family of storage software also does heavy lifting with data. Products like Spectrum Discover use the metadata on storage to help clients more rapidly run analytics, AI, compliance and regulatory applications. This is about data oceans, not just data lakes. IBM flash and software-defined storage can handle all of this big data.
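The interview doesn’t describe how Spectrum Discover works internally. Purely as a hypothetical sketch of the underlying idea (keep a queryable metadata index so applications don’t have to crawl the data itself), here is a small Python example; the file walk, the Record fields and the sample query are all our own invention.

```python
# Hypothetical sketch of a metadata catalog: file names, sizes and timestamps
# are indexed once, so later queries never re-read the data itself. This is
# our own illustration, not how IBM Spectrum Discover is implemented.
import os
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Record:
    path: str
    size_bytes: int
    modified: datetime


def build_catalog(root: str) -> list[Record]:
    """Walk a directory tree and capture metadata only (no file contents)."""
    catalog = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            stat = os.stat(full)
            catalog.append(Record(full, stat.st_size,
                                  datetime.fromtimestamp(stat.st_mtime)))
    return catalog


def recent_large_files(catalog: list[Record], days: int, min_mb: int) -> list[Record]:
    """Answer a query (say, for a compliance sweep) straight from the index."""
    cutoff = datetime.now() - timedelta(days=days)
    return [r for r in catalog
            if r.modified >= cutoff and r.size_bytes >= min_mb * 1024 * 1024]


if __name__ == "__main__":
    catalog = build_catalog(".")
    for record in recent_large_files(catalog, days=30, min_mb=100):
        print(record.path, record.size_bytes)
```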
Q: Didn’t IBM Storage announce a new security solution too?
TR: Yes, IBM Safeguarded Copy, which works on the DS array solutions, protects clients against having their data held hostage and having to pay with cryptocurrency to get it back. It keeps multiple copies of the data—from four hours ago, from eight hours ago, from 12 hours ago, etc. So if a hacker comes along and holds your data hostage, you go back to your file and say, “I’ll go back to what I had four hours ago.” It’s not a perfect solution, but it’s better than paying to get your data back.
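The interview doesn’t go into how Safeguarded Copy is implemented. Purely as a sketch of the recovery idea Rosamilia describes (keep periodic point-in-time copies, then roll back to one taken before the attack), here is a small Python illustration; the class and method names are invented for this example.

```python
# Illustrative sketch only, with invented names: it mimics the recovery idea
# of keeping periodic point-in-time copies and rolling back to one taken
# before a suspected compromise. It is not how IBM Safeguarded Copy works.
from datetime import datetime, timedelta


class PointInTimeCopies:
    def __init__(self, max_copies: int = 6):
        self.max_copies = max_copies
        self._copies: list[tuple[datetime, bytes]] = []  # (timestamp, snapshot)

    def take_copy(self, data: bytes, when: datetime) -> None:
        """Record a snapshot; the oldest copies age out."""
        self._copies.append((when, data))
        self._copies = self._copies[-self.max_copies:]

    def restore_before(self, suspected_compromise: datetime) -> bytes:
        """Return the newest copy taken before the suspected compromise time."""
        candidates = [data for taken, data in sorted(self._copies)
                      if taken < suspected_compromise]
        if not candidates:
            raise RuntimeError("No clean copy older than the compromise time")
        return candidates[-1]


# Usage: snapshots every four hours, then roll back past a ransomware event.
copies = PointInTimeCopies()
now = datetime.now()
for hours_ago in (12, 8, 4):
    copies.take_copy(f"payroll data as of {hours_ago}h ago".encode(),
                     now - timedelta(hours=hours_ago))

clean = copies.restore_before(now - timedelta(hours=3))  # attack seen ~3 hours ago
print(clean.decode())  # recovers the four-hour-old copy
```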
Q: What business challenges are leading organizations to explore private cloud?
TR: Cloud—public or private—provides a level of flexibility, agility and speed to deployment that organizations need. Data location is critical because data has a lot of gravity. If your data lives on premises, you’re going to want to use a private cloud. You get the advantages of cloud, the speed of deployments, the ups and the downs, the ebbs and the flows, the capacity as needed and potentially even the pricing.
Private cloud gives people the ability to say, “Maybe I want to run that somewhere else someday. If I run it in my container in my private cloud today, if I architect it a certain way, it gives me the choice of running it somewhere else tomorrow.” That’s the kind of choice IBM wants to give people. That’s why we encourage enterprises to provision infrastructure with an eye toward where they might want to run that workload in the future. For private cloud, the combination of IBM Cloud Private and Red Hat OpenShift, which we announced in May 2018, allows Power Systems and IBM Z servers to be integral parts of a hybrid cloud.
Q: You mentioned AI a few times. Can you explain how IBM is changing the landscape for clients using infrastructure built for AI?
TR: I think this is an essential point. Not only is IBM infrastructure built for a hybrid, multicloud environment; it’s also been architected for AI.
The things IBM has done for analytics make sense for AI as well. For example, machine learning is greatly assisted by the combination of processors and GPUs. Our partnership with NVIDIA has proven to be invaluable in many client scenarios.
TR: The work we did with the U.S. national laboratories at Oak Ridge and Lawrence Livermore is a great proof point for POWER9 and its ability to handle AI workloads—in this case, lots and lots of machine learning. They use the IBM AC922. These two supercomputers also have many similarly architected cousins that anyone can use.