The Rise of Edge Infrastructure Storage

Hybrid cloud has changed the way business gets done. Its growth has also created the need to manage the edge of the network, which is often geographically dispersed and disconnected from the data center. According to an IDC report, the share of operational processes deployed on edge infrastructure is expected to grow from less than 20% today to over 90% by 2024. Organizations increasingly face the challenge of storing all of that edge data.
 
Storing edge data is one of the reasons IBM announced IBM Spectrum Fusion, which seamlessly spans edge, core and cloud. Designed to simplify data accessibility and availability, Spectrum Fusion is the first container-native solution designed on Red Hat OpenShift. The new offering fuses IBM’s general parallel file system technology and its data protection software.
 
“IBM sees that containers are where the world is going,” says Eric Herzog, vice president of business development and evangelism and vice president of global storage channels, IBM Storage Division.
 
The need for Spectrum Fusion is also evident to systems integrators like Mainline Information Systems. “In today's world, our customers want to leverage their data from edge to core data center to cloud and with IBM Spectrum Fusion hyperconverged infrastructure (HCI) our customers will be able to do this seamlessly and easily,” says Bob Elliott, vice president of storage sales, Mainline Information Systems.

Why Container-Based Storage? 

Rather than looking back and focusing on virtualization like its competitors, IBM chose to look ahead to where clients plan to move next: containers.
 
It’s common for workloads and applications to create multiple copies of the same data set—one for the edge users, one for the core users and one for the cloud users. Spectrum Fusion offers a streamlined solution with a single copy of data.
 
“By having a single copy, we give customers much better CapEx and OpEx, because they only have one copy,” Herzog says. “If you’ve got 20 PB and you make four copies of it, you have to buy 80 PB. You have to manage 80 PB and all of the aspects of OpEx, like power, cooling, rack space.”
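The arithmetic behind that quote is simple enough to sketch. The short script below is purely illustrative (the 20 PB figure and copy count come from Herzog's example, not from any IBM sizing tool):

```python
# Illustration of the capacity math in the quote above: storing N full
# copies of a data set multiplies the raw capacity you must buy and manage.
def total_capacity_pb(primary_pb: float, copies: int) -> float:
    """Raw capacity needed when every copy is stored in full."""
    return primary_pb * copies

# Herzog's example: 20 PB of data, four full copies vs. a single copy.
multi_copy = total_capacity_pb(20, 4)
single_copy = total_capacity_pb(20, 1)

print(f"Four copies: {multi_copy} PB")      # capacity to buy, power, cool and rack
print(f"Single copy: {single_copy} PB")
print(f"Capacity avoided: {multi_copy - single_copy} PB")
```

The same multiplier applies to the OpEx items Herzog lists (power, cooling, rack space), which is why the single-copy model cuts both sides of the cost equation.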
 
The software savings are also significant: a single piece of software incorporates a scalable file system and a globally managed space across all locations. With Spectrum Fusion, for example, IBM offers an API that can gather metadata across IBM and competitor storage for AI and analytics workloads.
 
Finally, Spectrum Fusion backs up, protects and restores data. It can also replicate data for disaster recovery and high availability. “It basically combines a storage foundation with enterprise-class data storage services into a single piece of software versus buying all of these separate pieces of software. It’s the first fully container-native HCI appliance,” Herzog explains. It will also be available as a software-defined solution later in 2021.

Elastic Storage System Updates 

Spectrum Fusion wasn’t the only announcement Big Blue made. Updates to the IBM Elastic Storage System (ESS) family were also unveiled. The revamped ESS 5000 supports 10% more density with a total capacity of 15.2 PB. The new ESS 3200 offers double the read performance of its predecessor, the ESS 3000. The all-flash systems are a strong play for AI, analytics and big data.
 
The ESS updates are important for a few reasons, Herzog says. First, new ESS systems automatically join the existing file system in the global namespace, saving time and manpower. Second, performance and capacity scale linearly, which helps clients save on CapEx.
 
The ESS 3200 is a 2U solution providing up to 80 GB/second throughput per node. It supports up to eight InfiniBand HDR-200 or Ethernet-100 ports for high throughput and low latency. It can also provide up to 367 TB of storage capacity per node.
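Taking the linear-scaling claim above at face value, aggregate cluster figures are just the per-node figures multiplied by node count. The sketch below is a back-of-the-envelope illustration using the published ESS 3200 maximums, assuming ideal linear scaling with no overhead:

```python
# Published per-node maximums for the ESS 3200 (from the announcement).
PER_NODE_THROUGHPUT_GBS = 80   # GB/second read throughput per node
PER_NODE_CAPACITY_TB = 367     # TB of storage capacity per node

def cluster_totals(nodes: int) -> tuple[int, int]:
    """Aggregate throughput (GB/s) and capacity (TB), assuming ideal linear scaling."""
    return nodes * PER_NODE_THROUGHPUT_GBS, nodes * PER_NODE_CAPACITY_TB

for n in (1, 4, 8):
    throughput, capacity = cluster_totals(n)
    print(f"{n} node(s): {throughput} GB/s, {capacity} TB")
```

Real deployments will land somewhere below the ideal line, but linear scaling is what lets clients size CapEx predictably as they add nodes.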
 
ESS systems are equipped with streamlined containerized deployment capabilities automated with the latest version of Red Hat Ansible. Both models feature containerized system software and support for Red Hat OpenShift and the Kubernetes Container Storage Interface. IBM Spectrum Scale is also built in.

AI, Big Data and More

Herzog views these latest storage announcements as a testament to IBM’s leadership in container-native storage, which he says will be important to clients interested in AI, big data and analytics.

“These are some of the fastest-growing applications in the world for companies of all sizes, not just the global Fortune 2000. It’s become more and more important to make sure you’re aligned with what the storage needs are,” he explains. “The storage must provide the capability for the app to work right with the appropriate bandwidth, availability and disaster recovery in a container-native world.”