
The Evolution of Mainframe Education: Part 3

Kyndryl’s Pat Stanard recaps this year’s Kyndryl Internship Program


One of Kyndryl’s goals is to provide mainframe learning opportunities for young people, as the industry needs a pipeline of mainframe talent to replace mainframers approaching retirement. This is my third article on the Kyndryl Internship Program, and as my hair continues to turn gray, the topic could not be more critical to the future of the mainframe.

I am passionate about preparing university students to take the helm and support the mainframe ecosystem that runs our planet. This work is a true labor of love, and it is extremely important that we educate these students well on the platform as we welcome them into new and exciting careers as mainframers.

The Kyndryl Internship Program

This year, I had the extreme pleasure of working with our talented mainframe interns for the third year in a row. The 12-week program had 42 participants and covered several project areas for Kyndryl, including our mainframe discipline in the Dallas, Texas, area.

Our mainframe architect interns all had that “fire in the belly” work ethic that I value so highly. I want to thank one of our mainframe architects, Kim Duran, who helped me run this year’s program once again. I’d also like to thank the many mentors and presenters who helped make the program successful.

The Kyndryl Mainframe Internship gives interns the experience of an introductory mainframe architect, exposing them to zSystems lab work, architect tools, disruptive topics, mainframe modernization and architectural thinking. The program also offers real-life customer experience with the zCloud and MFaaS teams. At the end of the program, interns demonstrate their skills and understanding in a team presentation, and they come away with a basic understanding of why mainframe architecture remains vital to modern industry.

One of the assignments in our internship is a disruptive technology research paper, and this year’s Kyndryl mainframe architect interns did a fantastic job constructing and presenting their assigned topics. The papers were in direct support of our mainframe strategy: the right workload for the right platform. Here’s an overview of this year’s topics.

Carly Krieger: Temple University Senior

Paper: Mainframe in the Era of Big Data and Analytics

Over the past 15 years, data has grown exponentially worldwide. The International Data Corporation (IDC) calculates that 2 zettabytes of data were created in 2010. A single zettabyte (ZB) is equivalent to 1,000 exabytes (EB). To put this into perspective, all words ever spoken by humans would amount to roughly 5 exabytes of data.

So, the 2 ZB created in 2010 was 400 times the amount of data needed to store all of human speech.

Just 15 years later, in 2025, IDC predicts that more than 180 ZB of data will be created. This is the largest information explosion in human history. So, how will organizations manage all that data?
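For readers who want to sanity-check the arithmetic, here is a quick back-of-the-envelope calculation in Python. It is a sketch only; the 5 EB figure for all human speech is the rough estimate cited above.

```python
# Back-of-the-envelope check of the IDC figures cited above
# (2 ZB created in 2010, more than 180 ZB projected for 2025).
EB_PER_ZB = 1000            # 1 zettabyte = 1,000 exabytes
data_2010_zb = 2
data_2025_zb = 180
all_speech_eb = 5           # rough estimate: all words ever spoken

# 2 ZB expressed as a multiple of the ~5 EB "all human speech" estimate
print((data_2010_zb * EB_PER_ZB) / all_speech_eb)   # 400.0

# Implied compound annual growth rate from 2010 to 2025
growth = data_2025_zb / data_2010_zb                # 90x in 15 years
print(f"{growth ** (1 / 15) - 1:.0%}")              # roughly 35% per year
```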

The technological trend of big data and analytics is one that mainframe systems are continuously adapting to. Big data analytics is the systematic processing and analysis of large amounts of data to uncover trends and patterns and extract valuable insights for businesses. Unlike traditional data analytics, which typically stores structured data in relational databases, big data analytics stores various types of data including structured, semi-structured and unstructured data. In the early 2000s, software and hardware advances allowed organizations to collect and process massive amounts of unstructured data. Coming from sources like IoT sensors, social media, financial transactions and artificial intelligence, unstructured data reveals market trends, customer preferences and important business metrics.

Because of the complexity of the data, analysis requires more sophisticated techniques like machine learning, data mining, deep learning and natural language processing models to generate useful insights. For optimal utilization, data must be collected, processed, cleaned and analyzed so companies can make data-driven decisions. Big data analytics offers countless benefits like real-time intelligence, better-informed decisions, cost savings, better customer engagement and optimized risk management strategies.
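To make that collect-process-clean-analyze cycle concrete, here is a minimal sketch in Python using pandas. The transaction data and column names are invented for illustration and are not taken from the paper.

```python
# A minimal sketch of the collect -> clean -> analyze cycle.
import pandas as pd

# Collect: in practice this might stream from IoT sensors, social media
# or transaction logs; here we use a small in-memory sample.
raw = pd.DataFrame({
    "customer": ["A", "A", "B", "B", None],
    "amount":   [120.0, 80.0, 300.0, None, 50.0],
})

# Clean: drop incomplete records before analysis.
clean = raw.dropna()

# Analyze: a simple aggregation stands in for the machine learning,
# data mining and NLP techniques a real pipeline would apply.
insight = clean.groupby("customer")["amount"].mean()
print(insight)
```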

Eric Hooten: Missouri University Senior

Paper: Beyond Classical Limits: An Exploration of Quantum Computing’s Impact on the Mainframe

Computing has been a field of constant change and innovation. From vacuum tube machines that took up entire rooms, to personal computers and workstations that fit on a desk, to phones that possess more processing power than was used to land man on the moon, it seems that computers are always getting smaller and more powerful. This phenomenon is largely due to Moore’s Law, the observation that the number of transistors on an integrated circuit doubles approximately every two years. However, as we approach the physical limits of classical computing, a new paradigm is emerging: quantum computing.
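Moore’s Law can be written as a simple doubling formula. The short sketch below illustrates it in Python, using the 1971 Intel 4004 and its 2,300 transistors as a baseline:

```python
# Moore's Law: transistor counts double roughly every two years,
# so N(t) = N0 * 2 ** ((t - t0) / 2).
def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count; 2,300 is the Intel 4004 (1971)."""
    return base_count * 2 ** ((year - base_year) / 2)

# Fifty years of doubling every two years is 2**25, roughly a
# 33-million-fold increase over the baseline.
print(f"{transistors(2021):,.0f}")   # ~77 billion, the right order of
                                     # magnitude for 2021-era chips
```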

Quantum computing represents a groundbreaking approach to computation, leveraging principles of quantum mechanics to perform calculations in ways that classical computers cannot. Although this burgeoning field is expected to revolutionize the way we solve complex problems, it is not—and never will be—a replacement for classical or binary computers. Instead, quantum computers will be used to augment and complement existing classical computing systems, offering unprecedented capabilities in areas like cryptography, machine learning and large-scale data processing.

Mainframe computing, a cornerstone of enterprise IT infrastructure, excels at handling vast amounts of data with high throughput and reliability. Mainframes are optimized for high volumes of small transactions, with processing capacity commonly measured in MIPS (millions of instructions per second), making them indispensable in sectors such as banking, medicine and logistics. Quantum computers, conversely, excel at solving certain single, complex problems exponentially faster than classical machines.

The intersection of quantum computing and mainframe computing holds immense potential, as integrating quantum’s advanced computational abilities with the robust data-handling prowess of the mainframe could revolutionize data processing and analytical capabilities in enterprise environments.

Colette Atupulazi: University of Texas Senior

Paper: Mainframe Machine Learning and AI Applications

Although mainframes have long served as the foundation of large corporate operations, the computing industry is rapidly changing. The development of machine learning (ML) and artificial intelligence (AI) is revolutionizing how humans interact with technology and extract meaningful insights from data. Large data sets can be used to train AI algorithms, which can then make informed decisions, automate processes and even forecast future trends, while ML enables computers to improve at a given task without explicit programming.

In light of these developments, more powerful and secure computing systems are needed to manage the intricate algorithms and enormous data sets behind AI and ML applications. This is where the strengths of mainframes become apparent. Modern mainframe CPUs offer the raw processing power needed to run complex AI algorithms efficiently, which is essential for training large AI models on massive amounts of data. Meanwhile, the robust security features of mainframes provide a trusted platform for handling the sensitive data used in AI applications, enabling businesses to leverage AI for tasks like fraud detection or risk analysis with the peace of mind that their data is protected by one of the industry’s most secure computing environments.

Mainframes also offer scalability to handle ever-growing datasets and increasing workload demands associated with AI applications, allowing businesses to adapt their AI capabilities as their needs evolve. Furthermore, the renowned reliability of mainframes minimizes downtime, ensuring uninterrupted operation of AI-powered processes.
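As one illustration of the fraud-detection use case described above, the sketch below trains and scores a simple model with scikit-learn on synthetic data. It is only a stand-in: a production mainframe deployment would look quite different, but the train-then-score flow is analogous.

```python
# A toy fraud-scoring model on synthetic data (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic features (e.g., normalized amount and time of day) and
# labels, where 1 marks a fraudulent transaction.
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 1.5).astype(int)

model = LogisticRegression().fit(X, y)

# Score an incoming transaction; the probability feeds a risk decision.
incoming = np.array([[2.1, 0.3]])
print(f"fraud probability: {model.predict_proba(incoming)[0, 1]:.2f}")
```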

Zach Thomas: Missouri University Senior

Paper: Mainframe Virtualization and Containerization

Virtualization is a cloud computing concept that allows the dynamic allocation of server, network and storage resources across a network of servers. This process maximizes resource utilization, reduces costs and enhances flexibility and scalability by running multiple virtual machines on a single physical server.

A similar concept applies to mainframe computing. Virtualization abstracts physical resources like hardware platforms, storage and networks so that multiple operating systems can run simultaneously on a single mainframe, optimizing resource use and increasing system flexibility. Moreover, data stored on physical drives can be virtualized and moved off platform to be processed however best fits the user’s needs.

Alongside virtualization, containerization is increasingly being used in mainframes. Unlike VMs that virtualize physical hardware, containers virtualize at the operating system level, packaging an application with its necessary dependencies. This makes containers lighter and ensures consistent performance across different environments and platforms. Containers are useful for their portability, efficient resource use and quick deployment capabilities, aligning traditional mainframe strengths with practices like those in cloud environments. Together, virtualization and containerization can increase the operational capabilities of mainframes, merging their robust architecture with contemporary computing efficiencies.
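To illustrate the portability point, here is a minimal sketch using the Docker SDK for Python (pip install docker). It assumes a reachable container engine; on IBM Z, the analogous facility would be Docker on Linux on Z or z/OS Container Extensions (zCX).

```python
# Run a containerized workload: the image packages the application
# with its dependencies, so the same workload runs unchanged on a
# laptop, in the cloud or on the mainframe.
import docker

client = docker.from_env()   # connect to the local container engine

output = client.containers.run(
    "python:3.11-slim",
    ["python", "-c", "print('hello from a container')"],
    remove=True,             # clean up the container after it exits
)
print(output.decode())
```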

Input From Jonathan Dietz

I asked Jonathan Dietz to weigh in on the value of the mainframe internship, as well. Jonathan is vice president of U.S. data center strategy and execution at Kyndryl, and formerly the practice leader for core enterprise and zCloud. Here’s what he shared:

We are in our third year of running the Kyndryl mainframe internship, and I have to say, the enthusiasm of the students and the mentors working with them is infectious. The real-world investigations and collateral that were produced by these four amazing interns are something that many of our experienced mainframers are asking to review. When you have the passion and drive to dive into big data, quantum computing, virtualization and containerization and add the cherry on top of artificial intelligence within the IBM Telum chip capabilities…I mean, really! This group just rocks (sorry, the ’70s and ’80s will do that to you). The ambition of these youngsters to want to dive into this complex, next-generation, “legacy” technology is just like I said, “infectious” (but definitely not in a cyber way).

Keeping Up With Mainframe Education

As an adjunct professor at Wentworth Institute of Technology in Boston, I have authored and delivered my enterprise computing class to many IT/CS students over the past eight years. The class teaches the fundamentals of the mainframe and piques students’ interest in the mainframe as a career. It is always fully booked, as mainframe education remains a popular topic that needs continuous advocacy at our universities.

The Kyndryl Internship Program is a prime example of our commitment to mainframe education. As the current mainframe workforce ages, it is pivotal that new talent be developed to replace this vanishing skill set. The world economy is fueled by the mainframe, and that skill set needs to be grown and fortified to ensure the future success of Fortune 500 companies. The new interns bring fresh ideas and skills to the mainframe platform, continuing its modernization and optimization into the future.