
Feet of Iron, Heads of Clay: The Counterintuitive Persistence of Legacy Technology

COBOL and the IBM mainframe are nerds. I’ll explain shortly, but first…

The term “feet of clay” has become a common part of our language, describing leaders, organizations and initiatives whose foundations prove unreliable despite the best and most visionary of intentions. The subtle implication is usually that the very traits that made those parties so attractive concealed an essential, self-defeating weakness. For all appearances and investments, they had bright futures, until they didn’t.

The past few decades (indeed, all of human history) have given us plenty of such stories to reflect on, which is why the image has become so commonplace. What “feet of clay” comes down to is shiny, attractive appearances built on flimsy fundamentals.

Ah, fund-amentals. You know: money (funds) without due thought (amental). Every so-called “bubble” in history has had an element of large amounts of money being pumped into ephemeral pursuits. Meanwhile, great investors like Warren Buffett have taken the time to get the real data and have not been easily deceived by shiny appearances.

Ooh! Substance over appearances! Back to the nerdy legacy technologies like COBOL and the mainframe! What easy targets they make for pundits who presume that anything new must be better than the tried and proven.

So, here’s the thing: The world of IT has grown up and is no longer just a festival of the latest sizzle to replace last year’s steaks. The stakes are much higher than that. The technologies that power the global economy are now over half a century old, and the chaff that pretended to replace them has either blown away or morphed into endless generations of commodity consumer electronics.

But the thinking that keeps the latest sizzle draining the budgets is still pervasive. It’s time to consider what this means and what can be done about it, especially as it slowly becomes evident that the golden age of big iron encompasses the entirety of the past six decades, and likely the next millennium.

Big Irony

Ironically, that brings us to the foundational strength of established business technology, which is like housework: Nobody notices as long as it gets done. We take big iron for granted, along with its fellow foundational legacy technologies (COBOL!), while continuing to fill the ranks of management and decision-makers with people who seem intent on shooting their organizations, and the economy at large, in the proverbial foot. And yet this has in some ways been a refining challenge that has allowed the very best of IT to become ever stronger.

But while history gives us an endless supply of, ummm, people of appearance rather than substance in roles of leadership and influence, we really shouldn’t take them for granted as refining adversity for the technologies that were built to last. We need to demand better from our leaders, and that begins with recognizing where we do have substance and value and getting the word out.

In other words, it’s time to turn our perception of IT on its proverbial head. But…not if it’s a head of clay. So, before we go ahead, let’s try to understand who the players are and what their game has been since the dawn of electronic computing.

The Continuum of Business Computing

Of course, business computing is a continuum that can be traced back through human history. Whether we’re learning about Socrates in Plato’s Symposium talking about technē and poiēsis (roughly, technology and artisanship), or looking at Pascal’s various calculation mechanisms, or Babbage’s theoretical Analytical Engine, or going straight to Hollerith’s punch cards that became a key line of business for IBM, there were numerous practical and theoretical threads of thought and innovation leading up to the first electronic computers and their immediate electromechanical predecessors.

But one of the least appreciated threads in this whole journey has been the tension between substance and credit. And yet, it has become a point of differentiation, even today, between those who are willing to use what works to get reliable results, and those who wish to get headlines and speculative investment by deprecating legacy with nebulous promises. (“Buy on rumor; sell on fact.”)

One historical moment (described in Emerson Pugh’s book “Building IBM”) embodies the essential fault line that has divided the history of business computing: the August 1944 announcement of the Harvard Mark I, an electromechanical computer that IBM had invested heavily in co-developing with Harvard University’s Howard Aiken. Aiken announced the computer without IBM’s involvement and without giving the company any credit, enraging IBM CEO Thomas J. Watson, Sr., who had heavily backed a technology that looked to be the future of data processing beyond the punched cards that had been such an important line of business for IBM since its founding.

Why is this more than a footnote in history? Just ask anyone with a computer science degree (myself included) how much they learned about IBM, COBOL and related established (legacy) business computing versus platforms with more purely academic origins (e.g., UNIX and its progenitor MULTICS) and commodity consumer electronics computers (e.g., Microsoft, Intel, Apple). No matter how heavily IBM has invested in academia, the sizzle goes to individual egos, and the legacy platforms and languages and their purveyors get shrugged aside. Therefore, students are given the impression that computing is about the latest sizzle and academic innovation, not established functionality that is tried and proven and, well, legacy.

If computing is to shift from favoring quick results and unprovable quality to more solid, professional practices, an essential differentiator between the average graduate today and the kind of professional produced by engineering, accounting, law and medicine will be a clear understanding of, and an ability to work effectively with, established systems.

Those systems aren’t going away. There aren’t enough people in all of IT to rewrite the hundreds of billions of lines of COBOL running the world economy, and there isn’t a business platform or language out there offering a compelling alternative, let alone advantage, to make it worth trying.

That hasn’t stopped clay-headed decision-makers from trying. Time and again, organizations that have built their success on IBM mainframes and/or COBOL and related legacy technologies have given in to the “visionary leadership” that promised a better world if new alternatives could just replace proven incumbents. As a member of one such organization wryly noted, “We started sunsetting this system so long ago, the sun’s coming up again!”

A Legacy of Professional Provision

But how do you describe a “visionary” investment in replacement technology that cannot be proven to offer even the same qualities of service as what it replaces? Obviously, “professionalism” is not one of the qualities one would associate with such an approach. For that matter, an awareness of instructive stories throughout history and literature (think: Aladdin’s lamp) would temper the enthusiasm of anyone with a well-formed perspective on how things actually work.

Instead, IT leadership over the past half-century has increasingly been drawn from the ranks of people with no formation in understanding, maintaining and building on established and proven value. A pure focus on potential sizzle is self-evidently a missed steak, yet the spirit of Enron seems to have taken over the cohorts of IT decision-makers who are more interested in lining their CVs with interesting projects than in keeping businesses moving full-speed-ahead with what works.

This, despite my contention that one of IBM’s least appreciated products is ex-IBMers. Organizations run by people who have had proper formation in a blue-chip business that understands integrity with established value are the ones with staying power for the future.

One might be tempted to suggest that, if the lessons learned by IBM and its customers since the 1940s are being relearned on other platforms in the new millennium, at least they’re being learned. But they’re not. This is the great tragedy of non-legacy IT. It is built on every imaginable kind of bloat, with foundations that require constant patching in a Sisyphean effort to keep up with security holes and other constantly emerging feet of clay.

So, why am I going to end on a positive note? Because what works, works. For all the crumbling commodities that have been hawked at bargain prices, only to need constant repair and replacement, there is nothing that comes close to what was planned, proven and established by the 1970s and has only gotten stronger since.

But it’s time for IT leaders to sit down, look at what works, and ask why it works, why it matters, and what the genuine, all-in costs and benefits of keeping and building on these technologies are. There is no justification for the constant introduction of flimsy parallel infrastructures that can only absorb conversions of the lowest-hanging fruit while more than doubling the costs of IT. And there will never be enough qualified IT people to staff the ballooning demand for non-legacy technologists.

When we realize the answer we’re looking for has always been there and wipe the clay sleepy seeds from our eyes, there’s no telling how much more our legs of iron will enable us to achieve.