As a technologist who began a career in corporate technology in the 1990s, I witnessed the dominance of massive IBM mainframe computers within corporations. These mammoth machines sat in expansive, centralized data centers. The old mainframes had a certain elegance to them: clients accessed them through what was known as a "dumb terminal," a mere screen and keyboard that transmitted keystrokes over cable networks to the mainframe in the data center. The IBM mainframe could allocate processor, memory, and disk space, distributing its resources among various clients, industries, and users.
The concept of sharing resources in mainframe data centers was pioneered by Ross Perot when he founded Electronic Data Systems in the 1960s, effectively initiating the outsourcing trend for large corporations. From the 1960s until the mid-1990s, much of the business landscape operated under this model, devoid of client computing; all computing activities occurred in the data center. End users or coders sat at their desks, and their keystrokes were transmitted to the centralized data center.
However, the 1990s brought a significant shift! Desktop computers and local area networks (LANs) began to dominate. Networking expanded rapidly, and eventually every office worker had a complete computer on their desk connected to a LAN. These LANs marked the beginning of distributed computing: each desktop had its own processing power, alongside data center servers capable of further processing. This revolution drove enormous growth in business computing power. LANs evolved into client-server networks, and business applications were built to leverage the processing power and CPU cycles on both the desktop (client) and in the data center (server).
While this model made far more efficient use of processing power, it also heightened the risks of data loss and security vulnerabilities. Client/server networks spurred the growth of graphical user interfaces (GUIs) and desktop applications that collected data. Client data traveled across the network to the server, where it was combined with data from other clients and where the heavy data manipulation and processing took place.
Fast forward to the past 15 years and the advent of cloud computing, which brings me back to my original hypothesis: is cloud computing the contemporary rendition of the classic IBM mainframe? I argue yes. With cloud computing carved into virtual machines, both client input and business logic processing occur on servers within the data center. The applications you use on your smart devices are merely windows into the data center; they transmit keystrokes and data to cloud data centers for processing. While cloud computing offers immense advantages through distributed computing and stronger security, it essentially parallels the old approach of an IBM mainframe processing everything in a central data center, but with substantial improvements!
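To make the "window into the data center" idea concrete, here is a minimal Python sketch of a thin client. The endpoint URL and payload shape are hypothetical stand-ins for any modern app backend; the point is simply that the client forwards input and renders the response, while all business logic runs on the server, much like a dumb terminal talking to a mainframe.

```python
import json
import urllib.request

# Hypothetical cloud endpoint -- a stand-in for any modern app backend.
CLOUD_ENDPOINT = "https://api.example.com/process"


def thin_client(user_input: str) -> str:
    """Send raw user input to the data center and return whatever comes back.

    Like the old dumb terminal, the client holds no business logic:
    it only captures input and displays the server's reply.
    """
    payload = json.dumps({"input": user_input}).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        # All aggregation, validation, and heavy processing happened server-side.
        return response.read().decode("utf-8")


if __name__ == "__main__":
    print(thin_client("hello from the terminal on my desk"))
```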