"The more things change, the more they seem to stay the same"
In the 1960s, my Aunt Delores handed out some white plastic spinning tops with a blue paper insert, emblazoned with the letters "IBM." I remember her telling us kids how she got them from her workplace, the Savings Bank of Utica, and how she was currently working with something called a "Mainframe."
We didn't understand what she meant, and we weren't alone in that - at the time, most people had no idea what a mainframe was. Today, most people know it's a computer, but few have any detailed knowledge of how a mainframe works.
Mainframes started as building-sized machines, privately operated on premises in specialized rooms or buildings at a single location. Their computing power paled in comparison to that of today's $1,000 cell phone. They were also far more expensive: a typical 1960s IBM 7090 mainframe leased for around $63,500 a month, a figure that did not include additional capital costs, electricity, or the people to run it.
As a mainframe programmer, my aunt earned just $55 per week, or about $1.37 an hour, and nobody understood what she did. Maybe that's why she ultimately became a teacher.
Today, mainframe programmers earn between $120,610 and $151,340 a year plus benefits. Now we have data centers that house thousands of physical machines and host millions of virtual systems and billions of callable functions and lambdas. Services can be consumed on a per-use basis, connected to multiple clients from a mesh of centers worldwide. We can purchase cloud subscriptions that entitle us to run a certain number of operations per second.
While the mainframe is now the fastest-growing "dying platform," it still holds an allure for many of us in the IT industry. It and the processes surrounding it were the roots of our innovation, structured operations, and technical history.
We often like to boast - or complain - about how quickly the computer industry changes. Technologically, this is true: we have many new programming languages since COBOL, we can process more information faster, and we can transmit, store, and manage staggering amounts of data.
But as much as technology has changed, many challenges surrounding it have not. Deep understanding, structured process, and methodical approaches have given way to agile, minimally viable software product development, making some challenges far more significant than they used to be.
"But as much as technology has changed, many challenges surrounding it have not. Deep understanding, structured process, and methodical approaches have given way to agile, minimally viable software product development, making some challenges far more significant than they used to be."
In this cloudification challenge series, we will be looking at real-world examples of these challenges and exploring strategies to reduce the risk and costs associated with them.
Do you know all of the costs of your cloudification strategy?
How aligned is your spending to the consumption of Cloud resources?
Are your cloud assets being placed in the most cost-effective hosting solution?
Follow along for guidance on solving some of today's most difficult cloudification challenges, build an understanding of the strategies available to you, and get a comprehensive overview of the critical role that data plays in making your business stronger.
Jim Szczygiel has been working as an information technologist since the early nineties. Most recently, Jim has held roles as a consultant, product manager, data analyst, salesperson, and solutions architect, along with working on agile and extreme programming application development teams. Jim has provided services to hundreds of Fortune 500 clients in the chemical and natural resources, finance, insurance, and manufacturing sectors. Read More...