
Mainframe: A Resilient Model for the Modern Cloud

The emerging cloud-based model of computing requires systems that can provide fast response times to huge volumes of requests

Technology is moving at a blistering pace. In today's era of data-centric, complex environments, where the lines between business and technology are increasingly blurred, organizations are moving beyond virtualization to cloud computing to meet new challenges and keep up with the pace of change. Cloud computing is chief among the critical investments needed to keep companies competitive; in fact, Gartner expects cloud computing to account for the bulk of new IT expenditure by 2016. The bottom line: if you're not already treating cloud as an essential investment, you're risking your survival into the next era of computing.

The emerging cloud-based model of computing requires systems that can provide very fast response times to huge volumes of requests. Mission-critical services in healthcare, finance, transportation, public utilities, and other industries also require very high levels of availability, security, and other industrial-strength capabilities. Those attributes and requirements make the mainframe an ideal platform for mission-critical cloud-based workloads.

Cloud computing is a modern extension of a concept first developed nearly 50 years ago with the mainframe. The inherent spirit behind mainframe-based computing was to serve users in remote locations at the same time, on a pay-as-you-go basis. The mainframe was introduced as the most robust, scalable system ever built, and through continued innovation it has remained one of the platforms of choice for today's complex workloads, including sophisticated public, private and hybrid cloud computing environments. At its core, the mainframe was designed around three key traits - virtualization, standardization and provisioning. Not coincidentally, these are the foundational requirements for a true cloud implementation.

Most enterprises today started their cloud journey with low-risk applications that have high agility requirements. This approach allows customers to ease into cloud computing, learn and adjust how they manage the cloud, and build the confidence to introduce more demanding applications. These applications tend to use web technologies and architectures that can be scaled on commodity infrastructure, using load balancing and service cloning. Batch workloads that fit commodity infrastructure are another popular cloud workload.
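The scaling pattern described above - cloning identical service instances and spreading requests across them with a load balancer - can be illustrated with a minimal sketch. This is a hypothetical example, not any particular product's implementation; the instance names are invented for illustration.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Toy load balancer: routes each incoming request to the next
    service clone in rotation, so cloned instances share the load evenly."""

    def __init__(self, instances):
        self._pool = cycle(instances)  # endless rotation over the clones

    def route(self, request):
        instance = next(self._pool)
        return instance  # in a real system, the request is forwarded here

# Three identical clones of the same web service
balancer = RoundRobinBalancer(["app-1", "app-2", "app-3"])
routes = [balancer.route(f"req-{i}") for i in range(6)]
print(routes)  # each clone receives an equal share of the six requests
```

Scaling up under this model means adding clones to the pool; scaling down means removing them - no single instance has to grow, which is what makes commodity infrastructure workable for these workloads.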

For private, public or hybrid clouds, the mainframe can provide the following key requirements:

  • Scalability - users need to scale quickly and efficiently, both up and down, with complete confidence and zero loss of availability.
  • Reliability - a cloud computing environment that is always accessible, with guaranteed application performance, little to no downtime, and provisions for rapid recovery from failure.
  • Multi-tenancy - multiple users can access software applications on the same system, concurrently and securely. This is critical for cloud service providers hosting many organizations in a single cloud infrastructure, and for enterprises deploying private clouds that must host multiple companies (for example, after acquisitions) in the same infrastructure.
  • Cost efficiency - consolidating a distributed x86 cloud environment onto one mainframe creates a simplified, more efficient environment, with reductions in floor space and power requirements and a higher return on investment over the life of the platform.
  • Security - the mainframe offers unmatched system security, with assured isolation and protection of each virtual server environment.
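The multi-tenancy requirement above boils down to one rule: every operation is scoped to a tenant, and no lookup may cross tenant boundaries. A minimal sketch of that idea, with invented tenant names purely for illustration:

```python
class TenantStore:
    """Toy multi-tenant data store: every read and write carries a
    tenant_id, so one tenant's records are invisible to every other."""

    def __init__(self):
        self._data = {}  # {tenant_id: {key: value}}

    def put(self, tenant_id, key, value):
        self._data.setdefault(tenant_id, {})[key] = value

    def get(self, tenant_id, key):
        # Lookup is confined to the caller's own partition;
        # a missing key never falls through to another tenant.
        return self._data.get(tenant_id, {}).get(key)

store = TenantStore()
store.put("acme", "policy", "P-100")
store.put("globex", "policy", "P-200")
print(store.get("acme", "policy"))    # only acme's record is visible to acme
```

Real platforms enforce this isolation far lower in the stack - in the hypervisor and hardware rather than application code - but the contract is the same: shared infrastructure, partitioned data.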

Companies across various industries are gaining these advantages and efficiencies by consolidating cloud environments on a mainframe. Consider two examples:

By consolidating thousands of standalone servers onto a mainframe-based private cloud for daily business activities such as policy verification, claims processing, and generating customer quotations, Nationwide Insurance has saved 80% in energy and facility costs. The consolidation saved the company roughly $15 million over three years and will continue to keep costs down in the future. The solution also gives Nationwide the capacity, processing speed and reliability to increase the pace of innovation across its products and channels as it continues to grow.

By leveraging the cloud capabilities of the mainframe, Marist College was able to extend its business analytics technology to its academic community, including researchers and students, while extracting even more value from its IT investments. By delivering analytics via the cloud, the college has exposed analytics tools to a wide variety of programs - technical disciplines as well as business, liberal arts and communications - so students learn how to apply analytics to their fields of study. Marist has also realized significant financial benefits, saving roughly $350,000 by using the cloud to support the college's ERP system.

The mainframe, with its shared platform, integration, and secure design attributes combined with continuous innovation, has enabled organizations to stay ahead of changing market dynamics with a solution that embodies efficiency, economics and agility - a resilient solution for today's cloud environment.

More Stories By Jose Castano

Jose Castano is the Director for the System z Growth Initiatives in IBM’s Systems & Technology Group. He has over 25 years of experience within IBM and has held multiple key positions in System z during this tenure.

Jose has worldwide responsibility for driving new workloads on System z, including Cloud, Analytics, Mobile, and Security. He sets the business and technical strategy and direction for the System z platform, and drives coordination and collaboration across the System z ecosystem - marketing, sales, business partners, consultants, and, most importantly, customers - leading the platform through an evolution that maintains leadership and meets customer and industry requirements.

Jose leads a team comprised of workload and industry architects, who focus on business trends and market and industry requirements and develop solutions and offerings; offering managers, who are responsible for the go-to-market strategy for those offerings; and ISV managers, who work with the ecosystem to support new and existing workloads. Together, these teams are responsible for researching, designing, building and maintaining the new-workload strategy and roadmap for IBM System z, driving the plans for the next three to five years.


