Integrating Legacy Technology into Digital Best Practice by ‘Modernizing in Place’

As enterprises undergo digital transformation, they soon run up against a sobering realization: everything must transform. Not just technology, but people and processes as well.

It’s no wonder, therefore, that today’s executives are looking for ways to lower their risk – perhaps by taking shortcuts, or maybe by breaking up the Herculean task of transformation into manageable, bite-size pieces.

For enterprises that depend upon certain legacy technologies – most notably, mainframes running COBOL – the challenge of transformation multiplies, as none of the options sound appealing.

Replatform legacy applications, even though the process is extraordinarily costly and yields poor results? Split the IT effort into ‘slow’ and ‘fast’ groups, in spite of the numerous strategic and organizational challenges such a bimodal approach introduces?

The good news: these two options aren’t the only choices. For many such enterprises, it’s possible to achieve the strategic goals of digital transformation by modernizing some technology in place.

Here’s how it works.

Avoiding the Bimodal IT Anti-Pattern

First, let’s start with a simplified representation of the enterprise before it undertakes its digital transformation:

Figure 1: The ‘Before’ Picture

In the ‘before’ picture in figure 1 above, we have a siloed organization with slow, gatekeeper-oriented processes. We’ll refer to all the tools and technology in this picture as legacy, even though some of it may not be that old. Underpinning the entire stack is the enterprise’s data.

Some people (Gartner in particular) recommend that if you’re saddled with the ‘before’ picture above and wish to transform your business, it’s best to create a separate, ‘fast’ effort (Mode 2) while leaving the existing, ‘slow’ IT (Mode 1) alone, as figure 2 below illustrates.

Figure 2: The Bimodal IT Anti-Pattern

While bimodal IT appeals to anyone who’d rather not monkey with existing systems and processes, it has numerous flaws.

Technical personnel would much rather be on the ‘fast,’ cool Mode 2 side, leading to morale and staffing challenges for Mode 1. Bimodal IT also leads to counterproductive budgeting priorities that starve the slow yet mission-critical Mode 1.

The anti-pattern above also leads to bifurcated corporate data that can cause customer issues, compliance breaches, and all manner of other ills.

Worst of all, an organization that follows figure 2 will never achieve true digital transformation, as such transformation requires an end-to-end focus on customer needs.

Digital Best Practice

Instead, the approach we recommend is in figure 3, which we’ll call digital best practice:

Figure 3: Digital Best Practice

In the diagram above, the enterprise has undergone the necessary transformation of its organization and corresponding processes, leveraging modern Agile/DevOps techniques to achieve an end-to-end focus on the customer.

Complicating the technology story, however, is the fact that most enterprises have legacy, on-premises assets that are best left on their existing platform – especially if that platform is a mainframe.

We call this bifurcation of technology the ‘if it ain’t broke, don’t fix it’ principle. While the bimodal IT pattern in figure 2 is a poor choice, assuming that the best option for the business is to move from the ‘before’ picture in figure 1 to all-new technology is also a recipe for failure.

Instead, IT leadership should carefully review both existing technology assets and requirements for new technology, and implement a hybrid IT architecture that properly represents both.

Such implementation requires modern tooling – tools that today’s collaborative, self-organizing teams following Agile processes are comfortable using. Unlike in the bimodal scenario where the slow Mode 1 uses older tools, in the digital scenario, everyone uses modern tools – even when working with legacy assets. Compuware Topaz is a perfect example of such tooling.

Digital best practice also requires a modern, architected approach to corporate data. While mainframes may still serve as the enterprise’s central systems of record, there may also be requirements to maintain data in modern, cloud-based systems like Salesforce or ServiceNow.

However, instead of the complex integration scenarios the bimodal anti-pattern requires, modern digital best practice properly abstracts and virtualizes corporate data across the board.
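
To make this idea concrete, here is a minimal Python sketch of the kind of data abstraction layer digital best practice calls for. The class names, the fetch helpers, and the routing rules are hypothetical stand-ins rather than any real product API; the point is simply that callers see one interface whether a record lives on the mainframe system of record or in a cloud system like Salesforce.

```python
# Hypothetical sketch of a data abstraction layer for hybrid IT.
# The source names and fetch functions are illustrative stand-ins,
# not a real product API.

from dataclasses import dataclass


@dataclass
class CustomerRecord:
    customer_id: str
    name: str
    source: str  # which system of record the data came from


def fetch_from_mainframe(customer_id: str) -> CustomerRecord:
    """Placeholder for a call into the mainframe system of record,
    via whatever connector the enterprise actually exposes."""
    return CustomerRecord(customer_id, "Jane Example", source="mainframe")


def fetch_from_cloud_crm(customer_id: str) -> CustomerRecord:
    """Placeholder for a call into a cloud system such as Salesforce."""
    return CustomerRecord(customer_id, "Jane Example", source="cloud-crm")


class CustomerDataService:
    """Single interface over corporate data, regardless of where it lives.

    The routing here is deliberately simplistic; a real implementation
    would sit behind a data virtualization or integration layer.
    """

    def __init__(self, system_of_record: str = "mainframe"):
        self._routes = {
            "mainframe": fetch_from_mainframe,
            "cloud-crm": fetch_from_cloud_crm,
        }
        self._system_of_record = system_of_record

    def get_customer(self, customer_id: str) -> CustomerRecord:
        # Callers never need to know which platform answered.
        return self._routes[self._system_of_record](customer_id)


if __name__ == "__main__":
    service = CustomerDataService(system_of_record="mainframe")
    print(service.get_customer("C-1001"))
```

The design choice to note is that the abstraction, not the consuming application, decides where the data comes from – which is precisely what the bimodal anti-pattern fails to provide.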

The Intellyx Take

The diagrams above are simplifications, of course – true enterprise environments are by necessity far more complicated.

In particular, digital best practice rarely breaks technology cleanly into two boxes. Far more common is a diversity of technologies, each with a different strategy regarding modernization or migration.

It is in this real-world context that the ‘if it ain’t broke, don’t fix it’ principle is especially important. Replatforming existing COBOL applications, for example, is almost always inadvisable, as such a move rarely gains any benefit while sacrificing the bulletproof reliability of the mainframe.

Furthermore, simply relegating such mainframe efforts to a slow Mode 1 is also a bad idea.

However, we find that many senior-level executives nevertheless confuse the bimodal anti-pattern with digital best practice.

The reason: digital best practice may require maintaining legacy, on-premises assets while simultaneously calling for modern, cloud-native technologies. Superficially, such an approach may resemble bimodal IT – but in reality, it means modernizing legacy assets like COBOL programs in place: incorporating them into hybrid IT architectures and leveraging modern tooling to update them so they participate fully in the digital context.
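
For illustration only, the sketch below shows one way ‘modernizing in place’ can look at the code level: an existing COBOL transaction stays on the mainframe, and a thin service facade exposes it to new digital channels. The gateway URL, program name, endpoint path, and JSON shape are all hypothetical assumptions – the wrapper pattern, not any specific API, is the point.

```python
# Hypothetical sketch: a thin REST facade in front of an existing
# COBOL transaction that stays on the mainframe. The gateway URL,
# program name, and JSON shape are illustrative assumptions.

import json
import urllib.request

from flask import Flask, jsonify

app = Flask(__name__)

# Assumed address of whatever gateway the enterprise uses to expose
# mainframe transactions over HTTP (an API gateway or connector).
MAINFRAME_GATEWAY = "https://mainframe-gateway.example.com/transactions"


def call_cobol_program(program: str, payload: dict) -> dict:
    """Invoke an existing COBOL program through the (hypothetical) gateway.

    The COBOL code itself is untouched; only the means of reaching it
    is modernized.
    """
    request = urllib.request.Request(
        f"{MAINFRAME_GATEWAY}/{program}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())


@app.route("/api/accounts/<account_id>/balance")
def account_balance(account_id: str):
    # New digital channels (web, mobile) call this endpoint; the
    # mainframe continues to run the business logic in place.
    result = call_cobol_program("ACCTBAL", {"account_id": account_id})
    return jsonify(result)


if __name__ == "__main__":
    app.run(port=8080)
```

A web or mobile team can consume this endpoint like any other API, never knowing – or caring – that a decades-old COBOL program is doing the work behind it.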

As figures 2 and 3 above illustrate, furthermore, bimodal IT and digital best practice are in reality quite different from each other.

Yes, it’s possible to transform your people and your processes while simultaneously maintaining on-premises legacy assets as part of a coordinated digital transformation strategy.

If you have the right tooling, the right architecture, and you get the data right, you’re well on your way to digital transformation success – while maintaining mainframe assets well into the future.

Copyright © Intellyx LLC. Compuware and ServiceNow are Intellyx clients. At the time of writing, none of the other organizations mentioned in this article are Intellyx clients. Intellyx retains full editorial control over the content of this paper.
