The ‘What’ and ‘Why’ Behind Continuous Delivery
By Ron Gidron

Users’ needs have evolved – they expect more releases and features, which waterfall just isn’t able to deliver.

In the digital age, speed is everything and no one wants to be left behind. Being slow to react is the first and, most likely, final nail in the coffin for a 21st century company. The rigidity of the traditional ‘waterfall’ development approach was overhauled by the ‘agile’ philosophy. And with agility, things have changed: it has brought the idea of continuous delivery into reality.

Move with the Times or Get Left Behind
But what is continuous delivery? Simply put, it's the ability to push any kind of change, be it new functionality, code updates, bug squashes or the like, into production environments quickly and safely. Not only this, but each release must be stable and sustainable. Continuous delivery eradicates pesky, time-consuming processes and freeze periods, as it removes the need for dedicated silos of integration and testing. This is all achieved by keeping the code in a state that is always fit for deployment. In theory, continuous delivery enables us to release stable code to production at any given time - an approach demanded by the needs of the end user and impossible to achieve through waterfall-style development.

Those who have adopted continuous delivery are already reaping the rewards. They're seeing how it significantly improves time-to-market: as your changes can be published almost instantly, you're no longer restricted by a rigid release schedule. You become much faster to react, to iron out bugs and to release features. Your customers are more satisfied than before, which is what we ultimately strive for.

Shipping a release has traditionally been momentous, something we want to shout about, and rightly so. Ironing out those bugs or releasing those long sought-after features is no small feat and takes a huge amount of effort from everyone involved. There's nothing worse than shipping, only for new bugs to crop up, or the software to crash. Continuous delivery allows you (in theory) to continually deploy - just not always to customers. Pushing your stable code to UAT environments or staging reduces the risk around the production release. Now, you're no longer deploying once every few months: you can be releasing every single day.
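To make that flow concrete, here is a minimal sketch. The original post contains no code, so Python and every name below (run_build, run_tests, deploy, on_change) are illustrative stand-ins, not any particular CI tool's API: every change that passes the automated gates reaches staging immediately, while the production push stays a deliberate decision.

```python
# A minimal sketch of the flow described above: every change is built and
# tested automatically and, if green, deployed to staging on the spot;
# pushing to production stays a deliberate decision. All names here are
# illustrative stand-ins, not any particular CI tool's API.

def run_build(commit: str) -> bool:
    print(f"building {commit}")
    return True  # stand-in: a real pipeline would compile and package here

def run_tests(commit: str) -> bool:
    print(f"testing {commit}")
    return True  # stand-in: unit, integration and acceptance suites

def deploy(commit: str, environment: str) -> None:
    print(f"deploying {commit} to {environment}")

def on_change(commit: str, release_to_production: bool = False) -> None:
    if not (run_build(commit) and run_tests(commit)):
        raise RuntimeError(f"{commit} is not fit for deployment")
    deploy(commit, "staging")             # happens for every green change
    if release_to_production:             # a business decision, not a ceremony
        deploy(commit, "production")

on_change("abc123")                                  # routine: staging only
on_change("abc124", release_to_production=True)      # an actual release
```

The point of the sketch is that a production release runs through exactly the same mechanics as a staging one; only the decision differs, which is what takes the fear out of shipping.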

There are often barriers, inefficiencies and hidden costs in the release cycle which, historically, went unnoticed until launch. Continuous delivery highlights all these flaws, making them clear to the business and to the senior managers who make the decisions. Pipelines become much more transparent: you'll know where and when manual, human input will be required, where bottlenecks will crop up, and where automation can be implemented. The pipeline thus creates a clear incentive for a dynamic software delivery schedule, replacing costly, long and arduous release windows and the dissatisfaction that comes with them.

Flexibility is one of the main selling points of the model. Yes, there is an initial outlay in terms of infrastructure, in both software and operational architectures, but once this seed is sown, the benefits are there to be reaped. Features and fixes can now be pushed to specific individuals or customer subsets, ensuring the functionality works as expected. Or features can remain dormant within the product, awaiting a future release that might be sparked by a marketing push, for example. In the past, trying to deliver such functionality would have been a logistical and costly nightmare. With continuous delivery, it's par for the course.
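The ‘dormant features’ and ‘customer subsets’ described above are typically implemented with feature flags. Here is a hypothetical sketch; the flag table and the percentage-rollout rule are assumptions for illustration, not a specific product's API:

```python
# A hypothetical feature-flag sketch: shipped code can stay dormant, be
# enabled for a subset of customers, or be switched on for everyone without
# a redeploy. The flag table below is an assumption for illustration; real
# systems back it with a config service or a dedicated flag product.

import hashlib

FLAGS = {
    "new_checkout": {"enabled": True, "rollout_percent": 10},   # canary subset
    "holiday_theme": {"enabled": False, "rollout_percent": 0},  # dormant until marketing says go
}

def is_enabled(flag: str, user_id: str) -> bool:
    cfg = FLAGS.get(flag)
    if cfg is None or not cfg["enabled"]:
        return False
    # Hash the user id so each user gets a stable yes/no answer.
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return bucket < cfg["rollout_percent"]

print(is_enabled("new_checkout", "customer-42"))   # True for roughly 10% of users
print(is_enabled("holiday_theme", "customer-42"))  # False: shipped but dormant
```

Flipping a flag's entry is a config change, not a deployment, which is exactly why the marketing-driven launch described above stops being a logistical nightmare.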

The rewards speak for themselves. Like I said, continuous delivery is simple - in theory. The difficulties arise in its implementation.

The Pains of Changing
The main challenge you may encounter while trying to implement continuous delivery will probably be organizational. While more and more companies are working toward a DevOps culture, structurally they may not yet be ready for continuous delivery. You'll still find plenty of firms split into seemingly countless divisions, each of which ‘owns’ a particular product, feature or codebase. Each division has its own goals, targets and KPIs to meet. Trying to bring these ‘opposing’ factions together can be a logistical nightmare, one which could prove to be the undoing of your dreams of agility.

Therein lies the problem. For large companies, it can take months, if not years, to move complex applications to continuous delivery. It requires a complete mindset overhaul to adapt to this new process. New behaviors and practices must be learned, and the architecture will probably need revisiting, as will the software development processes. Top-down changes must be implemented to promote a culture of collaboration.

In all honesty, continuous delivery can seem a tough sell when presenting the concept to senior management, for a number of reasons. First, they have their own day-to-day tasks to see to which, depending on their seniority, will take up much of their time. They may not be as tech-minded as you and may not immediately see the benefits of implementation. They also have their own views, priorities and goals, which may differ from yours.

The barriers to employing continuous delivery may seem insurmountable at times but, as we've seen, the benefits of the approach speak for themselves. That's how you'll sell the approach to senior stakeholders.

Automation Is the Key
Continuous delivery has revolutionized the way we develop and release software, but without automation it arguably wouldn't be possible at all. Automating the entire pipeline, from code submission through testing and environment deployments, is crucial to achieving true continuous delivery.

The entire philosophy is built around flexibility, around agility; code changes and releases could be occurring at any time. Without the right automation in place to replace manual testing, deployment and releases, we'd be back at square one: the cumbersome waterfall approach. Manually overseeing each of these steps would completely defeat the goal of continuous delivery. But, like I said, times really have changed.
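As a final sketch of that point: in an automated pipeline, a single push event triggers every stage with no manual hand-offs. The event shape and the stage list below are hypothetical, chosen only to illustrate the "no human in the loop" idea:

```python
# A sketch of the point above: once a push event arrives from source control,
# every stage runs with no human in the loop. The event shape and stage list
# are hypothetical.

PIPELINE = ["build", "unit tests", "integration tests", "deploy to staging"]

def handle_push(event: dict) -> None:
    """Called automatically by a source-control webhook on every push."""
    for stage in PIPELINE:
        print(f"{event['commit']}: running {stage}")  # no manual hand-offs
    print(f"{event['commit']}: releasable at any time")

handle_push({"commit": "abc125", "branch": "trunk"})
```

Humans still decide whether to release; the mechanics of how are identical every time, because the machine runs them.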


About Automic

Automic, a leader in business automation, helps enterprises drive competitive advantage by automating their IT factory - from on-premise to the Cloud, Big Data and the Internet of Things.

With offices across North America, Europe and Asia-Pacific, Automic powers over 2,600 customers including Bosch, PSA, BT, Carphone Warehouse, Deutsche Post, Societe Generale, TUI and Swisscom. The company is privately held by EQT. More information can be found at www.automic.com.
