
DevOps Toolchains – Integrated Toolchains for High Performance Software Development

The overall goal of the Cloud Native formula is the shift to DevOps practices, and this section begins to describe how all of these different trends combine to unleash the power of the formula.

As Wikipedia explains, DevOps is:

a clipped compound of “software DEVelopment” and “information technology OPerationS”, and is a term used to refer to a set of practices that emphasize the collaboration and communication of both software developers and information technology (IT) professionals while automating the process of software delivery and infrastructure changes. It aims at establishing a culture and environment where building, testing, and releasing software can happen rapidly, frequently, and more reliably.

This goal of increasing software release frequency is the headline message, and it is why the approach typically goes hand in hand with the adoption of Continuous Delivery best practices:

“an approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently. The approach helps reduce the cost, time, and risk of delivering changes by allowing for more incremental updates to applications in production. A straightforward and repeatable deployment process is important for continuous delivery.”
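The “repeatable deployment process” the definition calls for can be pictured as a fixed chain of stages that every change passes through. The following is a minimal sketch, with stubbed build and test steps, purely to illustrate the shape of such a pipeline; the stage names and artifact format are illustrative, not any particular tool’s API:

```python
# A minimal sketch of a continuous delivery pipeline as a sequence of
# stages; each stage must pass before the next runs, so any artifact
# that reaches "release" has already been built and tested.

def build(commit):
    """Package the commit into a deployable artifact (stubbed)."""
    return {"commit": commit, "artifact": f"app-{commit}.tar.gz"}

def run_tests(artifact):
    """Run the automated test suite against the artifact (stubbed)."""
    return {**artifact, "tests_passed": True}

def release(artifact):
    """Deploy only artifacts whose tests passed."""
    if not artifact.get("tests_passed"):
        raise RuntimeError("refusing to release an untested artifact")
    return f"released {artifact['artifact']}"

def pipeline(commit):
    """Chain the stages: the repeatable deployment process CD calls for."""
    return release(run_tests(build(commit)))

print(pipeline("a1b2c3"))  # released app-a1b2c3.tar.gz
```

Because the chain is the only path to production, every release is incremental and repeatable, which is precisely what lowers the cost and risk of each change.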

DevOps Toolchains

As the name suggests, a ‘toolchain‘ is a collection of different tools used together to achieve some broader process activity. In this case, the toolchain is key to making this higher frequency of software releases possible by better connecting each step in the cycle.

Given the mix of development languages (Java, .NET etc.), skill levels, and existing in-house apps, this combination will vary between organizations, and there is also a plethora of tool options to choose from. For example, the Wikipedia entry for open source configuration tools alone lists numerous options, and it also makes the simple but crucial point:

“Not all tools have the same goal and the same feature set.”

Netflix again provides the perfect poster-child example of what such a toolchain looks like and how it is formed, identifying their combination, which includes their own Spinnaker tool, and where and how each tool is used in the overall software lifecycle. (Read more in this detailed Netflix case study.)

(Image: the Netflix Spinnaker toolchain.)

Popular tools for this goal include Jenkins, a centrepiece foundation for Continuous Integration practices, and Ansible, which can be combined with other tools like Puppet and Chef to define a DevOps recipe; in this era of Cloud-driven plenitude there is almost no end of possible permutations.
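Chaining tools together in practice often comes down to one tool calling another’s API. As a hedged sketch of one such link, the script below constructs the request that Jenkins’s remote access API expects for triggering a build (`POST /job/<name>/build`, authenticated with a user name and API token); the server URL, job name, and credentials are placeholders:

```python
# Sketch: asking a Jenkins server to start a build over its REST API.
# The URL, job name, and credentials below are illustrative placeholders.
import base64
import urllib.request

def build_trigger_request(base_url, job, user, api_token):
    """Construct the POST request Jenkins expects at /job/<name>/build."""
    url = f"{base_url}/job/{job}/build"
    req = urllib.request.Request(url, method="POST")
    # Jenkins accepts HTTP Basic auth with a user's API token.
    cred = base64.b64encode(f"{user}:{api_token}".encode()).decode()
    req.add_header("Authorization", f"Basic {cred}")
    return req

if __name__ == "__main__":
    req = build_trigger_request(
        "http://jenkins.example.com", "my-app", "ci-bot", "token123")
    # urllib.request.urlopen(req) would fire the build; Jenkins queues
    # the job and replies with a "Created" status.
```

A configuration tool such as Ansible could invoke exactly this kind of call as a post-deploy step, which is what “connecting each step in the cycle” means concretely.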

Other toolchain combinations highlight additional dimensions:

DevOps Integration Hub

Some vendors offer a fully integrated suite, such as Tasktop’s Integration Hub.

As they suggest in their blog, one of the key bottlenecks constricting the throughput of DevOps teams is the lack of integration between the different tools used across the software development lifecycle. They therefore propose a general solution category, the ‘DevOps Integration Hub‘, as a particular fit for this need.

They offer a DevOps Integration suite that can be used to build linked permutations such as:

  • An integration layer that exposes their Tasktop platform through modern webhook, REST and JSON interfaces, allowing any DevOps tool to be connected to the software lifecycle, such as linking Selenium-based test execution to Agile planning tools.
  • Jenkins integration for flowing build information into user stories or requirements, as well as GitHub and Gerrit integration for automating change-set traceability.
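The “flowing build information to user stories” idea boils down to translating one tool’s JSON event into another tool’s update. Here is a small sketch of such a handler; the payload shape and the story-ID convention (story key as the first token of the commit message) are assumptions for illustration, not Tasktop’s or Jenkins’s actual schema:

```python
# Sketch of a webhook/JSON integration layer: take a build-notification
# payload and turn it into an update for the user story it traces to.
# The payload fields and story-ID convention are illustrative assumptions.
import json

def link_build_to_story(payload_bytes):
    """Map a build event onto the story referenced in its commit message."""
    event = json.loads(payload_bytes)
    story_id = event["commit_message"].split()[0]  # e.g. "STORY-42 fix login"
    return {
        "story": story_id,
        "build": event["build_number"],
        "status": event["status"],
    }

# Example payload such as a CI server webhook might deliver:
payload = json.dumps({
    "build_number": 128,
    "status": "SUCCESS",
    "commit_message": "STORY-42 fix login redirect",
}).encode()

print(link_build_to_story(payload))
```

In a real hub this mapping runs on every build event, which is what makes change-set traceability automatic rather than a manual bookkeeping task.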

They describe how the 1,000-strong Bosch team has achieved a unified toolchain that funnels 2,500 feature requests a month from electric-car innovation programs into a single dashboard for developers to prioritize and action.

Furthermore, by tying Jenkins together with the automation that flashes firmware onto ECUs, they have empowered the team to work within a true end-to-end Continuous Deployment lifecycle.

It’s a very powerful model when you consider how literally it enables the sharing of best practices: high-velocity team processes, developed by a real team and encoded into the application patterns they use, become a big building block for enabling this elsewhere.

Open Source DevOps

There is also a wealth of open source options for building a toolchain. This blog describes the news that Accenture open sourced their own DevOps design and tool set, and offers an overview of how you can repeat their powerful model in your own implementation.

‘ADOP’ – Accenture DevOps Platform is:

a framework that glues together a suite of open source development tools and provides a really quick way of mobilizing software development projects in a consistent way, using a standard set of tools and processes (for example, Continuous Delivery). The framework can be configured for use for a particular technology using a ‘cartridge’.

ADOP is mainly a set of Docker Compose templates that provision a suite of DevOps tools, but combined with the insights that Martin Croker, the pioneer behind the idea, shares through his blog, it also constitutes a repeatable design recipe if you too want to implement a DevOps model for your organization.
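Once such a stack is composed, a platform script typically has to wait until every tool answers before wiring them together. The sketch below illustrates that step; the tool names, ports, and the readiness rule (any 2xx/3xx response counts) are assumptions for illustration, not ADOP’s actual checks:

```python
# Sketch: waiting for a freshly composed DevOps tool stack to come up.
# The endpoints below are hypothetical; the readiness decision is kept
# separate from the HTTP polling so it can be tested on its own.
import urllib.request

TOOLS = {  # illustrative endpoints for a composed stack
    "jenkins": "http://localhost:8080/login",
    "gerrit": "http://localhost:8081/",
    "nexus": "http://localhost:8082/",
}

def probe(url):
    """Return the HTTP status for url, or None if it is not up yet."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status
    except OSError:
        return None

def all_ready(statuses):
    """The stack is ready once every tool has answered with 2xx/3xx."""
    return all(s is not None and 200 <= s < 400 for s in statuses.values())

if __name__ == "__main__":
    statuses = {name: probe(url) for name, url in TOOLS.items()}
    print("stack ready" if all_ready(statuses) else "still provisioning")
```

Scripting this kind of check is part of what makes an environment repeatable: standing up the same tool suite for a new project becomes a single, verifiable operation rather than a manual checklist.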

Of particular note is the key blueprint design idea of ‘cartridges’, implemented through the core Jenkins foundation.

The key business benefit to note is how, as Martin describes, an implementation of the environment can be stood up quickly in response to new client engagements, not only supporting those teams with agile processes but also ensuring they follow defined central governance for DevOps practices from the start.

Martin explains more in this video.


More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
