
Stop Scripting Your Deployments in your CI Server

Continuous Integration tools like Jenkins and Bamboo excel at building, unit testing, and validating your applications. In fact, with CI tools, teams can do almost anything they want with code. But do CI tools scale to deployment?

The short answer is no. Deploying an application involves many steps: removing servers from load balancers, stopping servers, updating web server content and application server binaries, running database updates, and restarting servers, to name a few. Deploying with a CI server essentially means scripting the entire deployment yourself, then using the CI tool to copy your script to a target server and run it there. In an enterprise environment, it doesn't take long for all this scripting to turn into a convoluted mess.
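To make that concrete, here is a minimal sketch of the kind of hand-rolled script these deployments turn into. The host names are hypothetical, and each real command is reduced to an echo; in practice every step is an ssh, scp, or API call that you must script, log, and roll back yourself.

```shell
#!/bin/sh
# Sketch of a hand-rolled deployment script. Host names are made up,
# and real commands are replaced with echoes for illustration.
set -e

SERVERS="app-01 app-02"

echo "running database updates"                       # e.g. ./migrate.sh --env=prod

for server in $SERVERS; do
  echo "removing $server from the load balancer"      # e.g. a call to the LB API
  echo "stopping the application server on $server"   # e.g. ssh "$server" systemctl stop app
  echo "updating content and binaries on $server"     # e.g. scp build/app.war "$server":/opt/app/
  echo "restarting the application server on $server"
  echo "re-adding $server to the load balancer"
done
```

Multiply this by every application and environment, add error handling and rollback, and the "convoluted mess" arrives quickly.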

But I Like to Script!

So maybe you’re fine with scripting. But are you ok with:

  • Maintaining all of those deployment scripts, especially if something changes?
  • Installing agents on every machine that’s not a build server? A QA database or app server is unlikely to be a build agent for your CI tool, for example.
  • Having no cross-machine orchestration? CI servers, after all, are designed to run an entire build on one machine out of a large pool of possible machines. In contrast, application deployments need parts of a build to run on multiple, specific machines.
  • Lacking a suitable domain model for applications and deployments? Yes, you can see whether a job succeeded or not. But answering a simple, very common deployment question such as “which version of my application is running in which environment” is non-trivial with most CI tools.
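The last point is easy to underestimate: with a CI server alone, the "which version runs where" bookkeeping typically ends up as a hand-maintained state file. A minimal sketch, in which the file format, application name, and version numbers are all hypothetical:

```shell
#!/bin/sh
# Sketch of the deployment bookkeeping a CI server does not give you.
# File name, application name, and versions are all hypothetical.
STATE_FILE="deployments.txt"   # one line per deployment: <app> <env> <version>

cat > "$STATE_FILE" <<'EOF'
myapp dev 1.4.0
myapp test 1.3.2
myapp prod 1.3.1
EOF

# "Which version of myapp is running in prod?"
PROD_VERSION=$(awk '$1 == "myapp" && $2 == "prod" { print $3 }' "$STATE_FILE")
echo "myapp in prod: $PROD_VERSION"
rm -f "$STATE_FILE"
```

A deployment tool models applications, environments, and versions as first-class objects; with a CI server, you maintain this file (and keep it truthful) yourself.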

 


Continuous Delivery with a CI Tool is like using workflows to scale deployments. Read “5 Reasons To Stay Away from Workflows” next!

 

Deploying with CI Tools

CI servers typically offer a couple of choices for deployments:

  1. Creating jobs for each application/environment combination (e.g., “MyApp to Dev,” “MyApp to Test”)
  2. Creating a parameterized job that accepts the application and/or environment as parameters

The latter is the more scalable of the two options, but how do you secure and configure it effectively? You don't, because you can't. Both the target servers on which the deployment runs and the build slaves on which the job executes depend on the "target environment" parameter. Who is allowed to run the job also depends on the target environment, but CI tools generally do not support security configuration of the form "this user is only allowed to run this job if the value of this parameter is X, Y, or Z."
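To see why, consider what the single parameterized job's build step has to look like. In this sketch, APP and TARGET_ENV stand in for the job's parameters and the host names are made up; the point is that the target host, and therefore who should be allowed to run the job, is only known once the parameter is bound at run time.

```shell
#!/bin/sh
# Build step of a hypothetical parameterized "Deploy" job.
# APP and TARGET_ENV stand in for job parameters; hosts are made up.
APP="${APP:-myapp}"
TARGET_ENV="${TARGET_ENV:-dev}"

case "$TARGET_ENV" in
  dev)  TARGET_HOST="dev-app-01"  ;;
  test) TARGET_HOST="test-app-01" ;;
  prod) TARGET_HOST="prod-app-01" ;;  # only release managers should reach this branch
  *)    echo "unknown environment: $TARGET_ENV" >&2; exit 1 ;;
esac

echo "deploying $APP to $TARGET_HOST"
# scp "build/$APP.war" "$TARGET_HOST:/opt/app/"   # the actual copy step
```

To the CI server, the job is one object, so permissions apply to the whole case statement, never to an individual branch of it.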

So if you want security, you're left with option 1. That's right: building delivery pipelines with CI servers tends to mean one deployment job per application/environment combination. Changing your deployment logic, for example to support a new target server version, suddenly means modifying tens or even hundreds of jobs. With most CI tools, that is neither quick nor easy.

CI Tools are Great – for CI

CI tools are proven workhorses for building, testing, and validating applications before they are deployed to production. It's time to appreciate these tools for what they're great at, while recognizing that they weren't made to automate deployments.

Fortunately, IT teams all over the world are doing just that. More and more are turning to deployment automation and release orchestration software while integrating their trusted CI tools as part of an efficient, scalable and easily maintainable delivery process.

 

Interested in seeing what a Continuous Delivery tool looks like? Check it out, you might be surprised: XL Deploy & XL Release

The post Stop Scripting Your Deployments in your CI Server appeared first on XebiaLabs.


