
Our Favorite Continuous Delivery Tools By @TrevParsons | @DevOpsSummit [#DevOps]

Identifying and tracking issues as easily as possible

We're working hard in the Logentries towers to integrate our continuous delivery tools, so we can identify and track issues as easily as possible.

This saves us time that we can spend on important things like adding new features (or playing pool!). We use a lot of tools to manage our development cycle, and we've made them interact too.

We use JIRA (by Atlassian) to plan our work, Gitlab to manage our code, and Jenkins for our continuous integration.

Does Mr Jenkins Approve?

Merge requests (a.k.a. pull requests on GitHub) aren't accepted at Logentries until Mr Jenkins is happy with them.

Mr Jenkins is listening, and whenever a new merge request is submitted, he'll go through it with a fine-tooth comb. If he's not happy, he'll reject it.

People don't like Mr Jenkins; he's pedantic.

Not only does he run all of our tests (which the developer will have run before submitting a merge request, but you never know...), he also performs style checks and static analysis, depending on the language.

Once he's done with his tests, Mr. Jenkins will vote on the merge request:

[Image: Mr Jenkins' vote on the merge request]

Another failed build sir. Shall I prepare afternoon tea?

Implementation
This integration is actually very simple to implement thanks to a great Jenkins plugin: the Gitlab Merge Request Builder Plugin. Once you create Mr. Jenkins' account on your Gitlab site and install the plugin, you can have automatic builds running in 5 minutes.

Why You Should Bother
If you run Jenkins builds against code before it gets merged into a central branch, broken code can be fixed before it gets accepted!

You Have to Work on Stories

[Image: JIRA <-> Gitlab]

We don't think we're unique in planning our work.

We hope we're not unique in planning our work...

We use JIRA to write stories which our developers work on, along with the occasional bug, and we've integrated our JIRA stories into Gitlab.

Gitlab can be customised to perform pre-receive checks on any commits pushed to the server.

We've written a Ruby script that checks every commit pushed to our gitlab server, and confirms the commit message mentions a JIRA issue that's in progress or blocked.

If no issue is mentioned, the push is rejected. Ouch.

If an issue is mentioned, the script goes ahead and comments on the mentioned JIRA issue to say that a commit has been made.

[Image: CD message in Gitlab]

Mr Jenkins always cleans up after himself, as any good butler should.

  • Now we can track all the work that's been done on a story in JIRA.
  • Now we can see the JIRA issue that each commit relates to in Gitlab.

Awesome.

Implementation
JIRA has a well-documented API which is pleasant to use.

You need to update your pre-receive hook file in:

${Git installation directory}/gitlab-shell/hooks/

Then, check if the commit message mentions a JIRA issue:

old_rev = ARGV[1]
new_rev = ARGV[2]

# List every commit in this push.
revision_list = IO.popen(%W(git rev-list ^#{old_rev} #{new_rev})).read

revision_list.split("\n").each do |rev|
  # Grab the full commit message and look for a JIRA issue ID.
  commit_msg = IO.popen(%W(git log --format=%B -n 1 #{rev})).read
  issues = commit_msg.scan(/(jira[- ]?\d+)/i)
  if issues.empty?
    puts "No issue mentioned"
    exit 1
  end
end

Once you've found issue IDs, check if the issues are open using the JIRA API, and then post a comment. These steps are long, but not complicated.
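
As a rough illustration (not our exact script), here's a minimal sketch of that JIRA side in Ruby, using JIRA's REST API v2 and assuming a hypothetical host, bot account, and helper names:

require 'net/http'
require 'json'
require 'uri'

# Hypothetical JIRA host and bot credentials -- replace with your own.
JIRA_BASE = 'https://jira.example.com'
JIRA_USER = 'mr.jenkins'
JIRA_PASS = 'secret'

def jira_request(req)
  uri = URI(JIRA_BASE)
  req.basic_auth(JIRA_USER, JIRA_PASS)
  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
    http.request(req)
  end
end

# Return the issue's current status name (e.g. "In Progress"), or nil if the lookup fails.
def issue_status(issue_id)
  uri = URI("#{JIRA_BASE}/rest/api/2/issue/#{issue_id}?fields=status")
  res = jira_request(Net::HTTP::Get.new(uri))
  return nil unless res.is_a?(Net::HTTPSuccess)
  JSON.parse(res.body)['fields']['status']['name']
end

# Post a comment on the issue saying a commit has been made.
def comment_on_issue(issue_id, message)
  uri = URI("#{JIRA_BASE}/rest/api/2/issue/#{issue_id}/comment")
  req = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
  req.body = { body: message }.to_json
  jira_request(req)
end

In the hook, the push would be rejected unless issue_status comes back as something like "In Progress" or "Blocked" (the exact status names depend on your JIRA workflow), and comment_on_issue would then be called for each commit.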

An Easy Shortcut for Lazy Developers
It can be a bit of a pain remembering the exact ID of your issue (and typing it in every time), so our developers just put the JIRA issue ID in the branch name and use a handy script we found here to automatically put the branch name in each commit message.

#!/bin/sh
# prepare-commit-msg hook: append the branch name to the first line of the commit message.
BRANCH_NAME=$(git symbolic-ref --short HEAD)
if [ -n "$BRANCH_NAME" ] && [ "$BRANCH_NAME" != "master" ]; then
  sed -i.bak -e "1s/$/ [$BRANCH_NAME]/" "$1"
fi
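
To use it, each developer saves the script as .git/hooks/prepare-commit-msg in their local clone and makes it executable (chmod +x). Git passes the path of the commit message file as the hook's first argument, which is what the sed line edits in place.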

Why You Should Bother
Having your git push rejected seems like a huge irritation, but most people are used to a little rejection (#risqué joke), and after a couple of failures, you'll remember to mention the right ID.

This results in a huge amount of time saved for planners, because they now know exactly what work has been done on each story. We also have a great, interlocked history between our revision control system and our planning system.

All Over the Shop
All of this data is a pain to manage and compare when it's spread across so many products.

Happily enough, we have just the ticket to fix that... There'll be another blog post soon on how we've integrated our CD tooling into Logentries and how we're using it.

More Stories By Trevor Parsons

Trevor Parsons is Chief Scientist and Co-founder of Logentries. Trevor has over 10 years experience in enterprise software and, in particular, has specialized in developing enterprise monitoring and performance tools for distributed systems. He is also a research fellow at the Performance Engineering Lab Research Group and was formerly a Scientist at the IBM Center for Advanced Studies. Trevor holds a PhD from University College Dublin, Ireland.
