How to Use Docker
By Ron Gidron

Docker is on a roll. In the last few years, this container platform has become immensely popular with development teams, especially given how well it fits agile projects and continuous delivery. In this article, I want to take a brief look at how you can use Docker to accelerate and streamline the software development lifecycle (SDLC).

First, however, a brief introduction. The whole idea of Docker is that developers can easily ship applications inside 'software containers' which can then be deployed and run anywhere. Let's imagine you develop an application on your laptop, where it works perfectly. Then you push it into a test or production environment; you've chosen the right stack, the right language and the right version. But it doesn't work. Why? Because it's not the same environment.

Maybe you used a new version of a library, but the ops guy tells you that you can't use it because all the other applications running on the server will break. So there's a lot of back and forth between ops and your developers. It delays projects, costs money and is frustrating for everyone involved.

When you develop with Docker, everything is packaged inside a container, or inside several containers that talk to each other. Docker packages your software together with its dependencies, isolating it from the environment it runs on. A container is self-sufficient: you simply push it to another environment, and the ops team doesn't need to care what's inside it or how it was built. So how is a 'software container' like this any different from a standalone computer? Or a virtual machine, for that matter?

Containers are lightweight because they don't need the extra load of a hypervisor, but run directly within the host machine's kernel. This means you can run more containers on a given hardware combination than if you were using virtual machines. You can even run Docker containers within host machines that are actually virtual machines. All of this makes containers - and Docker - ideal for continuous integration and continuous delivery workflows.
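To make the packaging step concrete, here is a minimal sketch of what a container definition might look like. The application name, base image and port are illustrative assumptions, not details from a real project:

    # Dockerfile - illustrative example; app name, base image and port are assumed
    # Pin the exact runtime the application was built against
    FROM python:3.11-slim
    WORKDIR /app
    # Dependencies are installed inside the image, not on the host
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    COPY . .
    EXPOSE 8000
    CMD ["python", "app.py"]

Building and running it is then a matter of two commands, and the resulting image behaves the same on a laptop, a test server or a production host:

    docker build -t myapp:1.0 .
    docker run -d -p 8000:8000 myapp:1.0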

Docker itself uses a client-server architecture. The client talks to the daemon, which does the heavy lifting of building, running and distributing the containers. The Docker client and daemon can run on the same system, or you can connect the client to a remote daemon. The Docker client and daemon communicate using a REST API, over UNIX sockets or a network interface.
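As a rough illustration of that split, the same daemon can be reached through its REST API on the local UNIX socket or, if it has been configured to listen on the network, from a client on another machine. The hostname and the API version in the path below are assumptions and may differ on your installation:

    # Ask the local daemon for its running containers via the REST API over the UNIX socket
    curl --unix-socket /var/run/docker.sock http://localhost/v1.41/containers/json

    # Point the Docker client at a remote daemon over the network instead
    docker -H tcp://build-server.example.com:2375 ps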

Orchestrating the Management and Deployment of Docker Containers
Despite all its advantages, Docker still needs to be managed: someone has to build, run, assign and stop the containers, as well as handle many additional administrative tasks. Docker also still needs to be aware of the other environments it interacts with, including the infrastructure layer and the traditional software services that are not running inside containers. Time developers spend on tasks like these is overhead that can result in missed deadlines, while time ops spends on them creates pipeline bottlenecks right before production deployments, which can in turn lead to production failures.
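For reference, the day-to-day lifecycle work this refers to looks something like the following when done by hand with the Docker CLI (the image and container names are illustrative):

    docker build -t myapp:1.1 .                                 # build a new image version
    docker stop myapp-prod                                      # stop the running container
    docker rm myapp-prod                                        # remove it
    docker run -d --name myapp-prod -p 8000:8000 myapp:1.1      # start the new version

Multiply that across dozens of services and environments and the need for automation becomes clear.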

There is a solution that lets you orchestrate all the moving parts - people, process and technology - involved in managing and deploying to Docker containers: release automation. Automic's built-in container blueprint provisioning, together with its Docker actionpack, allows you to blueprint entire Docker systems, build visual workflows and automate container builds, maintenance, provisioning, configuration and most administration tasks. This not only increases productivity for developers and administrators, it also lowers the risk of errors. One example might be automatically ensuring the underlying infrastructure has enough capacity to support the projected container workloads, or that certain container versions are always rolled out with matching versions of external service packages.

Automic's Docker environment blueprint provisioning capability and actionpack combine an integrated application packaging system, smart deployment models and out-of-the-box actions for common deployment tasks with robust workflow design and high-volume execution. The Automic Docker package allows users to build, provision, configure and manage Docker containers as part of an automated application deployment process.

Ultimately, this accelerates deployments to Docker containers, ensures the quality of container deployments and minimizes management overhead to help both development and operations grow the business.


About Automic

Automic, a leader in business automation, helps enterprises drive competitive advantage by automating their IT factory - from on-premise to the Cloud, Big Data and the Internet of Things.

With offices across North America, Europe and Asia-Pacific, Automic powers over 2,600 customers including Bosch, PSA, BT, Carphone Warehouse, Deutsche Post, Societe Generale, TUI and Swisscom. The company is privately held by EQT. More information can be found at www.automic.com.
