
Docker Introduces Docker Hub and Official Repo Program

(DockerCon) – Docker, Inc., the commercial entity behind the open source Docker project, today announced the launch of Docker Hub and the Docker Official Repository program. Docker Hub provides cloud-based platform services for distributed applications, including container image distribution and change management, user and team collaboration, lifecycle workflow automation and third-party service integrations. A major feature of Docker Hub is the availability of optimized, maintained and supported “Dockerized” applications through the Docker Official Repository program.

Docker is an open platform for developers and sysadmins to build, ship and run distributed applications. Consisting of the Docker Engine, the de facto container standard, and Docker Hub, a cloud-based service for users, content and workflows, Docker enables applications to be quickly assembled from components and eliminates the friction between environments. As a result, IT can ship faster and run the same app, unchanged, on laptops, data center VMs or the cloud.
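
To make that portability concrete, the sketch below shows roughly what a “Dockerized” application looks like: a Dockerfile assembles the app from an existing component (a public base image), and the resulting container runs the same way on a laptop, a data center VM, or the cloud. The base image, file name and commands here are illustrative assumptions, not details from the announcement.

    # Illustrative Dockerfile: build the application on top of an existing
    # component, a public Ubuntu base image from the Docker Hub Registry.
    FROM ubuntu:14.04

    # Install the application's runtime dependency (Python, as an example).
    RUN apt-get update && apt-get install -y python

    # Copy the (hypothetical) application code into the image.
    COPY app.py /srv/app.py

    # Define how the container runs the application.
    CMD ["python", "/srv/app.py"]

Building the image once ("docker build -t example/app .") and running it ("docker run example/app") produces the same application in any environment with the Docker Engine installed.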

“Enterprise organizations are seeking and sometimes struggling to make applications and workloads more portable and distributed in an effective, standardized and repeatable way,” said Jay Lyman, senior analyst at 451 Research. “Just as GitHub stimulated collaboration and innovation by making source code shareable, Docker Hub, Official Repos and commercial support are helping enterprises to answer this challenge by improving the way they package, deploy and manage applications.”

Docker Hub’s services help developers and sysadmins build, ship and run distributed applications built on Docker Engine. Major features include:

  • An integrated Console for managing users, teams, containers, repositories, and workflows;
  • The Docker Hub Registry, offering more than 14,000 “Dockerized” applications, available to all users as building blocks for their own applications;
  • Collaboration tools, enabling users to manage and share their applications through both public and private repositories, and to invite collaborators to participate in any stage of the application lifecycle;
  • The Automated Build Service, which keeps applications up-to-date by automatically rebuilding and updating an application’s public or private repository whenever the source code is updated on GitHub or Atlassian Bitbucket. Over 25 percent of the more than 14,000 Dockerized applications in the Docker Hub Registry are now created using Automated Builds, providing both automation and end-user assurance of container origin;
  • The Webhooks service, which enables users to automate repetitive workflows for build pipelines or continuous deployment (a deployment sketch follows this list). Interoperable with any RESTful API, webhooks enable organizations to take advantage of the web APIs published by any service or software package, such as GitHub, AWS, or Jenkins; and
  • The Docker Hub API, which includes a user authentication service, so that third-party applications and services can gain authenticated access to applications in a user’s public and private repositories. Third-party services that have already integrated with the Docker Hub API include AWS Elastic Beanstalk, Deis, Google Compute Engine, Orchard, Rackspace, Red Hat, Tutum, and many others.
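
As a hedged illustration of the webhook-driven continuous deployment described above, the shell sketch below shows the kind of redeploy step a webhook receiver might run once Docker Hub finishes an automated build and pushes a fresh image. The image and container names are hypothetical, and the HTTP endpoint that actually receives the webhook POST is omitted.

    # Hypothetical redeploy script triggered by a Docker Hub webhook
    # after an automated build pushes a new image (names are illustrative).
    docker pull example/webapp:latest

    # Replace the running container with one started from the fresh image.
    docker stop webapp || true
    docker rm webapp || true
    docker run -d --name webapp example/webapp:latest

In practice the receiving endpoint would validate the webhook payload, which Docker Hub delivers as a JSON POST to a URL the repository owner configures, before redeploying.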

"As heavy users of Docker ourselves, we see first hand how Docker shares the Atlassian focus to provide tools to developers for better team collaboration and productivity,” said Eric Wittman, general manager of the developer tools business unit at Atlassian. “We are excited to be integrating Bitbucket and Docker Hub, combining the best of both the Atlassian and Docker platforms. Users of Bitbucket will now be able to build Dockerized versions of their applications automatically from their repos."

A major feature of Docker Hub is Official Repositories – “Dockerized” applications that are optimized, maintained, and supported, and that are available to all Docker Hub users. Initially encompassing the 13 most-searched-for applications in the Docker Hub Registry – including CentOS, MongoDB, MySQL, Nginx, Redis, Ubuntu, and WordPress – the program is open to any community group or software ISV willing to commit resources to ongoing maintenance of an application according to the program’s guidelines. For more information, contact [email protected].
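
As a small usage sketch, pulling and running one of these Official Repositories needs only the standard Docker CLI; the repository name below ("nginx", matching the Nginx entry above) and the container name are illustrative.

    # Pull the official Nginx repository from Docker Hub and run it,
    # publishing the container's port 80 on host port 8080.
    docker pull nginx
    docker run -d --name web -p 8080:80 nginx

Because Official Repositories are maintained and supported, the same two commands yield an up-to-date, known-good image on any host running the Docker Engine.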

“The DevOps movement is transforming the vast majority of the software industry; however, its impact is particularly disruptive in high-performance Big Data processing,” said Lonne Jaffe, CEO, Syncsort. “This is why we moved quickly to launch Ironcluster ETL, Docker Edition in the Docker Hub, along with a 30-day free trial. Our customers and partners can now procure and rapidly deploy Ironcluster ETL in scalable, lightweight and portable Docker containers, both on-premises and in the cloud, taking advantage of all of Docker’s platform services.”

Users can sign up for free Docker Hub accounts today at

Additional Resources:

Docker blog post

Docker website

Sign up for a Docker Hub account

Take the Docker Tutorial

Atlassian blog post

Syncsort blog post

About Docker, Inc.

Docker, Inc. is the commercial entity behind the open source Docker project, and is the chief sponsor of the Docker ecosystem. Docker is a platform for developers and sysadmins to build, ship, and run distributed applications. With Docker, IT organizations shrink application delivery from months to minutes, frictionlessly move workloads between data centers and the cloud, and slash infrastructure costs by 50 percent or more. Inspired by an active community and by transparent, open source innovation, Docker has been downloaded 2.75+ million times and is used by thousands of the world’s most innovative organizations, including eBay, Baidu, Yelp, Spotify, Yandex, and Cambridge HealthCare. Docker’s rapid adoption has catalyzed an active ecosystem, resulting in more than 14,000 “Dockerized” applications and integration partnerships with AWS, Red Hat, Google, Canonical, IBM, OpenStack, Rackspace, and Cloud Foundry.

Docker, Inc. is venture backed by Greylock Partners (Jerry Chen), Benchmark (Peter Fenton), Trinity Ventures (Dan Scholnick), AME Cloud Ventures (Yahoo! Founder Jerry Yang), Insight Venture Partners, Y Combinator, and SV Angel (Ron Conway).

