
Docker + Stackato: The Perfect Workload Portability Solution

Looking to ease application development and deployment while retaining maximum flexibility in deployment location?

If you work in technology, you'd have to have been living under a rock not to have heard about Docker. In a nutshell, Docker provides a lightweight container for code that can be installed on a Linux system, providing both an execution environment for applications and partitioning that securely segregates sets of application code from one another. While this high-level description doesn't sound that exciting, Docker addresses three key issues confronting application developers (a brief sketch follows the list below):

  • Efficient resource use: One of the problems confronting IT organizations is how to get the most benefit from computing resources; in other words, how to raise server utilization so that a server's cost and power go to useful computing rather than to a machine that is running but doing no useful work. The earlier solution was virtualization, which enabled a single server to support multiple virtual machines, each containing an operating system and a software payload. While virtualization helps raise utilization, running multiple virtual machines, each with its own operating system, ties up a large share of the server's resources running operating systems rather than application code, which is where all the value resides. Said another way, the operating system is a necessary evil; it's not where business value resides. A solution that reduces the proportion of a server's processing capacity devoted to running operating systems would be extremely valuable. Docker is that solution: it requires only one operating system per server and uses containers to provide the segregated execution environments that individual virtual machines previously provided. My colleague Phil Whelan likened a server to a jar that is filled most efficiently with sand rather than marbles; just so, containers optimize overall server use and waste less computing capacity (i.e., leave less "wasted space in the jar") than virtual machines do.
  • Workload encapsulation: A container offers exactly what it sounds like: an environment to hold something. In the case of Docker, it holds a set of executable code that runs inside the container. The container thus encapsulates the executable code and can be transferred from one location to another. This simplifies the application lifecycle: containers can be passed from one group to another, with no need for each group to recreate the same application in a different environment through recompilation and repeated configuration.
  • Workload portability: It's a fact of life that businesses use a variety of application deployment environments: a single company may deploy applications into an on-premises VMware vSphere environment, a virtual private cloud run by an OpenStack-based provider, and Amazon Web Services. Each uses a different hypervisor and a different set of operational controls, which presents a challenge to organizations that want greater flexibility and choice in workload deployment. The previous vendor solution to this problem was OVF (the Open Virtualization Format), which promised workload portability but in practice ended up being a mechanism for transporting proprietary virtual machine images along with operational metadata. This reduced the vision of true workload portability to vendor-constrained islands of technology homogeneity, which didn't really address end-user objectives at all. By contrast, Docker containers are easily transported and run in any hypervisor environment that supports Linux, which is all of them. Docker is therefore a much better solution to workload portability and addresses a key user desire. You'll hear much more about how Docker enables workload portability over the coming months and years.
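
To make the container idea concrete, here is a minimal sketch using the Docker SDK for Python (the docker package, installed with pip install docker); the image name and command are illustrative choices, not part of any particular product. It runs a short-lived workload in a container: nothing boots except the process itself, which shares the host's Linux kernel.

    import docker

    # Connect to the local Docker daemon (assumes Docker is installed and running).
    client = docker.from_env()

    # Run a command in a lightweight container: no guest OS boots;
    # the process runs isolated but shares the host's Linux kernel.
    output = client.containers.run("python:3-slim", "python -c 'print(42)'", remove=True)
    print(output.decode())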

Given these advantages, it's easy to understand why Docker has been so avidly embraced by the vendor and user communities: it uses computing resources efficiently and provides far better workload portability, as the sketch below shows.
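
As a rough illustration of that portability, this sketch (again using the Docker SDK for Python, with an illustrative image name, and assuming the image is already present locally) exports an image to a tar archive that can be copied to and loaded on any other Linux host, regardless of hypervisor or cloud.

    import docker

    client = docker.from_env()

    # Save the image, with the application code it encapsulates, to a tar
    # archive; the archive can be moved to any other Linux host (vSphere,
    # OpenStack, AWS) and loaded there unchanged.
    image = client.images.get("python:3-slim")
    with open("workload.tar", "wb") as f:
        for chunk in image.save():
            f.write(chunk)

    # On the destination host, the archive is restored like so:
    with open("workload.tar", "rb") as f:
        client.images.load(f.read())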

On the other hand, Docker does not solve all application problems. In fact, its benefits expose a significant issue: if it's easier to run and distribute workloads, then efficient creation and management of application workloads is all the more important. And Docker does nothing to ease application creation and management -- it merely does a fantastic job of deploying workloads once they are created.

And application creation and management is where Stackato shines. Its Cloud Foundry-based framework accelerates application development and management by providing easy-to-use code deployment inside a Docker container, as well as predefined and managed application data storage (i.e., databases). Moreover, Stackato makes it easy to grow and shrink the pool of Docker containers within which an application operates, as the sketch below illustrates.
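
Stackato performs this scaling automatically; its internal mechanism isn't shown here. Purely to illustrate the underlying idea, this sketch grows and shrinks a pool of containers by hand with the Docker SDK for Python; the image name and pool sizes are hypothetical placeholders.

    import docker

    client = docker.from_env()

    def scale_pool(client, image, desired):
        """Grow or shrink the set of running containers for one application image."""
        pool = [c for c in client.containers.list()
                if image in (c.image.tags or [])]
        while len(pool) < desired:        # grow: start more containers
            pool.append(client.containers.run(image, detach=True))
        while len(pool) > desired:        # shrink: stop and remove extras
            extra = pool.pop()
            extra.stop()
            extra.remove()
        return pool

    # Scale up to meet demand, then back down to release resources.
    scale_pool(client, "my-app:latest", 5)
    scale_pool(client, "my-app:latest", 2)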

For organizations looking to ease application development and deployment while retaining maximum flexibility in deployment location, combining Docker and Stackato is the perfect solution. In fact, ActiveState believes this so strongly that it has integrated Docker into its Stackato product.

So if you're a company or IT organization looking to address the issue of workload portability, Docker and Stackato are a good place to start your search.

Source: ActiveState; originally published here.

More Stories By Bernard Golden

Bernard Golden has vast experience working with CIOs to incorporate new IT technologies and meet their business goals. Prior to joining ActiveState, he was Senior Director, Cloud Computing Enterprise Solutions, for Dell Enstratius. Before joining Dell Enstratius, Bernard was CEO of HyperStratus, a Silicon Valley cloud computing consultancy that focused on application security, system architecture and design, TCO analysis, and project implementation. He is also the Cloud Computing Advisor for CIO Magazine and was named a "Top 50 Cloud Computing Blog" by Sys-Con Media. Bernard's writings on cloud computing have been published by The New York Times and the Harvard Business Review, and he is the author of Virtualization for Dummies, Amazon Web Services for Dummies, and co-author of Creating the Infrastructure for Cloud Computing. Bernard has an MBA in Business and Finance from the University of California, Berkeley.


