By Bernard Golden
October 27, 2014 10:00 AM EDT
Looking to ease application development and deployment while retaining maximum flexibility in where you deploy?
If you work in technology, you'd have to have been living under a rock not to have heard about Docker. In a nutshell, Docker provides a lightweight container for code that can be installed onto a Linux system, providing both an execution environment for applications and partitioning that securely segregates sets of application code from one another. While this high-level description doesn't sound that exciting, Docker addresses three key issues confronting application developers:
- Efficient resource use: One of the problems confronting IT organizations is how to get the most benefit from computing resources; in practice, that means raising server utilization so that hardware cost and power are spent on useful computing rather than on keeping idle servers running. The previous solution to this issue was virtualization, which enabled a single server to support multiple virtual machines, each containing an operating system and a software payload. While virtualization helps address utilization, running multiple virtual machines, each with its own operating system, means a large share of the server's resources is tied up running operating systems rather than application code, which is where all the value resides. Said another way, the operating system is a necessary evil, but it's not where business value lives. A solution that reduces the proportion of the server's overall processing capacity devoted to running operating systems would be extremely valuable. Docker is that solution -- it requires only one operating system per server and uses containers to provide the segregated execution environments that individual virtual machines previously provided. My colleague Phil Whelan used the analogy of a server as a jar, and of choosing sand rather than marbles to fill the jar most efficiently; in the same way, containers are more efficient at optimizing overall server use and waste less computing capacity (i.e., leave less "wasted space in the jar") than virtualization.
- Workload encapsulation: A container offers exactly what it sounds like -- an environment to hold something. In the case of Docker, it holds a set of executable code that runs inside the Docker container. The container thus encapsulates the executing code and can be transferred from one location to another. This simplifies the application lifecycle: containers can be passed from one group to another with no need for each group to rebuild the same application in its own environment through recompiling and repeated configuration (the sketch after this list illustrates both this point and the next).
- Workload portability: It's a fact of life that businesses use a variety of application deployment environments -- a single company may deploy applications into an on-premises VMware vSphere environment, a virtual private cloud run by an OpenStack-based provider, and Amazon Web Services. Each uses a different hypervisor and a different set of operational controls, which presents a challenge to organizations that want greater flexibility and choice in where workloads run. The previous vendor solution to this issue was OVF -- the Open Virtualization Format -- which promised workload portability but in practice ended up as a mechanism for transporting proprietary virtual machine images along with operational metadata. That reduced the vision of true workload portability to vendor-constrained islands of technology homogeneity, which didn't really address end users' objectives at all. By contrast, Docker containers are easily transported and run in any hypervisor environment that supports Linux -- which is all of them. Docker is therefore a much better solution to workload portability, and it addresses a key user desire. You'll hear much more about how Docker enables workload portability over the coming months and years.
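To make the encapsulation and portability points concrete, here is a minimal sketch. The image name, file names, and port are hypothetical, chosen purely for illustration; the docker commands themselves (build, run, save, load) are standard Docker CLI operations.

```
# Encapsulation: a Dockerfile packages the application and its
# dependencies into a single image (hypothetical Python web app):
#
#   FROM python:2.7
#   COPY app.py requirements.txt /app/
#   RUN pip install -r /app/requirements.txt
#   CMD ["python", "/app/app.py"]

docker build -t myorg/inventory-api .            # build the image from the Dockerfile
docker run -d -p 8080:8080 myorg/inventory-api   # run it as an isolated container

# Portability: export the image as a tarball and load it on any Linux
# host running Docker -- a vSphere VM, an OpenStack instance, or AWS EC2.
docker save -o inventory-api.tar myorg/inventory-api
docker load -i inventory-api.tar                 # on the destination host
docker run -d -p 8080:8080 myorg/inventory-api
```

The same image runs unmodified in each environment; all the host needs is a Linux kernel and the Docker engine.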
Given the advantages Docker offers, it's easy to understand why it has been so avidly embraced by the vendor and user communities. It addresses efficient resource use, workload encapsulation, and workload portability.
On the other hand, Docker does not solve all application problems. In fact, its benefits expose a significant issue: if it's easier to run and distribute workloads, then efficient creation and management of application workloads is all the more important. And Docker does nothing to ease application creation and management -- it merely does a fantastic job of deploying workloads once they are created.
And application creation and management is where Stackato shines. Its Cloud Foundry-based framework accelerates application development and management by providing easy-to-use code deployment inside a Docker container, as well as predefined, managed application data storage (e.g., a database service). Moreover, Stackato makes it easy to grow and shrink the pool of Docker containers within which an application operates, as the sketch below suggests.
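As a rough illustration of that developer experience, the sketch below assumes Stackato's Cloud Foundry-derived command-line client and its stackato.yml application configuration file. The application name, the specific configuration keys, and the instance count are assumptions made for illustration, not confirmed syntax from any particular Stackato release.

```
# stackato.yml -- hypothetical application configuration
# (key names below are assumptions, not verified Stackato syntax)
#
#   name: inventory-api
#   mem: 256M
#   instances: 2      # size of the pool of Docker containers for this app

stackato push         # deploy the code; Stackato builds and starts the
                      # app inside managed Docker containers
```

Growing or shrinking the container pool then becomes a configuration change rather than a manual provisioning exercise.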
For organizations looking to ease application development and deployment while retaining maximum flexibility in deployment location, combining Docker and Stackato is the perfect solution. In fact, ActiveState agrees with this so much that it has integrated Docker into its Stackato product.
So if you're a company or IT organization looking to address the issue of workload portability, Docker plus Stackato is a good place to start your search.
Source: ActiveState, originally published here.