
SDN Journal: Article

Bringing Software-Defined to the Data Center

Lower costs and increase control

"Software-defined," like any new trend that technology companies rush to attach themselves to, has suffered from marketing hype. Since VMware's acquisition of Nicira in mid-2012, most traditional infrastructure vendors across compute, networking, and storage have built messaging around how software-defined fits into their product strategies.

But software-defined is not a traditional concept. Since the transition from mainframe to distributed computing, which coincided with the rise of networking, most technologies in the data center have been hardware-specific. For many years, the way to put the right amount of intelligence in the right place has been specialized hardware.

Software-defined is fairly self-explanatory: the value is in the software. One of the biggest benefits is the use of standard hardware at a fraction of the cost of vendor-specific hardware, which has been the norm for data centers since the beginning of the 21st century. Standard hardware, from servers to networking devices, now has so many resources available that specialized hardware no longer provides the differentiation it once did.

Freedom from hardware also opens up freedom to extend the software-defined data center (SDDC) outside the walls of the data center. SDDC helps organizations deliver a modern private cloud in the same model that large operators like Amazon and Rackspace use to deliver public cloud. What's more, using the right software-defined technologies enables hybrid cloud, the panacea for most enterprises. This is one of the main reasons that open source technologies, and those with very strong standards, are emerging in the software-defined space.

That said, software-defined has the biggest opportunity to benefit private data centers, where the majority of applications simply cannot run in the public cloud for reasons of policy or preference. Here are some best practices for evaluating how software-defined can benefit your private data center architecture:

Take advantage of your already efficient procurement: Odds are your organization has a go-to vendor or reseller for servers. Whether from HP, Dell, or a white-box vendor like Supermicro, the premium paid on server infrastructure is much smaller than on storage. You may even have a volume purchase agreement, making hardware for software-defined storage even more affordable. Buying infrastructure for SDDC is no different from buying through your existing channels for standard servers and networking equipment.

Insist on standards compliance to avoid future lock-in: SDDC is a real opportunity to reduce or eliminate lock-in among the technologies in your data center. The best software-defined solutions are based on industry standards, so the freedom to change is retained. This is more than basic interoperability through a standard API, which still relies on the software vendor to keep up with changes. Avoid software-defined solutions that are vendor-specific and limit your flexibility to integrate and innovate as this market continues to evolve.
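The lock-in point above can be sketched in code. The interface and backend names below are hypothetical, not any particular vendor's or standard's API; the idea is simply that application code written against a standard interface can swap storage backends without being rewritten:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Hypothetical standard object-storage interface (illustrative only)."""
    @abstractmethod
    def put(self, name: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, name: str) -> bytes: ...

class VendorAStore(ObjectStore):
    """Stand-in for one vendor's backend; a real one would call its API."""
    def __init__(self):
        self._objects = {}
    def put(self, name, data):
        self._objects[name] = data
    def get(self, name):
        return self._objects[name]

class VendorBStore(VendorAStore):
    """A second backend honoring the same contract; swapping it in is free."""

def archive_report(report: bytes, store: ObjectStore) -> None:
    # The application depends only on the standard interface, so changing
    # vendors changes one constructor call, not the application itself.
    store.put("q3-report", report)
```

Contrast this with coding directly to one vendor's proprietary client library, where every call site is a migration cost when you want to change.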

Have a preference for open source technologies: Once dismissed by their proprietary competitors as immature, open source operating systems, middleware, application frameworks, and databases are now standards in the data center. The same trend will hold true for software-defined solutions. Using an open source technology does not preclude organizations from working with commercial vendors to support the success of open source in the data center architecture.

Where are the biggest opportunities to use software-defined? We believe it is in storage, an area where very large premiums have been paid for many years on the perception that specialized hardware was the only way to keep data safe and available. The biggest operators have proven the opposite: they serve millions of concurrent users without downtime or data loss while using standard hardware and intelligent software.
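A minimal sketch of how intelligent software keeps data safe on standard hardware: each object is hashed to several independent commodity nodes, so no single disk or server failure loses data. This is an illustrative toy, not the placement algorithm of any real system (OpenStack Swift, for example, uses a far more sophisticated ring with zones and weights):

```python
import hashlib

def place_replicas(object_name: str, nodes: list, replicas: int = 3) -> list:
    """Deterministically pick `replicas` distinct nodes for an object.

    Toy placement: rank nodes by a hash of (node, object) and take the
    top few, so every object lands on several independent machines.
    """
    ranked = sorted(
        nodes,
        key=lambda n: hashlib.sha256(f"{n}/{object_name}".encode()).hexdigest(),
    )
    return ranked[:replicas]

nodes = [f"node{i}" for i in range(8)]
targets = place_replicas("invoice-2014.pdf", nodes)
# If one target node fails, two copies survive on other commodity nodes,
# and the software re-replicates onto the remaining cluster:
survivors = place_replicas("invoice-2014.pdf",
                           [n for n in nodes if n != targets[0]])
```

The durability comes from the software's placement and repair logic, not from any premium feature of the individual servers, which is exactly the economic argument for software-defined storage.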

The reality is that data is growing too fast and must be retained too long, often at a cost that must be as close to zero as possible. Unstructured data is growing fastest, fueled by SaaS applications and the shift to mobile devices. Every private data center today is in direct competition with the operators of large public clouds. Internal users demand the flexibility and operating costs the big operators have proven possible, so private operators must use the same software-defined strategy to remain competitive.

"Utility computing" was a hyped trend at the start of this century that arrived before its time, and some might place SDDC in the same category: all hype. Objectively, SDDC is a natural extension of cloud, and should prove just as disruptive to data center architecture as cloud has been. Software for compute, networking, and storage will need to be as standard as the servers you buy today in order to fit into the data center architecture of the future.

More Stories By Joe Arnold

Joe Arnold founded SwiftStack to deploy high-scale, open-source cloud storage systems using OpenStack. He managed the first public OpenStack Swift launch independent of Rackspace, and has subsequently deployed multiple large-scale cloud storage systems. He is currently building tools to deploy and manage OpenStack Swift with his firm, SwiftStack.

