SDN Journal: Article

Bringing Software-Defined to the Data Center

Lower costs and increase control

"Software-defined," like any new trend that technology companies rush to attach to has suffered from marketing hype. Starting in mid-2012 with the acquisition of Nicera by VMware, most traditional infrastructure technology vendors across compute, networking, and storage have some messaging around how software-defined fits into their product strategy.

But software-defined is not a traditional concept. Since the transition from mainframe to distributed computing, which coincided with the rise of networking, most data center technologies have been tightly bound to specific hardware. For many years, the way to put the right amount of intelligence in the right place to execute the functionality has been specialized hardware.

Software-defined is fairly self-explanatory: the value is in the software. One of the biggest benefits is the use of standard hardware at a fraction of the cost of the vendor-specific hardware that has been the norm in data centers since the beginning of the 21st century. Standard hardware, from servers to networking devices, now has so many resources available that specialized hardware no longer provides the differentiation it once did.

Freedom from hardware also opens up freedom to extend the software-defined data center (SDDC) outside the walls of the data center. SDDC helps organizations deliver a modern private cloud in the same model that large operators like Amazon and Rackspace use to deliver public cloud. What's more, choosing the right software-defined technologies enables hybrid cloud, the goal for most enterprises. This is one of the main reasons that open source technologies, and those with very strong standards, are emerging in the software-defined space.

That said, software-defined has the biggest opportunity to benefit private data centers, where the majority of applications simply cannot run in the public cloud based on policy or preference. Here are some best practices when looking at how software-defined can benefit your private data center architecture:

Take advantage of your already efficient procurement: Odds are your organization has a go-to vendor or reseller of servers. Whether from HP, Dell, or a white-box vendor like Supermicro, the premium paid on server infrastructure is much smaller than on storage. You may even have a volume purchase agreement, making hardware for software-defined storage even more affordable. Buying infrastructure for SDDC is no different from buying standard servers and networking equipment through your existing channels.

Insist on standards compliance to avoid future lock-in: SDDC is a real opportunity to reduce or eliminate lock-in among the technologies used in your data center. The best software-defined solutions are based on industry standards, so the freedom to change is retained. This means more than basic interoperability through a standard API, since that still relies on the software vendor to keep up with changes. Avoid software-defined solutions that are vendor-specific and limit your flexibility to integrate and innovate as this market continues to evolve.
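To make the lock-in point concrete, here is a minimal sketch (the class and method names are hypothetical, not from any specific product) of application code written against a narrow, standards-shaped storage interface rather than a vendor SDK. Any backend that speaks the same interface, whether an in-memory stand-in or a Swift- or S3-backed implementation, can be swapped in without touching business logic:

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Minimal, standards-shaped object storage interface (hypothetical)."""

    @abstractmethod
    def put(self, container: str, name: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, container: str, name: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in backend for illustration; a vendor- or standard-API-backed
    class implementing the same interface could replace it."""

    def __init__(self) -> None:
        self._objects: dict[tuple[str, str], bytes] = {}

    def put(self, container: str, name: str, data: bytes) -> None:
        self._objects[(container, name)] = data

    def get(self, container: str, name: str) -> bytes:
        return self._objects[(container, name)]


def archive_report(store: ObjectStore, payload: bytes) -> bytes:
    # Application code depends only on the interface, not on any one
    # vendor's SDK, so changing backends never touches this function.
    store.put("reports", "q3-summary", payload)
    return store.get("reports", "q3-summary")
```

The design choice being illustrated: keep the surface your applications depend on as small and standard as possible, so switching software-defined vendors is a backend swap, not a rewrite.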

Have a preference for open source technologies: Once dismissed by their proprietary competitors as immature, open source operating systems, middleware, application frameworks, and databases are now standards in the data center. The same trend will hold true for software-defined solutions. Using an open source technology does not preclude organizations from working with commercial vendors to support the success of open source in the data center architecture.

Where are the biggest opportunities to use software-defined? We believe it is in storage, a solution area where very large premiums have been paid for many years, based on the perception that specialized hardware was the only way to keep data safe and available. The biggest operators have proven the opposite: they can serve millions of concurrent users without downtime or data loss while using standard hardware and intelligent software.
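The durability argument behind that claim can be sketched with a back-of-the-envelope calculation. The numbers below are illustrative assumptions, not measurements: an assumed annual failure rate per commodity disk, independent failures, and 3-way replication (the default scheme in OpenStack Swift). The point is that software-driven replication, not premium hardware, is what drives the loss probability down:

```python
def annual_loss_probability(disk_afr: float, replicas: int) -> float:
    """Crude upper-bound estimate of losing an object in a year.

    Data is lost only if the disks holding every replica fail, so with
    independent failures the probability is disk_afr ** replicas. This
    deliberately ignores repair and re-replication, which make real
    systems far more durable than this simple bound suggests.
    """
    return disk_afr ** replicas


# Illustrative: a 4% annual failure rate per commodity disk, 3 replicas.
p_loss = annual_loss_probability(0.04, 3)  # 0.04 ** 3 = 6.4e-05
```

Even with inexpensive disks failing at a few percent per year, three software-managed copies push the naive annual loss probability below one in ten thousand, and active re-replication improves on that by orders of magnitude.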

The reality is that data is simply growing too fast and must be retained too long, at a cost that in many cases must be as close to zero as possible. Unstructured data is growing fastest, fueled by SaaS applications and the shift to mobile devices. Any private data center today is in direct competition with the operators of large public clouds. Internal users demand the flexibility and operating costs the big operators have proven are possible, so private operators must use the same software-defined strategy to remain competitive.
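A quick projection shows why per-terabyte cost matters so much. The growth rate here is an assumed figure for illustration, not a forecast:

```python
def years_to_reach(start_tb: float, target_tb: float, annual_growth: float) -> int:
    """Years of compound growth until a data footprint reaches a target size."""
    years = 0
    size = start_tb
    while size < target_tb:
        size *= 1 + annual_growth
        years += 1
    return years


# Illustrative: 100 TB today, growing at an assumed 40% per year,
# crosses the 1 PB (1,000 TB) mark in 7 years.
years = years_to_reach(100, 1000, 0.40)
```

At that pace the footprint grows roughly tenfold in well under a decade, which is why storage paid for at a premium per terabyte quickly becomes untenable while commodity hardware plus software-defined storage stays affordable.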

"Utility Computing" was a hyped trend at the start of this century that was before its time. Some might classify SDDC in the same category, all hype. Objectively, SDDC is a natural extension of cloud, and should prove to be equally as disruptive to the data center architecture as cloud has been. Software used for compute, networking, and storage will need to be as standard as the servers you buy today in order to fit into your data center architecture of the future.

More Stories By Joe Arnold

Joe Arnold founded SwiftStack to deploy high-scale, open-source cloud storage systems using OpenStack. He managed the first public OpenStack Swift launch independent of Rackspace, and has subsequently deployed multiple large-scale cloud storage systems. He is currently building tools to deploy and manage OpenStack Swift with his firm, SwiftStack.
