

The SDN Chicken and the Standardization Egg

Which came first?

Owing to SDN still being very young and not widely adopted, pundits continue to put forth treatise upon treatise as to why organizations should be falling over themselves to go out and get themselves some SDN. Now. Not tomorrow, today.

One of the compelling reasons to adopt is to address a variety of operational issues that arise from complexity and differentiation in network elements. The way SDN addresses this - as do a variety of other options - is standardization. Note this is standardization with a little "s", not the capital "S" that might imply industry-agreed-upon-and-given-an-RFC standardization.

The claim is simple: organizations have a whole lot of standalone appliances in their networks serving a variety of purposes - load balancing, firewalls, IDS/IPS, application acceleration, and even identity and access control. All these individual appliances are (generally) from different vendors, each requiring its own management console, its own policy format, its own... everything.

The argument goes that if you take all those services and deploy them as services on a (common) network operating system, you'll reduce complexity and all the associated management (operational) overhead. Rainbows will sprout from the network and unicorns will dance merrily from node to node while sprinkling happy dust on your engineers and operators.

Okay, maybe the rainbows and unicorns aren't part of the promise, but the nirvana implied by this consolidation on a standardized (common) network operating system is.

Now, I'm not saying this isn't true. It could be*. You can get standardization (theoretically, at least) from SDN at the upper layers of the stack, but it isn't SDN itself that elevates the data center to a level closer to operational nirvana, it's the standardization on a common platform. SDN is (theoretically) just one path to achieve that standardization.

The Layer 4-7 Landscape

The reason achieving this is more difficult than it sounds is that there are far more functions at layer 4-7 that need to be "standardized". Organizations do, in fact, end up with a hodge-podge of appliances deployed to support all the various layer 4-7 services required. Let me illustrate with a diagram (which is itself incomplete; there are a whole lot of services that fall into the layer 4-7 category):

[Diagram: the difference]

Are organizations struggling with the proliferation of devices at layer 4-7? Absolutely. I've heard stories of folks with 20 or more different vendors operating at layer 4-7 in their network. Not kidding. And every day, every new challenge brings another new solution to the table. Mobility and cloud - not to mention security - are driving the identification of new problems and challenges, which in turn drives the proliferation of new solutions that often (though not always) come in the form of a nice, neat box to deploy in the network.

The way in which this proliferation (and its associated overhead) can be effectively addressed is through standardization. That doesn't require SDN, though SDN may be one way in which such standardization is achieved. Standardization - or, more to the point, the consolidation of application (that's layer 4-7) network services onto a common platform - is what brings that benefit.
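
To make that a bit more concrete, here is a minimal sketch in Python of what "one platform, one policy format" could look like. The class and method names are entirely hypothetical - this is not any vendor's actual API - but it shows the shape of the argument: a single application definition bundles its load balancing, WAF, and access control policies and is deployed through one interface, instead of one console per appliance.

# Hypothetical sketch only: a made-up "common platform" API illustrating what
# consolidating layer 4-7 services under one policy format might look like.
# None of these class or method names come from a real product.

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ServicePolicy:
    """One policy object per L4-7 service, all sharing the same schema."""
    service: str                      # e.g. "load-balancer", "waf", "access-control"
    settings: Dict[str, object] = field(default_factory=dict)


@dataclass
class ApplicationService:
    """A single application definition bundling every L4-7 service it needs."""
    name: str
    virtual_ip: str
    policies: List[ServicePolicy] = field(default_factory=list)


class CommonPlatform:
    """Stand-in for the 'common network operating system' described above."""

    def __init__(self) -> None:
        self._deployed: Dict[str, ApplicationService] = {}

    def deploy(self, app: ApplicationService) -> None:
        # One call, one console, one policy format -- instead of touching
        # a separate appliance (and UI) per service.
        self._deployed[app.name] = app
        for policy in app.policies:
            print(f"[{app.name}] configuring {policy.service}: {policy.settings}")


if __name__ == "__main__":
    web_app = ApplicationService(
        name="storefront",
        virtual_ip="203.0.113.10",
        policies=[
            ServicePolicy("load-balancer", {"algorithm": "least-connections",
                                            "pool": ["10.0.0.11", "10.0.0.12"]}),
            ServicePolicy("waf", {"ruleset": "owasp-core", "mode": "block"}),
            ServicePolicy("access-control", {"idp": "corp-sso", "require_mfa": True}),
        ],
    )
    CommonPlatform().deploy(web_app)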

* It could be if we ever figure out how to deploy said services in such an architecture without forcing the controller to be part of the data path.
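
To illustrate that caveat: the concern is architectural. If every packet has to transit (or consult) the controller, you've just traded appliance sprawl for a new bottleneck. The sketch below - again purely conceptual, with hypothetical names throughout - shows the alternative: the controller programs layer 4-7 policy onto data-plane service nodes ahead of time, and per-packet processing never calls back into the controller.

# Conceptual sketch only (hypothetical names throughout): the controller
# programs data-plane service nodes at configuration time; per-packet
# processing never calls back into the controller, keeping it out of the
# data path.

from typing import Callable, Dict, List

Packet = Dict[str, object]          # toy packet representation
PacketHandler = Callable[[Packet], Packet]


class ServiceNode:
    """A data-plane element (e.g. a load balancer or firewall instance)."""

    def __init__(self, name: str) -> None:
        self.name = name
        self._handlers: List[PacketHandler] = []

    def install(self, handler: PacketHandler) -> None:
        # Invoked by the controller at configuration time, not per packet.
        self._handlers.append(handler)

    def process(self, packet: Packet) -> Packet:
        # Pure data-path work: only pre-installed handlers run here.
        for handler in self._handlers:
            packet = handler(packet)
        return packet


class Controller:
    """Control plane: computes and pushes policy, then gets out of the way."""

    def program(self, node: ServiceNode, handlers: List[PacketHandler]) -> None:
        for handler in handlers:
            node.install(handler)


def drop_blocked_ports(packet: Packet) -> Packet:
    packet["allowed"] = packet.get("dst_port") in {80, 443}
    return packet


if __name__ == "__main__":
    edge = ServiceNode("edge-firewall")
    Controller().program(edge, [drop_blocked_ports])

    # Traffic flows through the node directly; the controller is not consulted.
    print(edge.process({"dst_port": 443}))   # {'dst_port': 443, 'allowed': True}
    print(edge.process({"dst_port": 23}))    # {'dst_port': 23, 'allowed': False}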

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
