Abstracting vs Ignoring

The primary goal of abstraction is to break something down into its most essential elements

There’s a running meme in networking that has gained a lot of momentum with SDN: abstraction. Just about every vendor these days has bold ambitions to abstract away the complexity. There are APIs, abstraction layers, and architectural shims that all aim to hide complexity from the user. But what does that really mean?

The primary goal of abstraction is to break something down into its most essential elements. You then expose only those parts that require end-user participation (think: configuration knobs, for example). Ideally, a small set of relatively well-understood parameters can then move the mass of hulking machinery underneath. The complexity? It is hidden, cleverly abstracted away from the user.
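To make that concrete, here is a minimal, hypothetical Python sketch (the class and method names are illustrative, not any real product's API): a single user-facing knob, the desired bandwidth between two segments, fans out into the many device-level settings hidden beneath it.

```python
# Hypothetical sketch: one user-facing "knob" hides many device-level settings.
# Names (Fabric, Device, _program_device) are illustrative, not a real product API.

class Device:
    def __init__(self, name):
        self.name = name

    def apply(self, settings):
        # Stand-in for pushing configuration to a real switch or router.
        print(f"{self.name}: {settings}")


class Fabric:
    def __init__(self, devices):
        self.devices = devices  # device handles the layer manages internally

    def connect(self, segment_a, segment_b, bandwidth_mbps):
        """The only parameter the user touches: desired bandwidth between two segments."""
        for device in self.devices:
            self._program_device(device, segment_a, segment_b, bandwidth_mbps)

    def _program_device(self, device, a, b, bandwidth_mbps):
        # Hidden complexity: a real controller would derive dozens of per-device
        # changes (VLANs, QoS queues, ACL entries, routing policy) from one intent.
        device.apply({
            "vlan": hash((a, b)) % 4000,
            "qos_queue": "gold" if bandwidth_mbps >= 1000 else "silver",
            "acl": f"permit {a} -> {b}",
        })


# The user sees one knob; the fabric turns it into many.
fabric = Fabric([Device("leaf1"), Device("leaf2")])
fabric.connect("web-tier", "db-tier", bandwidth_mbps=1000)
```

The point of the sketch is not the particular settings but the shape of the interface: the well-understood parameter sits on top, and the hulking machinery stays underneath.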

This leads to an interesting question: which bits need to be exposed to the end user?

The question seems innocuous enough. But the answer depends an awful lot on your context.

Networking has long been a management-through-precision proposition. That is to say, network behavior is determined by correctly setting a great many configuration knobs. The problem is at its worst when specifying edge policy. So acute is our collective need for control that we have built entire businesses around merely demonstrating proficiency in the knobs themselves.

The subtle point about abstraction is that by reducing networking to only its most critical characteristics, you are actually taking away some of the power from a user base that has defined its careers by knowledge of those details. At the other extreme, you have an entire class of companies for whom the network is just not that interesting. At the limit, these companies simply want the network to work. That this currently requires an army of specialists with total command over thousands of widgets is more necessary evil than desired outcome.

So what do we do?

In some ways, we need to fight our own instincts. Our need for absolute control is our own doing. We have taken a largely incremental approach to networking for more than a couple of decades now. We have layered functionality on top of functionality, using configuration to incrementally enable each piece. The thought of things working across the whole of the network is a scary proposition, so we have grown accustomed to piecemeal deployments. And change has become so difficult that you don’t dare do anything unless you absolutely have to.

But despite all of this, where are we headed now?

We want to code our way out of the problem. We want to add another complex system on top of an already complex—and crumbling—system. Consider that SDN is largely an organic reaction to the failure of vendors to make networks that are manageable. Building another management layer on top of the existing failed infrastructure might hide the complexity, but it doesn’t remove the source of the underlying infrastructure rot. Without cutting away that rot, we are really just punting the problem into the future.

None of this is meant to suggest that APIs and programmability are not important. For a certain class of user, they are extremely powerful. But not every user has or even wants the kind of sophistication that goes with this type of approach. And fewer still have a solid enough foundation from which to build.

How do I know? We celebrate programmability while swapping stories of companies relying on screen scraping. How is that latter class of networking professionals going to make the leap?

The answer is actually simple: they won’t.
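The gap is easier to see side by side. Here is a hypothetical sketch, in Python, of the two worlds the post contrasts: a regex scraping CLI output versus the same fact delivered as structured data. Neither the CLI format nor the JSON shape is taken from any real vendor.

```python
# Hypothetical sketch of the gap: screen scraping CLI output versus a structured API.

import json
import re

cli_output = """\
GigabitEthernet0/1 is up, line protocol is up
  5 minute input rate 2000 bits/sec, 4 packets/sec
"""

# Screen scraping: brittle, and it breaks the moment the output format changes.
match = re.search(r"input rate (\d+) bits/sec", cli_output)
if match:
    print("scraped input rate:", int(match.group(1)))

# Programmability: the same fact as structured data, no parsing heroics required.
api_response = json.loads('{"interface": "GigabitEthernet0/1", "input_bps": 2000}')
print("api input rate:", api_response["input_bps"])
```

Moving from the first style to the second is not a configuration change; it is a different operating model, which is exactly why so few shops make the leap.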

This creates an interesting market dynamic. The number of companies who are actively pushing the envelope is relatively small. Sure, there are lots of people who are constantly evaluating new technologies, but if you measure actual deployments, the business tilts heavily to the less sexy, less risky networks of yesteryear. For every SDN trial that nets a couple of devices in some deal and generates a snazzy press release, there is a pile of uncontested deals that get closed with the incumbent vendors.

Why? Because things barely work the way they are, and adding more capability on top of the existing stack isn’t what most people actually need.

There is an enormous opportunity for companies that provide a solution to the underlying infrastructure rot. The billions of dollars spent on networks that users wish they could just ignore are mostly in play. The vendors pursuing those dollars need to keep their eye on the ball. The shiny object that is abstraction and programmability will undoubtedly be important, but pursuing it at the expense of fixing the underlying network is chasing an opportunity that might very well remain forever just on the horizon. Over time, the obvious end game is that both things need to happen: abstraction and underlying correction are both required.

While it might be the CTOs and CIOs at the largest enterprises and carriers that drive overarching requirements, the middle and lower tiers of the market consume an awful lot. Interestingly enough, that business is somehow lost in all the noise. To make a difference here, networking doesn’t need to be more abstract; it just needs to be a whole lot more instinctive.

[Today’s fun fact: 100% of all lottery winners gain weight. Seriously, how does anyone know this is a fact?]

More Stories By Michael Bushong

The best marketing efforts combine deep technology understanding with a highly approachable means of communicating. Plexxi's Vice President of Marketing Michael Bushong acquired these skills over 12 years at Juniper Networks, where he led product management, product strategy, and product marketing organizations for Juniper's flagship operating system, Junos. Michael spent his last several years at Juniper leading their SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase and ASIC design tool companies Synopsys and Magma Design Automation. Michael's undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
