Clash of Ops | @DevOpsSummit #BigData #APM #DevOps #Docker #Monitoring

DevOps and netops speak different languages that use the same word to mean different things

It was a Monday. I was reading the Internet. Okay, I was skimming feeds. Anyway, I happened across a title that intrigued me, "Stateful Apps and Containers: Squaring the Circle." It had all the right buzzwords (containers) and mentioned state, a topic near and dear to this application networking-oriented gal, so I happily clicked on through.

Turns out that Stateful Apps are not Stateful Apps. Seriously.

To be fair, I should really say that when a devops guy talks about 'stateful apps' it is not the same thing as when a netops gal uses the term 'stateful apps.' That's because the devops guy is referring to persistent data storage for applications: file systems, databases, and the like. When a netops gal talks about stateful apps, she's talking about the unique characteristics that identify an existing TCP connection between two systems, like a client and an app. Devops thinks in terms of app data; netops thinks in terms of network data.
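To make the contrast concrete, here's a minimal, purely illustrative sketch (the addresses and the file path are made up): the netops notion of state is the tuple that identifies a live TCP connection, while the devops notion is data the app persists across requests.

```python
# Illustrative only: two different things "stateful" can mean.
from collections import namedtuple

# netops "state": the tuple identifying one live TCP connection between
# a client and an app instance; lose it and the connection is gone.
Connection = namedtuple("Connection", "src_ip src_port dst_ip dst_port protocol")
conn = Connection("203.0.113.7", 51234, "198.51.100.10", 443, "tcp")
print(conn)

# devops "state": data the application must persist across requests,
# e.g. a record written to durable storage (path is hypothetical).
with open("/tmp/orders.log", "a") as store:
    store.write("order 1001 confirmed\n")
```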

Devops and netops speak different languages that use the same word to mean different things. It's like English. No big deal.

This may seem like a minor issue to worry about. But then I got to thinking about emerging application architectures like microservices, the dominance of APIs, and the urgency with which everyone is moving to secure HTTP traffic. And I realized that it actually is a pretty big deal, because it's a clash of ops. While devops is over there building stateless architectures based on the newest theories and principles of scalability, we're requiring security that basically negates many of the benefits we might have seen.

That's because the nature of public key cryptography requires state in the network.

Here Comes the (Computer) Science
Public key infrastructure (PKI) is based on a fairly simple premise: the two endpoints (client and app) exchange information that is unique to that connection. That means any subsequent exchanges have to take place between the same two endpoints that established the connection.

That's stateful networking.

Even if your entire architecture is based on stateless microservices, once you add security (SSL/TLS), it's stateful. Whamo! Just like that. And that impacts scale. Because now figuring out how best to distribute traffic isn't just a question of how loaded any given instance of that app might be; the client has to keep coming back to the instance that holds its connection state.
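A tiny sketch of what that does to traffic distribution (the instance names and the hash-on-client-IP choice are mine, for illustration): once an instance holds per-connection state, you can no longer hand the next request to just any instance.

```python
# Illustrative contrast: a stateless pick vs. a "sticky" pick that must keep
# returning the instance already holding a client's connection/session state.
import hashlib
import itertools

INSTANCES = ["app-1", "app-2", "app-3"]   # hypothetical service instances
_round_robin = itertools.cycle(INSTANCES)

def pick_stateless() -> str:
    # Nothing about the client lives on the server, so any instance will do.
    return next(_round_robin)

def pick_sticky(client_ip: str) -> str:
    # Per-connection state (e.g., a negotiated TLS session) lives on one
    # instance, so the same client has to keep landing there.
    digest = hashlib.sha256(client_ip.encode()).digest()
    return INSTANCES[int.from_bytes(digest[:4], "big") % len(INSTANCES)]

if __name__ == "__main__":
    print("stateless:", [pick_stateless() for _ in range(3)])
    print("sticky   :", pick_sticky("203.0.113.7"), pick_sticky("203.0.113.7"))
```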

And you probably don't want to be renegotiating secure sessions for every, single, interaction. You don't. I don't care how much faster HTTP/2 is, or how much better ECC is than previous generations of cryptography (spoiler: quite a bit better), there is still significant latency introduced by the process of negotiating that connection. There's the overhead of establishing the underlying TCP session and then the security negotiations. That adds latency thanks to all those round trips back and forth, which means slower application response times. Especially on mobile devices.
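If you want to see that overhead for yourself, a rough timing sketch like the one below makes the point; the hostname is a placeholder, and the absolute numbers will vary with RTT, TLS version, and cipher.

```python
# Rough illustration: time a bare TCP connect vs. TCP connect plus TLS handshake.
import socket
import ssl
import time

HOST, PORT = "example.com", 443   # placeholder endpoint

def tcp_connect_ms() -> float:
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000

def tls_handshake_ms() -> float:
    ctx = ssl.create_default_context()
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        # wrap_socket completes the TLS handshake before returning
        with ctx.wrap_socket(sock, server_hostname=HOST):
            pass
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    print(f"TCP connect only : {tcp_connect_ms():.1f} ms")
    print(f"TCP + TLS setup  : {tls_handshake_ms():.1f} ms")
```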

So what, you might say. It's measured in milliseconds; that can't possibly impact the application experience.

But it does. Milliseconds matter, especially today, when digital natives who've never experienced what 2800bps feels like want their apps to respond instantaneously, with LAN-like performance.

What that means is that adding that layer of security (which is - or should be - a requirement) effectively turns your elegant, stateless architecture into a stateful one.

This is why architecture matters. It's no longer a matter of throwing a load balancer in front of those services and picking an algorithm; it's about extending the app architecture upstream into the network and understanding the advantages of terminating that security before it drags all that "state" into your "stateless" architecture. If the load balancer (or ADC, if you prefer) is terminating SSL/TLS, then it's the one managing the negotiation and the back and forth with clients. That leaves it free (if it's a modern proxy-based solution) to interact with services in the back end the way dev intended: statelessly.
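As a sketch of that pattern (not any particular product's behavior; the certificate paths, backend addresses, and single-read request handling are simplifications I've assumed): the proxy below terminates TLS itself and then speaks plain HTTP to whichever stateless backend it picks.

```python
# Minimal, illustrative TLS-terminating proxy: client-facing TLS state stays
# here; backends are picked round robin and spoken to in plaintext.
import itertools
import socket
import ssl
import threading

BACKENDS = [("10.0.0.11", 8080), ("10.0.0.12", 8080)]   # hypothetical stateless app instances
_backends = itertools.cycle(BACKENDS)

def handle(client: ssl.SSLSocket) -> None:
    backend_addr = next(_backends)                       # no client pinning needed
    with client, socket.create_connection(backend_addr) as backend:
        request = client.recv(65536)                     # naive single read; real proxies stream
        backend.sendall(request)
        while chunk := backend.recv(65536):
            client.sendall(chunk)

def serve(listen=("0.0.0.0", 8443), certfile="proxy.crt", keyfile="proxy.key"):
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile, keyfile)               # the TLS "state" lives at the proxy
    with socket.create_server(listen) as listener:
        while True:
            raw, _addr = listener.accept()
            try:
                tls = ctx.wrap_socket(raw, server_side=True)
            except ssl.SSLError:
                raw.close()
                continue
            threading.Thread(target=handle, args=(tls,), daemon=True).start()

if __name__ == "__main__":
    serve()
```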


The thing to be aware of is that when app architectures and network architectures meet, they can often clash and effectively negate all the goodness intended by the new app architecture in the first place. DevOps is as much about communication between groups as it is about automating the processes between them. That means understanding the impact of the network on apps, and vice versa, and agreeing on an architecture that preserves the best characteristics of the app architecture without sacrificing network speed or security.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
