
HTTPS Is Not Faster Than HTTP | @DevOpsSummit #WebPerf #DataCenter

I have read no fewer than two contrived comparisons of "HTTPS" and "HTTP" in the last two weeks

Yes, Lori has been reading the Internet again. And what she's been seeing makes baby Lori angry. It also makes this former test designer and technology editor cry. Really, I weep at both the excuses offered for such testing and the misleading headlines.

I have read no fewer than two contrived comparisons of "HTTPS" and "HTTP" in the last two weeks purporting to demonstrate that secure HTTP is inarguably faster than its plaintext counterpart, HTTP.

Oh, if only that were true.

See, the trick is that both comparisons (and no doubt many more will follow) are comparing secure HTTP/2 with insecure HTTP/1.1. From the aforementioned comparison: "Plaintext HTTP/1.1 is compared against encrypted HTTP/2 HTTPS".

As we are all already aware, HTTP/2 itself is faster (by design) than HTTP/1.1 for a variety of reasons that have absolutely nothing to do with security. Multiplexing, header compression, and binary framing all combine to produce a faster and more efficient protocol, period. While it's likely that layering security (TLS or SSL) atop HTTP/2 will cause a slight degradation in performance (because math says it will), it's not enough to drive performance down to the levels we're used to seeing with HTTP/1.1, even unsecured.
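The gap between the two protocol versions can be sketched with a toy latency model. This is purely illustrative: the round-trip time, resource count, and six-connection browser limit below are assumptions picked for the sake of the arithmetic, not measurements.

```python
# Toy latency model (illustrative assumptions, not a benchmark):
# HTTP/1.1 opens up to 6 parallel connections per host and serializes
# requests on each; HTTP/2 multiplexes every request over one connection.
import math

RTT = 0.05          # assumed round-trip time per request, in seconds
RESOURCES = 30      # assumed number of objects on the page
H1_CONNECTIONS = 6  # typical browser per-host connection limit

def http1_time(resources=RESOURCES, rtt=RTT, conns=H1_CONNECTIONS):
    # Each connection handles its share of requests one after another,
    # so the page needs ceil(resources / conns) sequential round trips.
    return math.ceil(resources / conns) * rtt

def http2_time(resources=RESOURCES, rtt=RTT):
    # All requests are in flight at once on one multiplexed connection,
    # so (in this idealized model) the page costs a single round trip.
    return rtt

print(http1_time())  # five sequential rounds of 0.05 s
print(http2_time())  # one round trip for everything
```

Under these assumptions HTTP/2 wins by a factor of five before security even enters the picture — which is exactly why comparing it against HTTP/1.1 tells you nothing about the cost of TLS.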

Unfortunately, these results are being touted as inarguable proof that HTTPS is faster than HTTP. Which is simply not true. The argument against testing HTTP/2 secure against HTTP/2 plaintext is that browsers refuse to support HTTP/2 without security, and thus there is no way to perform such a test. So a test was contrived to pretend to illustrate the differences, but in fact it does nothing of the kind.

It's true that comparing secure HTTP/2 with insecure HTTP/2 would be difficult, if not impossible. While HTTP/2 backed off its requirement for secure-only connections and allows plaintext, all the major browsers refused to support plaintext and have thus far only provided support for HTTP/2 over TLS/SSL. Even popular command-line tools like curl refuse insecure HTTP/2 connections by default. Which winds up making HTTPS the de facto standard, even though the specification doesn't require it. But that doesn't mean you can go ahead and compare the two and then make absolutely ridiculous claims based on that test — claims that are disproven with simple mathematics.
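You can see the de facto standard in action by asking a server which protocol it will actually speak. Here is a minimal sketch using Python's standard `ssl` module and ALPN negotiation (the hostname in the usage comment is a placeholder, and running it requires network access):

```python
import socket
import ssl

def negotiated_protocol(host, port=443, timeout=5.0):
    """Return the application protocol the server selects via ALPN."""
    ctx = ssl.create_default_context()
    # Offer HTTP/2 first, then fall back to HTTP/1.1.
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()

# Usage (needs network): negotiated_protocol("example.com")
# Note that "h2" is only ever negotiated inside a TLS handshake --
# there is no knob here for requesting plaintext HTTP/2.
```

The design point is the article's point: the negotiation mechanism itself lives inside TLS, which is why plaintext HTTP/2 is so hard to exercise from a browser-like client.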

See, let's pretend that a web page transferred via plaintext HTTP/2 took exactly 1.2 seconds to load. Now let's add TLS. The addition of TLS (or SSL, for that matter) means there is more processing going on, specifically encryption and decryption of the data. Even if that takes only 0.3 seconds, it still means that HTTPS is a teensy bit slower than HTTP. Period. Math says so, and math is pure. It has no agenda, it doesn't care about the results, it simply says "here it is."

And math says if you do X and then add on Y, you get Z, and Z will always be greater than X alone (as long as Y is positive).
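Spelled out with the made-up numbers from the thought experiment above:

```python
# Assumed numbers from the thought experiment; the exact values
# don't matter, only that the TLS overhead is positive.
plaintext_http2 = 1.2   # X: seconds to load the page over cleartext HTTP/2
tls_overhead = 0.3      # Y: extra encrypt/decrypt cost, any value > 0
https_http2 = plaintext_http2 + tls_overhead  # Z = X + Y

# Z > X whenever Y > 0, so HTTPS over HTTP/2 is slower than plaintext
# HTTP/2 -- even though both can still handily beat HTTP/1.1.
assert https_http2 > plaintext_http2
```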

I understand the desire to push folks toward HTTP/2. It's faster, and it's the first real "upgrade" we've had to HTTP in a very long time. But migration takes time, especially when it requires a lot of upgrades and changes to infrastructure that will cause disruption as everyone from app dev to ops to netops to security has to drop what they're doing and test, deploy, and test again. And that doesn't account for the changes needed in apps that have long been built around HTTP/1.1 and its protocol specification. HTTP/2 changes everything, and its impact spans the entire data center. While gateways mitigate the inherent difficulty and disruption of migration, not everyone necessarily sees a driving need to hop on the HTTP/2 bandwagon.

The boost in performance organizations will see simply means HTTP/2 performs as its designers intended, with increased speed and efficiency. It means organizations should be planning on the app and network infrastructure upgrades necessary to migrate to support the new standard, whether that's through HTTP gateways or not. It doesn't mean that HTTPS is faster than HTTP.

Making demonstrably false claims to craft clickbait headlines about allegedly superior performance is simply unacceptable. Yes, you will almost certainly see a boost in performance if you're moving from HTTP/1.1 to HTTP/2, even with forced security. But that does not, in any world where logic and math exist, mean that HTTPS is faster than HTTP. If you want to help organizations, help them understand how to smoothly transition from the old to the new. Provide meaningful data for them to build a business case that enables them to upgrade to the latest and greatest. Provide them the means to show that the investment in moving from HTTP/1.x to HTTP/2 will pay off in the long run.

Offer guidelines and best practices, not punchy headlines and a buried lede.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.


