By Lori MacVittie
October 22, 2016 07:45 PM EDT
I've been reading up on APIs cause, coolness. And in particular I really enjoyed reading Best Practices for Designing a Pragmatic RESTful API because it had a lot of really good information and advice.
And then I got to the part about compressing your APIs.
Before we go too far let me first say I'm not saying you shouldn't compress your API or app responses. You probably should. What I am saying is that where you compress data and when are important considerations.
That's because generally speaking no one has put their web server (which is ultimately what tends to serve up responses, whether they're APIs or objects, XML or JSON) at the edge of the Internet. You know, where it's completely vulnerable. It's usually several devices back in the networking gauntlet that has to be run before data gets from the edge of your network to the server.
This is because there are myriad bad actors out there salivating at the prospect of a return to the early-aughts data center architecture, in which firewalls, DDoS protection, and other app security services were not sitting physically and logically upstream from the apps the way they do today.
Cause if you don't have to navigate the network, it's way easier to launch an attack on an app.
Today, we employ an average of 11 different services in the network, upstream from the app, to provide security, scale, and performance-enhancing services. Like compression.
Now, you can enable compression on the web server. It's a standard thing in HTTP and it's little more than a bit to flip in the configuration. Easy peasy performance-enhancing change, right?
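For example (an illustrative sketch only; the directives below are nginx's, and Apache or IIS have their own equivalents), that bit-flip looks something like this:
# nginx: compress API responses at the web server (illustrative config)
gzip            on;
gzip_types      application/json application/xml text/plain;
gzip_min_length 1024;   # skip payloads too small to benefit
gzip_comp_level 5;      # modest CPU cost for a smaller body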
Except that today that's not always true.
The primary reason compression improves performance is that reducing the size of the data reduces the number of packets that must be transmitted. That reduces the potential for congestion, which otherwise turns into a Catch-22 where packet loss triggers TCP retransmits, which add to congestion, which increases packet loss, which... well, you get the picture. This is particularly true when mobile clients are connecting via cellular networks, because latency is a real issue for them and the more round trips it takes, the worse the application experience.
Suffice to say that the primary reason compression improves performance is that it reduces the amount of data needing to be transmitted which means "faster" delivery to the client. Fewer packets = less time = happier users.
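If you want to see that reduction on the wire, a quick check with curl (reusing the GitHub endpoint from the article quoted below; your byte counts will vary) does the trick. Asking for gzip via the header, without telling curl to decompress, means wc -c reports the compressed size:
$ curl -s https://api.github.com/users/veesahni | wc -c
$ curl -s -H 'Accept-Encoding: gzip' https://api.github.com/users/veesahni | wc -c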
That's a good thing. Except when compression gets in the way or doesn't provide any real reduction that would improve performance.
What? How can that be, you ask.
Remember that we're looking for compression to reduce the number of packets transmitted, especially when it has to traverse a higher latency, lower capacity link between the data center and the client.
It turns out that sometimes compression doesn't really help with that.
Consider the aforementioned article and its section on compressing. The author ran some tests, and concluded that compression of text-based data produces some really awesome results:
Let's look at this with a real world example. I've pulled some data from GitHub's API, which uses pretty print by default. I'll also be doing some gzip comparisons:
$ curl https://api.github.com/users/veesahni > with-whitespace.txt
$ ruby -r json -e 'puts JSON JSON.parse(STDIN.read)' < with-whitespace.txt > without-whitespace.txt
$ gzip -c with-whitespace.txt > with-whitespace.txt.gz
$ gzip -c without-whitespace.txt > without-whitespace.txt.gz
The output files have the following sizes:
- without-whitespace.txt - 1252 bytes
- with-whitespace.txt - 1369 bytes
- without-whitespace.txt.gz - 496 bytes
- with-whitespace.txt.gz - 509 bytes
In this example, the whitespace increased the output size by 8.5% when gzip is not in play and 2.6% when gzip is in play. On the other hand, the act of gzipping in itself provided over 60% in bandwidth savings. Since the cost of pretty printing is relatively small, it's best to pretty print by default and ensure gzip compression is supported!
To further hammer in this point, Twitter found that there was an 80% savings (in some cases) when enabling gzip compression on their Streaming API. Stack Exchange went as far as to never return a response that's not compressed!
Wow! I mean, from a purely mathematical perspective, that's some awesome results. And the author is correct in saying it will provide bandwidth savings.
What those results won't necessarily do is improve performance, because the original size of the file was already less than the MSS for a single packet. Which means compressed or not, that data takes exactly one packet to transmit. That's it. I won't bore you with the mathematics, but the speed of light and networking says one packet takes the same amount of time to transit whether it's got 496 bytes of payload or 1369 bytes of payload. The typical MSS for Ethernet packets is 1460 bytes, which means compressing something smaller than that effectively nets you nothing in terms of performance. It's like a plane: it takes as long to fly from point A to point B whether there are 14 passengers or 140. Fuel efficiency (bandwidth) is impacted, but that doesn't really change performance, just the cost.
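To put rough numbers on it (a back-of-the-envelope sketch using the sizes from the quoted example and a 1460-byte MSS; the +1459 is just ceiling division in shell arithmetic):
$ echo $(( (1369 + 1459) / 1460 ))   # uncompressed pretty-printed JSON: 1 packet
1
$ echo $(( (496 + 1459) / 1460 ))    # gzipped JSON: still 1 packet
1
Compression only starts saving round trips once the uncompressed response would have spilled past that 1460-byte boundary into a second packet.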
Furthermore, compressing the payload at the web server means that web app security services upstream have to decompress it if they want to do their job, which is to say scan responses for sensitive or excessive data indicative of a breach of security policies. This is a big deal, kids. 42% of respondents in our annual State of Application Delivery survey always scan responses as part of their overall security strategy to prevent data leaks. Which means they have to spend extra time to decompress the data to evaluate it and then recompress it, or perhaps they can't inspect it at all.
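As a trivial illustration of that extra step (reusing the files from the quoted example; a real inspection device does far more than grep, of course), the compressed body is opaque until something inflates it:
# scanning the gzipped stream directly finds nothing readable...
$ grep -c 'email' with-whitespace.txt.gz
# ...it has to be inflated before a content scan means anything
$ gunzip -c with-whitespace.txt.gz | grep -c 'email'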
Now, that said, bandwidth savings are a good thing. It's part of any comprehensive scaling strategy to consider the impact of increasing use of an app on bandwidth. And a clogged up network can impact performance negatively so compression is a good idea. But not necessarily at the web server. This is akin to carefully considering where you enforce SSL/TLS security measures, as there are similar impacts on security services upstream from the app / web server.
That's why the right place for compression and SSL/TLS is generally upstream, in the network, after security has checked out the response and it's actually ready to be delivered to the client. That's usually the load balancing service or the ADC, where compression can be applied most efficiently, without interfering with security services or offsetting the potential gains by forcing extra processing upstream.
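Concretely (a minimal sketch, with nginx standing in for the load balancing tier and made-up upstream addresses), the app servers hand back plain responses, security services can inspect them in the clear, and compression happens on the way out the door:
# illustrative only: compress at the proxy/ADC tier, not on the app servers
upstream api_servers {
    server 10.0.0.11:8080;   # hypothetical app servers returning uncompressed JSON
    server 10.0.0.12:8080;
}

server {
    listen 80;
    gzip       on;                  # compression applied here, post-inspection
    gzip_types application/json;

    location /api/ {
        proxy_pass http://api_servers;
    }
}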
As with rate limiting APIs, it's not always a matter of whether or not you should, it's a matter of where you should.
Architecture, not algorithms, is the key to scale and performance of modern applications.