Web Performance 101: HTTP Headers Pt. 2

In a previous blog post, we reviewed some essential HTTP headers and how they are implemented. Headers can affect page performance in multiple ways, but there are three major factors that can positively or negatively affect the digital experience:

  • Caching
  • Compression
  • Security

This article discusses how headers can help optimize website performance. We will focus on headers that help implement caching, compression, and security efficiently, and on how they impact the overall digital experience.

Caching Headers

Caching is an important webpage optimization technique that can greatly impact performance. When a resource is cached, a copy of it is saved on a cache server or locally on the client. Caching can greatly reduce network latency, making the page render faster. There are two types of caching techniques in use today:

  • Shared proxy caching – the cached resources are stored in a shared network location. These resources are reused multiple times by different users, which cuts down the number of round trips to the origin server, reducing network congestion and latency.
  • Local caching – the cache is stored locally by the browser. The browser caches the page content which greatly reduces the page load time, as the browser does not have to make repeated requests to the web server for a resource it has already downloaded and cached.


Let’s look at the HTTP headers that help you control caching behavior:

Cache-control

The Cache-Control header is used to define the caching policy for the session. There are several directives that can be set on HTTP requests and responses; these directives indicate how the client and server handle caching. A brief server-side sketch follows the list below.

  1. To specify that a response should not be stored in any cache, use one of the following options:
Cache-Control: no-store
Cache-Control: no-cache, no-store, must-revalidate
  2. To require validation with the origin server before a cached copy is reused, use the option:
Cache-Control: no-cache
  3. To specify the type of cache allowed, that is, whether the resource can be stored in any shared cache or only in the browser's private cache, set one of the following options:
Cache-Control: private
Cache-Control: public
  4. To specify how long (in seconds) a resource is considered fresh before it must be refreshed from the origin server, set the max-age directive:
Cache-Control: max-age=31536000
  5. To require that a stale cached resource be revalidated with the origin server before it is served again, use the following directive:
Cache-Control: must-revalidate
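
To make this concrete, here is a minimal sketch of how these directives might be set from a web application. It assumes a Python/Flask server; the routes, asset names, and values are illustrative rather than prescriptive.

# Minimal sketch (assumes Flask is installed; routes and values are illustrative).
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/assets/app.js")
def versioned_asset():
    # Fingerprinted static asset: safe for any cache to store for a year
    resp = app.make_response("console.log('hello');")
    resp.mimetype = "application/javascript"
    resp.headers["Cache-Control"] = "public, max-age=31536000"
    return resp

@app.route("/api/profile")
def profile():
    # Per-user data: keep it out of shared caches and revalidate before reuse
    resp = jsonify(user="example")
    resp.headers["Cache-Control"] = "private, no-cache, must-revalidate"
    return resp

if __name__ == "__main__":
    app.run()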

Pragma

Pragma is a legacy header that serves roughly the same purpose as the Cache-Control header introduced in HTTP/1.1; but unlike Cache-Control, the Pragma header is only defined for HTTP requests and is not used in responses. It is still in use to ensure backward compatibility with HTTP/1.0 caches and clients that do not support HTTP/1.1.
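
For illustration, a cache-bypassing request that carries both the legacy Pragma directive and its Cache-Control equivalent might look like this (a minimal sketch using the Python requests library; the URL is a placeholder):

import requests

# Ask intermediaries and the origin not to serve this from cache:
# Pragma for legacy HTTP/1.0 caches, Cache-Control for HTTP/1.1 and later.
resp = requests.get(
    "https://www.example.org/report",
    headers={"Pragma": "no-cache", "Cache-Control": "no-cache"},
)
print(resp.status_code)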

Implementing the relevant HTTP caching headers can help reduce browser overhead and speed up page rendering. This can have a positive impact on the user’s digital experience; you can read about the major impact caching has in this blog post.

Compression Headers

Data compression makes data transfers more efficient, freeing up bandwidth and speeding up page loads. Compression algorithms (like gzip and Brotli) can cut down file size by almost 70%, which has a huge impact on the performance of modern, content-heavy websites.

HTTP headers can be used to enable compression and configure the accepted compression formats. There are two main categories of headers, depending on the type of data compression to be configured:

  1. End-to-end headers: This category of HTTP headers is used to set up end-to-end compression, which means the message body is compressed by the server and remains compressed until it reaches the client. These headers must be transmitted from the server to the client without modification by intermediaries.

When the browser makes an HTTP request, it sends an Accept-Encoding header that specifies the compression algorithms it supports.

Accept-Encoding: gzip
or
Accept-Encoding: gzip, compress, br

The server then sends the requested content after compression with the response header Content-Encoding. This header indicates the algorithm selected by the server from the list specified by the browser.

Content-Encoding: gzip

In addition to these headers, the response can also include a Vary header naming the request header that was used to decide the compression type, so that caches store and serve the correct variant of the resource on subsequent requests (see the client-side sketch after this list).

Vary: Accept-Encoding


  2. Hop-by-hop headers: Hop-by-hop headers apply to a single connection; they are not retransmitted by proxies and are not cached. The resource can be compressed (or left uncompressed) independently on each hop between the server and the client. This is usually negotiated using the TE request header and the Transfer-Encoding response header.
TE: compress, gzip, deflate
or
Transfer-Encoding: gzip, compress
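
To see the end-to-end negotiation in practice, the exchange can be inspected from the client side. The following is a minimal sketch using the Python requests library; the URL is a placeholder, and requests decompresses the response body transparently, so only the headers are inspected here.

import requests

# Offer the server a list of acceptable algorithms and inspect what it chose.
resp = requests.get(
    "https://www.example.org/",
    headers={"Accept-Encoding": "gzip, br"},
)

print(resp.request.headers["Accept-Encoding"])  # what the client offered
print(resp.headers.get("Content-Encoding"))     # what the server chose, e.g. "gzip"
print(resp.headers.get("Vary"))                 # e.g. "Accept-Encoding"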

It is highly recommended to take advantage of data compression when optimizing your website. Reducing file size without compromising data quality involves complex and advanced algorithms; take an in-depth look at how this works in this detailed blog post about compression techniques.

Security Headers

HTTP security headers can be used to define the security policies that must be followed during the HTTP session; they instruct the client on how to handle the content it has requested. These headers help mitigate common security vulnerabilities and provide protection against data injection and cross-site scripting (XSS) attacks. Let us look at some of the headers in this category.

Content-Security-Policy

The Content-Security-Policy header is used to specify the types of resources and scripts the user agent is allowed to load. This means the browser can access only content that has been approved by the website administrator. For example, we can specify a list of sources from which scripts may be downloaded.

Content-Security-Policy: script-src https://www.google-analytics.com
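
As a minimal sketch, the policy above could be attached to every response from a Python/Flask application (the allowed script source mirrors the example; everything else is illustrative):

from flask import Flask

app = Flask(__name__)

@app.after_request
def add_csp(response):
    # Allow scripts only from our own origin and the approved analytics host
    response.headers["Content-Security-Policy"] = (
        "script-src 'self' https://www.google-analytics.com"
    )
    return response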

Public-Key-Pins

Man-in-the-middle (MitM) attacks can be mitigated by configuring the Public-Key-Pins header. It pins hashes of the web server’s public keys, so the browser rejects certificates that do not match the pinned keys, minimizing the chance of a MitM attack using forged certificates. (This header has since been deprecated and is no longer supported by major browsers.)

Public-Key-Pins:
  pin-sha256="cUPcluytzOJAhhneDttWpY3oBAkE3h2+soZS7sWs=";
  pin-sha256="M8HztCzM3elUkjdekjkfdppqweHkmjAHKhpGPWE=";
  max-age=5184000; includeSubDomains;
  report-uri="https://www.example.org/"

Strict-Transport-Security

The Strict-Transport-Security header forces the browser to access the web server through HTTPS alone. Once the policy is set, the browser automatically upgrades all HTTP requests for the site to HTTPS, which means the user only ever sees the encrypted, secure version of the website.

Strict-Transport-Security: max-age=31536000; includeSubDomains

X-Content-Type-Options

This header prevents the browser from trying to guess the MIME type of a requested resource. When used along with an accurate Content-Type header, X-Content-Type-Options ensures the declared MIME type is respected and prevents MIME type sniffing.

 X-Content-Type-Options: nosniff

X-Frame-Options

The X-Frame-Options header indicates whether the page may be loaded inside tags such as <iframe> or <object>. For example, the following header will allow frames whose content comes from the same origin to load and block those with content from a different origin.

X-Frame-Options: SAMEORIGIN

X-XSS-Protection

This header adds a layer of security against cross-site scripting (XSS) attacks. Most modern browsers have a built-in filter that detects potential XSS attacks, and it can be strictly enforced using the X-XSS-Protection header.

x-xss-protection: 1; mode=block
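
The security response headers covered above can all be attached in one place. Here is a minimal sketch, again assuming a Python/Flask application, using the values shown in this section:

from flask import Flask

app = Flask(__name__)

@app.after_request
def add_security_headers(response):
    # Force HTTPS for one year, including subdomains
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    # Stop the browser from sniffing MIME types
    response.headers["X-Content-Type-Options"] = "nosniff"
    # Only allow framing by pages from the same origin
    response.headers["X-Frame-Options"] = "SAMEORIGIN"
    # Enable the browser's XSS filter in blocking mode
    response.headers["X-XSS-Protection"] = "1; mode=block"
    return response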

 
