
Microservices Expo: Article

2014 Super Bowl Tips to Avoid Ad Site Fails

Tracking the ads for the Super Bowl can be tough as some advertisers don’t indicate whether they are advertising during the game

This year, the Seattle Seahawks dealt Denver one of the worst beatings in recent Super Bowl history; however, the only highlights of the broadcast were the commercials. They ranged from serious and thought-provoking to funny and quirky. Each ad was meant to do one thing: drive eyes to a brand. With most of the population watching with their phones and tablets, every advertiser's site had to be ready for those eyeballs.

Everyone wants to interview the winners and losers after the game. There is a dissection of every drive as analysts want to understand key aspects of success and failure:

  1. MVPs and who's to blame
  2. The breakdown on both sides
  3. What to do for next season

I, of course, love football, but I also love watching Super Bowl ads and how they perform. I love looking at who was the fastest, who was the slowest, and understanding why. The Internet is a level playing field on which everyone (with enough money) has the same options as everyone else; so when it comes to game-time strategy, why do sites perform so differently?

MVPs and Who's to Blame

To review our full wrap-up of how the ad websites performed during the course of the game, click here.

Tracking the ads for the Super Bowl can be tough: some advertisers don't indicate whether or not they are advertising during the game, while others promote their ads well in advance. To compensate, our team added tests during the game as the ads aired, but the methodology we used was the same for all.

We tested the ad URLs using real browser agents from end-user locations across the US. The tests ran from the following locations every 10 minutes during the game:

  • CA: Los Angeles - Verizon
  • CA: San Jose - AT&T
  • FL: Miami - Internap
  • IL: Chicago - Level3
  • MO: St. Louis - Savvis
  • NY: New York - Sprint
  • TX: Dallas - AT&T
  • VA: Reston - Savvis
  • WA: Seattle - Internap

We call this methodology a "9-Box" as it divides the US into East, Midwest, and West, with three locations in each region running north to south. This gives us good coverage across the continental US; we recommend this approach for basic synthetic monitoring.
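The scheduled testing described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not the actual agent software: `timed_fetch` and `run_tests` are names invented here, and in a real synthetic-monitoring setup each location in the list would be a separate machine in that city/carrier rather than a local loop.

```python
import time
import urllib.request

# Hypothetical location list mirroring the "9-Box"; in practice each
# entry corresponds to a dedicated agent machine in that city/carrier.
LOCATIONS = ["CA: Los Angeles", "CA: San Jose", "FL: Miami",
             "IL: Chicago", "MO: St. Louis", "NY: New York",
             "TX: Dallas", "VA: Reston", "WA: Seattle"]

def timed_fetch(url):
    """Fetch a URL; return (first_byte_seconds, total_seconds, body_bytes)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as resp:
        first_chunk = resp.read(1)           # time to first byte of the body
        first_byte = time.monotonic() - start
        body = first_chunk + resp.read()     # rest of the download
    total = time.monotonic() - start
    return first_byte, total, len(body)

def run_tests(url, interval_seconds=600, rounds=1):
    """Test the URL every `interval_seconds` (600 s = 10 minutes)."""
    results = []
    for i in range(rounds):
        results.append(timed_fetch(url))
        if i < rounds - 1:
            time.sleep(interval_seconds)
    return results
```

Each agent would run this loop against the ad URL for the duration of the game, and the per-location results would then be compared centrally.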

The browser agents performing the tests behave the same as a real user opening a browser and connecting to the page. Each agent:

  • resolves the DNS address(es) for the ad page as well as the ad's content, including third parties;
  • establishes the TCP connection(s) to all the domains contributing to the page;
  • downloads the base ad page, parses the HTML, and executes all the JavaScript and CSS;
  • downloads all the images and content requested by the HTML and JavaScript;
  • measures how long the server takes to respond to a request (First Byte Time), and then how long it takes to download all of the content requested by the page.

This allows us to understand which company had the fastest response time, which had the slowest, and how each got that way.
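The phase-by-phase timing a browser agent records can be approximated with low-level sockets. This is a rough sketch for a single request only (a real agent also executes JavaScript, fetches third-party content, and renders the page); the function name `page_load_phases` is an invention for illustration, and the byte count here includes the HTTP headers.

```python
import socket
import time

def page_load_phases(host, path="/", port=80):
    """Roughly time the phases measured for one HTTP request:
    DNS resolution, TCP connect, first byte, and full download."""
    t0 = time.monotonic()
    ip = socket.gethostbyname(host)                   # DNS lookup
    t_dns = time.monotonic()

    sock = socket.create_connection((ip, port), timeout=30)  # TCP handshake
    t_connect = time.monotonic()

    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())
    first = sock.recv(1)                              # first byte of response
    t_first_byte = time.monotonic()

    data = bytearray(first)
    while chunk := sock.recv(65536):                  # remaining content
        data.extend(chunk)
    t_done = time.monotonic()
    sock.close()

    return {
        "dns": t_dns - t0,
        "connect": t_connect - t_dns,
        "first_byte": t_first_byte - t_connect,       # ~ First Byte Time
        "download": t_done - t_first_byte,
        "bytes": len(data),
    }
```

Summing `dns`, `connect`, and `first_byte` gives the server-response portion of the load; the `download` figure reflects how heavy the page's content is, which is where many ad sites differ most.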

The Breakdown on Both Sides
For additional details on our impressions of the Super Bowl advertisers and the holiday-season retailers, along with practical advice we can all benefit from, click here for the full article.

More Stories By David Jones

David Jones is the Director of Sales Engineering and APM Evangelism for Dynatrace. He has been with Dynatrace for 10 years and has 20 years' experience working with web and mobile technologies, from the first commercial HTML editor to the latest web delivery platforms and architectures. He has worked with scores of Fortune 500 organizations, providing them with the most recent industry best practices for web and mobile application delivery. Prior to Dynatrace he worked at Gomez (Waltham), S1 Corp (Atlanta), Broadvision (Bay Area), Interleaf/Texcel (Waltham), i4i (Toronto) and SoftQuad (Toronto).
