Arbor Networks Reports Unprecedented Spike in DDoS Attack Size Driven by NTP Misuse

Arbor Networks Inc., a leading provider of DDoS and advanced threat protection solutions for enterprise and service provider networks, today released global DDoS attack data derived from its ATLAS threat monitoring infrastructure. The data shows an unprecedented spike in volumetric attacks, driven by the proliferation of NTP reflection/amplification attacks.

NTP is a UDP-based protocol used to synchronize clocks over a computer network. Any UDP-based service, including DNS, SNMP, NTP, chargen, and RADIUS, is a potential vector for DDoS attacks because the protocol is connectionless, so source IP addresses can be spoofed by attackers who control compromised or ‘botted’ hosts on networks that have not implemented basic anti-spoofing measures. NTP is popular due to its high amplification ratio of approximately 1000x. Furthermore, attack tools are becoming readily available, making these attacks easy to execute.
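
To make that ratio concrete, here is a minimal back-of-the-envelope sketch in Python (illustrative only: the function name and the 100 GB/sec example are ours; the ~1000x figure is the approximation cited above):

    # Illustrative arithmetic only: with ~1000x amplification, a small
    # stream of spoofed NTP requests elicits a far larger flood of
    # responses aimed at the victim.

    AMPLIFICATION = 1000.0  # approximate NTP amplification ratio cited above

    def spoofed_bandwidth_needed(flood_size, amplification=AMPLIFICATION):
        """Request bandwidth (same unit as flood_size) an attacker must
        source to produce a reflected flood of the given size."""
        return flood_size / amplification

    # A 100 GB/sec flood requires only ~0.1 GB/sec of spoofed requests.
    print(spoofed_bandwidth_needed(100.0))  # -> 0.1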

ATLAS is a collaborative partnership with nearly 300 service provider customers who share anonymous traffic data with Arbor in order to deliver a comprehensive, aggregated view of global traffic and threats. ATLAS collects 80 TB/sec of traffic and provides the data for the Digital Attack Map, a visualization of global attack traffic created by Google Ideas. An online presentation with a full summary of Arbor’s findings for Q1 2014 is available here.

NTP Attack Highlights

  • Average global NTP traffic was 1.29 GB/sec in November 2013; by February 2014 it had reached 351.64 GB/sec (see the arithmetic sketch after this list)
  • NTP was used in 14% of DDoS events overall, but in 56% of events over 10 GB/sec and in 84.7% of events over 100 GB/sec
  • US, France and Australia were the most common targets overall
  • US and France were the most common targets of large attacks
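
A quick arithmetic sketch of what the first bullet implies (figures copied from the list above; variable names are ours):

    # Growth multiple implied by the ATLAS averages quoted above.
    nov_2013 = 1.29    # average global NTP traffic, GB/sec, November 2013
    feb_2014 = 351.64  # average global NTP traffic, GB/sec, February 2014

    growth = feb_2014 / nov_2013
    print(f"Roughly {growth:.0f}x growth in about three months")  # ~273x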

“Arbor has been monitoring and mitigating DDoS attacks since 2000. The spike in the size and frequency of large attacks so far in 2014 has been unprecedented,” said Arbor Networks Director of Solutions Architects Darren Anstee. “These attacks have become so large, they pose a very serious threat to Internet infrastructure, from the ISP to the enterprise.”

NTP Resources:

Arbor Networks has covered the rise in NTP attacks extensively, providing a wide range of data, research, analysis and best practices.

About Arbor Networks

Arbor Networks, Inc. helps secure the world’s largest enterprise and service provider networks from DDoS attacks and advanced threats. Arbor is the world’s leading provider of DDoS protection in the enterprise, carrier and mobile market segments, according to Infonetics Research. Arbor’s advanced threat solutions deliver complete network visibility through a combination of packet capture and NetFlow technology, enabling the rapid detection and mitigation of malware and malicious insiders. Arbor also delivers market-leading analytics for dynamic incident response, historical analysis, visualization and forensics. Arbor strives to be a “force multiplier,” making network and security teams the experts. Our goal is to provide a richer picture into networks and more security context, so customers can solve problems faster and reduce the risk to their business.

To learn more about Arbor products and services, please visit our website at arbornetworks.com. Arbor’s research, analysis and insight, together with data from the ATLAS global threat intelligence system, can be found at the ATLAS Threat Portal.

Trademark Notice: Arbor Networks, Peakflow, ArbOS, How Networks Grow, ATLAS, Pravail, Arbor Optima, Cloud Signaling, the Arbor Networks logo and Arbor Networks: Smart. Available. Secure. are all trademarks of Arbor Networks, Inc. All other brand names may be trademarks of their respective owners.
