
Software Supply Chain Report | @DevOpsSummit #DevOps #ContinuousTesting

Analysis of 25,000 applications reveals that 6.8% of the packages/components used included known defects. Organizations standardizing on components between two and three years of age can decrease defect rates substantially.

Open source and third-party packages/components live at the heart of high-velocity software development organizations. Today, an average of 106 packages/components make up 80-90% of a modern application, yet few organizations have visibility into which components are used where.

Using components with known defects leads to quality and security issues within applications. While developers save tremendous amounts of time by sourcing software components from outside their organizations, they often don't have time to check those component versions against known vulnerability databases or internal policies.
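To make this concrete, the short sketch below checks a dependency manifest against a public vulnerability database. It is a minimal illustration rather than Sonatype's own tooling: it assumes Sonatype's free OSS Index REST API (a service not described in this article) and Package URL coordinates, and the manifest entries are hypothetical.

```python
# Minimal sketch: check components against a public vulnerability database.
# Assumes Sonatype's OSS Index REST API and Package URL (purl) coordinates;
# the manifest entries below are hypothetical examples.
import requests

OSSINDEX_URL = "https://ossindex.sonatype.org/api/v3/component-report"

def check_components(purls):
    """Return (coordinates, vulnerability_count) for flagged components."""
    resp = requests.post(OSSINDEX_URL, json={"coordinates": purls}, timeout=30)
    resp.raise_for_status()
    return [
        (report["coordinates"], len(report["vulnerabilities"]))
        for report in resp.json()
        if report.get("vulnerabilities")
    ]

if __name__ == "__main__":
    manifest = [  # hypothetical dependency manifest
        "pkg:maven/org.apache.commons/commons-collections4@4.0",
        "pkg:maven/com.fasterxml.jackson.core/jackson-databind@2.6.0",
    ]
    for coords, count in check_components(manifest):
        print(f"{coords}: {count} known vulnerabilities")
```

A check like this takes seconds per build, which is exactly the kind of time developers rarely have when vetting components by hand.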

In Sonatype's 2016 State of the Software Supply Chain Report, analysis of 25,000 application scans reveals that 1 in 16 (6.8%) of the components used in applications contained at least one known security vulnerability. This finding demonstrates that defective components are making their way across the entire software supply chain -- from initial sourcing to use in finished goods.


Newer components make better software
Analysis of the scanned applications also revealed that the latest versions of components had the lowest percentage of known defects. Components less than three years old represented 38% of the parts used in the average application, and these components had security defect rates under 5%.

By comparison, components between five and seven years old had 2x the known security defect rate. The 2016 Verizon Data Breach Investigations Report confirms that the vast majority of successful exploits last year were from CVEs (Common Vulnerabilities and Exposures) published between 1998 and 2013. Combining the Verizon data with Sonatype's analysis further demonstrates the economic value of using newer, higher-quality components.


In summary, components more than two years old represent 62% of all components scanned and account for 77% of the risk. Better component selection not only improves the quality of the finished application, it also reduces the break-fixes and unplanned work needed to remediate defects.
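One way to act on these numbers is a simple age gate in the build pipeline. The sketch below is illustrative only: the inventory records and the two-year cutoff are assumptions drawn from the figures above, not part of the report's methodology.

```python
# Illustrative sketch: flag components older than a chosen age threshold.
# The inventory and the two-year cutoff are hypothetical, chosen to mirror
# the report's finding that components over two years old carry most risk.
from datetime import date

MAX_AGE_YEARS = 2

inventory = [  # hypothetical (name, version, release_date) records
    ("commons-collections", "3.2.1", date(2008, 4, 14)),
    ("jackson-databind", "2.7.5", date(2016, 6, 11)),
]

def flag_stale(components, as_of=None):
    """Yield components released more than MAX_AGE_YEARS before as_of."""
    as_of = as_of or date.today()
    for name, version, released in components:
        age_years = (as_of - released).days / 365.25
        if age_years > MAX_AGE_YEARS:
            yield name, version, round(age_years, 1)

for name, version, age in flag_stale(inventory):
    print(f"{name} {version} is {age} years old; consider a newer release")
```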

Older components die off
Research shows that open source projects release new versions of their components an average of 14 times per year. The new versions deliver greater functionality, improved performance, and fewer known defects. Just as in traditional manufacturing, using the newest version of a part typically results in a higher quality finished product.

In its 2016 report, Sonatype discovered that component versions seven or more years old made up approximately 18% of the footprint of the 25,000 application scans. Among these older components, as many as 23% were already on the latest available version -- meaning the open source projects behind them were inactive, dead...or perhaps just incredibly stable.
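Checking whether a pinned component is already at its latest release is easy to script. The sketch below assumes Maven Central's public search API; the pinned coordinates are hypothetical, and other ecosystems (PyPI, npm) expose similar endpoints.

```python
# Sketch: check whether a pinned dependency is already the latest release.
# Assumes Maven Central's public search API (search.maven.org); the pinned
# coordinates below are a hypothetical example.
import requests

SEARCH_URL = "https://search.maven.org/solrsearch/select"

def latest_version(group_id, artifact_id):
    """Return the latest released version known to Maven Central, or None."""
    params = {"q": f'g:"{group_id}" AND a:"{artifact_id}"', "rows": 1, "wt": "json"}
    resp = requests.get(SEARCH_URL, params=params, timeout=30)
    resp.raise_for_status()
    docs = resp.json()["response"]["docs"]
    return docs[0]["latestVersion"] if docs else None

group, artifact, pinned = "org.apache.commons", "commons-lang3", "3.1"
newest = latest_version(group, artifact)
if newest == pinned:
    print("Already on the latest release -- the project may be inactive, or stable.")
else:
    print(f"Newer release available: {newest}")
```

If a component is both old and already at its latest version, the remediation options discussed below become the only ones left.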


No one wants to discover that components with known security vulnerabilities or other defects are in use in their applications. Unfortunately, when these defects are discovered in older components, the chances of remediating the issue by upgrading to a newer component version are greatly diminished. If a newer version does not exist, only a few options remain:

  1. Keep the vulnerable component in the application,

  2. Switch to a comparable component from another open source project,

  3. Make a software change to add a mitigating control, or

  4. Code the required functionality from scratch to replace the defective component.

None of these options comes without a significant cost.

As discussed in Cisco's 2015 Midyear Security Report, "With open-source software in place in many enterprises, security professionals need to gain a deeper understanding of where and how open-source is used in their organizations, and whether their open-source packages or libraries are up to date. This means that, moving forward, software supply chain management becomes even more critical."

More information about software supply chain management practices and open source component quality can be found in the 2016 State of the Software Supply Chain Report.

More Stories By Derek Weeks

In 2015, Derek Weeks led the largest and most comprehensive analysis of software supply chain practices to date, covering 160,000 development organizations. He is a huge advocate of applying proven supply chain management principles to DevOps practices to improve efficiencies, reduce costs, and sustain long-lasting competitive advantages.

As a 20+ year veteran of the software industry, he has advised leading businesses on IT performance improvement practices covering continuous delivery, business process management, systems and network operations, service management, capacity planning, and storage management. As the VP and DevOps Advocate for Sonatype, he is passionate about changing the way people think about software supply chains and improving public safety through improved software integrity. Follow him at @weekstweets, connect with him at www.linkedin.com/in/derekeweeks, and read his posts at http://blog.sonatype.com/author/weeks/.
