Software Supply Chain Report | @DevOpsSummit #DevOps #ContinuousTesting

Analysis of 25,000 applications reveals 6.8% of packages/components used included known defects

Analysis of 25,000 applications reveals that 6.8% of the packages/components used included known defects. Organizations standardizing on components between two and three years of age can decrease defect rates substantially.

Open source and third-party packages/components live at the heart of high-velocity software development organizations. Today, an average of 106 packages/components make up 80-90% of a modern application, yet few organizations have visibility into which components are used where.
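To make that visibility concrete, here is a minimal sketch (in Python, and not taken from the report) that simply enumerates the third-party distributions installed in an environment to produce a basic component inventory. Real software composition analysis goes much further, resolving transitive dependencies and components pulled in at build time.

```python
# Minimal component-inventory sketch (illustrative only, not from the report).
# Lists the third-party Python distributions visible in the current environment;
# a real bill of materials would also cover transitive and build-time components.
from importlib import metadata

def component_inventory():
    """Return a sorted list of (name, version) pairs for installed distributions."""
    components = []
    for dist in metadata.distributions():
        name = dist.metadata.get("Name", "unknown")
        components.append((name, dist.version))
    return sorted(components, key=lambda item: item[0].lower())

if __name__ == "__main__":
    inventory = component_inventory()
    print(f"{len(inventory)} components found")
    for name, version in inventory:
        print(f"  {name}=={version}")
```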

Use of known defective components leads to quality and security issues within applications. While developers save tremendous amounts of time by sourcing software components from outside their organizations, they often don't have time to check those component versions against known vulnerability databases or internal policies.
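One way to automate that check is to ask a public vulnerability database for a report on each component's exact version. The sketch below assumes Sonatype's free OSS Index v3 component-report endpoint and package URL coordinates; the endpoint, payload shape, and rate limits are assumptions to verify against the current OSS Index documentation.

```python
# Hedged sketch: check component coordinates against a public vulnerability database.
# Assumes the OSS Index v3 component-report endpoint accepts a JSON body of
# package-URL coordinates and returns one report per coordinate -- verify this
# against the current OSS Index documentation before relying on it.
import json
import urllib.request

OSS_INDEX_URL = "https://ossindex.sonatype.org/api/v3/component-report"  # assumed endpoint

def check_components(coordinates):
    """coordinates: package URLs, e.g. 'pkg:maven/commons-collections/commons-collections@3.2.1'."""
    body = json.dumps({"coordinates": coordinates}).encode("utf-8")
    request = urllib.request.Request(
        OSS_INDEX_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        reports = json.load(response)
    for report in reports:
        vulns = report.get("vulnerabilities", [])
        status = f"{len(vulns)} known vulnerabilities" if vulns else "no known vulnerabilities"
        print(f"{report.get('coordinates')}: {status}")

if __name__ == "__main__":
    # Example coordinate only; substitute the components from your own inventory.
    check_components(["pkg:maven/commons-collections/commons-collections@3.2.1"])
```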

In Sonatype's 2016 State of the Software Supply Chain report, analysis of 25,000 application scans reveals that 1 in 16 (6.8%) of the components used in applications contained at least one known security vulnerability. This finding demonstrates that defective components are making their way across the entire software supply chain -- from initial sourcing to use in finished goods.


Newer components make better software
Analysis of the scanned applications also revealed that the newest versions of components had the lowest percentage of known defects. Components less than three years old represented 38% of the parts used in the average application, and those components had security defect rates under 5%.

By comparison, components between five and seven years old had twice the known security defect rate. The 2016 Verizon Data Breach Investigations Report confirms that the vast majority of successful exploits last year targeted CVEs (Common Vulnerabilities and Exposures) published between 1998 and 2013. Combining the Verizon data with Sonatype's analysis further demonstrates the economic value of using newer, higher-quality components.


In summary, components more than two years old represent 62% of all components scanned and account for 77% of the risk. Better component selection not only improves the quality of the finished application, it also reduces the break-fix and unplanned work required to remediate defects.
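To illustrate the kind of roll-up behind these figures, here is a small sketch that buckets components by age and computes a known-defect rate per bucket. The sample data is made up for the example and is not Sonatype's dataset.

```python
# Illustrative aggregation sketch with made-up data -- not Sonatype's dataset.
# Buckets components by age and reports the known-defect rate per bucket,
# the kind of roll-up behind the report's age-versus-defect figures.
from collections import defaultdict

# (component name, age in years, has a known defect?)
SAMPLE_COMPONENTS = [
    ("lib-a", 1.2, False), ("lib-b", 2.5, False), ("lib-c", 4.0, True),
    ("lib-d", 6.1, True), ("lib-e", 0.8, False), ("lib-f", 7.5, True),
]

def bucket_for(age_years):
    if age_years < 3:
        return "under 3 years"
    if age_years < 5:
        return "3-5 years"
    if age_years < 7:
        return "5-7 years"
    return "7+ years"

def defect_rates(components):
    totals, defective = defaultdict(int), defaultdict(int)
    for _name, age, has_defect in components:
        bucket = bucket_for(age)
        totals[bucket] += 1
        defective[bucket] += int(has_defect)
    return {bucket: defective[bucket] / totals[bucket] for bucket in totals}

if __name__ == "__main__":
    for bucket, rate in sorted(defect_rates(SAMPLE_COMPONENTS).items()):
        print(f"{bucket}: {rate:.0%} of components have a known defect")
```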

Older components die off
Research shows that new versions of open source components are released an average of 14 times per year. These new versions deliver greater functionality, improved performance, and fewer known defects. Just as in traditional manufacturing, using the newest version of any given part typically results in a higher-quality finished product.

In its 2016 report, Sonatype found that component versions seven years or older made up approximately 18% of the footprint across the 25,000 application scans. Among those older components, as many as 23% were already on the latest available version -- meaning the open source projects behind them were inactive, dead... or perhaps just incredibly stable.


Discovering that components with known security vulnerabilities or other defects are in use in an application is not something anyone wants. Unfortunately, when these defects are found in older components, the chances of remediating the issue by upgrading to a newer component version are greatly diminished. If a new version does not exist, only a few options remain:

  1. Keep the vulnerable component in the application,

  2. Switch to a comparable, newer component from another open source project,

  3. Make a software change to add a mitigating control, or

  4. Code the required functionality from scratch to replace the defective component.

None of these options comes without a significant cost.

As discussed in Cisco's 2015 Midyear Security Report, "With open-source software in place in many enterprises, security professionals need to gain a deeper understanding of where and how open-source is used in their organizations, and whether their open-source packages or libraries are up to date. This means that, moving forward, software supply chain management becomes even more critical."
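As a hedged illustration of the "are we up to date?" question Cisco raises, the sketch below compares an installed Python package's version with the latest release listed by PyPI's public JSON API (assumed to live at https://pypi.org/pypi/<name>/json); Maven Central, npm, and other ecosystems offer comparable metadata endpoints.

```python
# Hedged sketch: compare an installed package's version with the latest on PyPI.
# Assumes PyPI's JSON API at https://pypi.org/pypi/<name>/json stays stable;
# other ecosystems (Maven Central, npm) expose similar metadata endpoints.
import json
import urllib.request
from importlib import metadata

def latest_pypi_version(package):
    """Fetch the latest published version of a package from PyPI's JSON API."""
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as response:
        return json.load(response)["info"]["version"]

def check_up_to_date(package):
    """Compare the locally installed version against the latest PyPI release."""
    installed = metadata.version(package)  # raises PackageNotFoundError if not installed
    latest = latest_pypi_version(package)
    state = "up to date" if installed == latest else f"outdated (latest is {latest})"
    print(f"{package} {installed}: {state}")

if __name__ == "__main__":
    check_up_to_date("requests")  # example package; substitute your own dependencies
```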

More information about software supply chain management practices and open source component quality can be found in the 2016 State of the Software Supply Chain Report.

More Stories By Derek Weeks

In 2015, Derek Weeks led the largest and most comprehensive analysis of software supply chain practices to date, covering 160,000 development organizations. He is a huge advocate of applying proven supply chain management principles to DevOps practices to improve efficiency, reduce costs, and sustain long-lasting competitive advantages.

As a 20+ year veteran of the software industry, he has advised leading businesses on IT performance improvement practices covering continuous delivery, business process management, systems and network operations, service management, capacity planning, and storage management. As the VP and DevOps Advocate for Sonatype, he is passionate about changing the way people think about software supply chains and improving public safety through improved software integrity. Follow him at @weekstweets, find him at www.linkedin.com/in/derekeweeks, and read his posts at http://blog.sonatype.com/author/weeks/.
