
10-Point Open Source and Software Supply Chain Security Mobilization Plan Released with Initial Pledges Surpassing $30M

WASHINGTON, DC – May 12, 2022 – The Linux Foundation and the Open Source Software Security Foundation (OpenSSF) brought together over 90 executives from 37 companies and government leaders from the NSC, ONCD, CISA, NIST, DOE, and OMB to reach consensus on key actions to improve the resiliency and security of open source software.

The Open Source Software Security Summit II is a follow-up to the first Summit, held January 13, 2022, which was led by the White House’s National Security Council. Today’s meeting was convened by the Linux Foundation and OpenSSF on the one-year anniversary of President Biden’s Executive Order on Improving the Nation’s Cybersecurity.

The Linux Foundation and OpenSSF, with input provided from all sectors, delivered a first-of-its-kind plan to broadly address open source and software supply chain security. The Summit II plan outlines approximately $150M of funding over two years to rapidly advance well-vetted solutions to the ten major problems the plan identifies. The 10 streams of investment include concrete action steps for both more immediate improvements and building strong foundations for a more secure future. 

A subset of participating organizations has come together to collectively pledge an initial tranche of funding towards implementation of the plan. Those companies, which have pledged over $30M, are Amazon, Ericsson, Google, Intel, Microsoft, and VMware. As the plan evolves further, more funding will be identified, and work will begin as individual streams are agreed upon.

This builds on the existing investments that OpenSSF community members make in open source software. An informal poll of our stakeholders indicates they spend over $110M and employ nearly a hundred full-time-equivalent employees focused solely on securing the open source software landscape. This plan adds to those investments.


Jim Zemlin – Executive Director, Linux Foundation:  “On the one year anniversary of President Biden’s executive order, today we are here to respond with a plan that is actionable, because open source is a critical component of our national security and it is fundamental to billions of dollars being invested in software innovation today. We have a shared obligation to upgrade our collective cybersecurity resilience and improve trust in software itself.  This plan represents our unified voice and our common call to action. The most important task ahead of us is leadership.”

Brian Behlendorf – Executive Director, Open Source Security Foundation (OpenSSF):  “What we are doing here together is converging a set of ideas and principles of what is broken out there and what we can do to fix it.  The plan we have put together represents the 10 flags in the ground as the base for getting started.  We are eager to get further input and commitments that move us from plan to action.”

Anne Neuberger, Deputy National Security Advisor, Cyber & Emerging Technology, National Security Council, The White House:

“President Biden signed the Executive Order on Cybersecurity last year to ensure the software our government relies on is secure and reliable, including software that runs our critical infrastructure. Earlier this year, the White House convened a meeting between government and industry participants to improve the security of open source software. The Open Source Security Foundation has followed up on the work at that meeting and convened participants from across industry to make substantial progress. We are appreciative of all participants’ work on this important issue.”


Adrian Ludwig, Chief Trust Officer, Atlassian

“Open source software is critical to so many of the tools and applications that are used by thousands of development teams worldwide. Consequently, the security of software supply chains has been elevated to the top of most organizations’ priorities in the wake of recent high-profile vulnerabilities in open source software. Only through concerted efforts by industry, government and other stakeholders can we ensure that open source innovation continues to flourish in a secure environment. This is why we are happy to be participating in OpenSSF, where we can collaborate on key initiatives that raise awareness and drive action around the crucial issues facing software supply chain security today. We’re excited to be a key contributor to driving meaningful change and we are optimistic about what we can achieve through our partnership with OpenSSF and like-minded organizations within its membership.”


Eric Wenger, Senior Director, Technology Policy, Cisco Systems

“Open source software (OSS) is a foundational part of our modern computing infrastructure. As one of the largest users of and contributors to OSS, Cisco makes significant investments in time and resources to improve the security of widely-used OSS projects. Today’s effort shows the stakeholder community’s shared commitment to making open-source development more secure in ways that are measurable and repeatable.”


Jim Medica, Technologist in Dell Technologies’ Office of the CTO

“Never before has software security been a more critical part of the global supply chain. Today, in a meeting led by Anne Neuberger, Deputy National Security Advisor for Cyber and Emerging Technology, Dell and my Open Source Security Foundation colleagues committed our software security expertise to execute the Open Source Software Security Mobilization Plan. Dell’s best and brightest engineers will engage with peers to develop risk-based metrics and scoring dashboards, digital signature methodologies for code signing, and Software Bill of Materials (SBOM) tools – all to address the grand challenge of open source software security. This is an excellent example of the leadership Dell provides to proactively impact software security and open-source security solutions, and reinforces our commitment to the open source software community, to our supply chain and to our national security.”


Ericsson

“Ericsson is one of the leading promoters and supporters of the open source ecosystem, accelerating the adoption and industry alignment in a number of key technology areas. The Open Source Security Foundation (OpenSSF) is an industry-wide initiative with the backing of the Linux Foundation with the objective of improving supply chain security in the open source ecosystem.

“As a board member of OpenSSF, we are committed to open source security and fully supportive of the mobilization plan and its objective of improving supply chain security in the open source ecosystem. As an advocate and adopter of global standards, we believe these initiatives will strengthen open source security from a global perspective.”


Mike Hanley, Chief Security Officer, GitHub

“Securing the open source ecosystem starts with empowering developers and open source maintainers with tools and best practices that are instrumental to securing the software supply chain. As home to 83M developers around the world, GitHub is uniquely positioned and committed to advancing these efforts, and we’ve continued our investments to help developers and maintainers realize improved security outcomes through initiatives including 2FA enforcement on GitHub.com and npm, open sourcing the GitHub Advisory Database, financial enablement for developers through GitHub Sponsors, and free security training through the GitHub Security Lab.

“The security of open source is critical to the security of all software. Summit II has been an important next step in bringing the private and public sector together again and we look forward to continuing our partnerships to make a significant impact on the future of software security.”


Eric Brewer, VP of Infrastructure at Google Cloud & Google Fellow

“We’re thankful to the Linux Foundation and OpenSSF for convening the community today to discuss the open source software security challenges we’re facing and how we can work together across the public and private sectors to address them. Google is committed to supporting many of the efforts we discussed today, including the creation of our new Open Source Maintenance Crew, a team of Google engineers who will work closely with upstream maintainers on improving the security of critical open source projects, and by providing support to the community through updates on key projects like SLSA, Scorecards, and Sigstore, which is now being used by the Kubernetes project. Security risks will continue to span all software companies and open source projects and only an industry-wide commitment involving a global community of developers, governments and businesses can make real progress. Google will continue to play our part to make an impact.”


Jamie Thomas, Enterprise Security Executive, IBM

“Today, we had the opportunity to share our IBM Policy Lab’s recommendations on how understanding the software supply chain is key to improving security. We believe that providing greater visibility in the software supply chain through SBOMs (Software Bill of Materials) and using the open source software community as a valuable resource to encourage passionate developers to create, hone their skills, and contribute to the public good can help strengthen our resiliency. It’s great to see the strong commitment from the community to work together to secure open source software. Security can always be strengthened and I would like to thank Anne Neuberger today for her deep commitment and open, constructive, technical dialogue that will help us pave the way to enhancing OSS security.”


Greg Lavender, Chief Technology Officer and General Manager of the Software and Advanced Technology Group, Intel

“Intel has long played a key role in contributing to open source. I’m excited about our role in the future building towards Pat’s Open Ecosystem vision. As we endeavor to live into our core developer tenets of openness, choice and trust – software security is at the heart of creating the innovation platforms of tomorrow.”

Melissa Evers, Vice President, Software and Advanced Technology, General Manager of Strategy to Execution

“Intel commends the Linux Foundation in their work advancing open source security. Intel has a history of leadership and investment in open source software and secure computing: over the last five years, Intel has invested over $250M in advancing open-source software security. As we approach the next phase of Open Ecosystem initiatives, we intend to maintain and grow this commitment by double digit percentages continuing to invest in software security technologies, as well as advance improved security and remediation practices within the community and among those who consume software from the community.”


Stephen Chin, Vice President of Developer Relations, JFrog

“While open source has always been seen as a seed for modernization, the recent rise of software supply chain attacks has demonstrated we need a more hardened process for validating open-source repositories. As we say at JFrog, ‘with great software comes great responsibility’, and we take that job seriously. As a designated CNA, the JFrog Security Research team constantly monitors open-source software repositories for malicious packages that may lead to widespread software supply chain attacks and alerts the community accordingly. Building on that, JFrog is proud to collaborate with the Linux Foundation and other OpenSSF members on designing a set of technologies, processes, accreditations, and policies to help protect our nation’s critical infrastructure while nurturing one of the core principles of open source – innovation.” 

JPMorgan Chase

Pat Opet, Chief Information Security Officer

“We are proud to have worked with the Open Source Security Foundation (OpenSSF) and its members to create the new Open Source Software Security Mobilization Plan. This plan will help address security issues in the software supply chain, which is critical to making the world’s software safer and more secure for everyone.”


Mark Russinovich, CTO, Microsoft Azure

“Open source software is core to nearly every company’s technology strategy. Collaboration and investment across the open source ecosystem will strengthen and sustain security for everyone. Microsoft’s commitment to $5M in funding for OpenSSF supports critical cross-industry collaboration. We’re encouraged by the community, industry, and public sector collaboration at today’s summit and the benefit this will have to strengthen supply chain security.”

OWASP Foundation

Andrew van der Stock, Executive Director

“OWASP’s mission is to improve the state of software security around the world. We are contributing to the Developer Education and Certification stream, as well as addressing the Executive Order by improving the state and adoption of SBOMs. In particular, we would like to see a single, consumable standard across the board.”

Mark Curphey (founder of OWASP) and John Viega (author of the first book on software security), Stream Coordinators

“We’re excited to see the industry’s willingness to come together on a single ‘bill of materials’ format. It has the potential to help the entire industry solve many important problems, including drastically improving response speed for when major new issues in open source software emerge.” 


Tim McKnight, SAP Executive Vice President & Chief Information Security Officer

“SAP is proud to be a part of the Open Source Software Security Summit II and contribute to the important dialogue on the topic of Open Source software security.

“SAP is firmly committed to supporting the execution of the Open Source Software Security Mobilization Plan and we look forward to continuing our collaboration with our government, industry, and academic partners.”


Brian Fox, CTO of Sonatype and steward of Maven Central

“It’s rare to see vendors, competitors, government, and diverse open source ecosystems all come together like they have today. It shows how massive a problem we have to solve in securing open source, and highlights that no one entity can solve it alone. The Open Source Software Security Mobilization Plan is a great step toward bringing our community together with a number of key tactics, starting with securing OSS production, which will make the entire open source ecosystem stronger and safer.” 


Andrew Aitken, Global Head of Open Source, Wipro

“Wipro is committed to helping ensure the safety of the software supply chain through its engagement with OpenSSF and other industry initiatives and is ideally suited to enhance efforts to provide innovative tooling, secure coding best practices and industry and government advocacy to improve vulnerability remediation.

“As the only global systems integrator in the OpenSSF ecosystem and in line with its support of OpenSSF objectives, Wipro will commit to training 100 of its cybersecurity experts to trainer status in LF and OpenSSF secure coding best practices and to hosting training workshops with its premier global clients and their developer and cybersecurity teams.

“Further, Wipro will increase its public contributions to Sigstore and the SLSA framework by integrating them into its own solutions and building a community of 50+ contributors to these critical projects.”


Three Goals of the 10-Point Plan

  • Securing Open Source Software Production
      1. Make baseline secure software development education and certification the new normal for professional OSS developers.
      2. Establish a public, vendor-neutral, objective-metrics-based risk assessment dashboard for the top 10,000 open source components.
      3. Accelerate the adoption of digital signatures on software releases.
      4. Eliminate root causes of many vulnerabilities through replacement of non-memory-safe languages.
  • Improving Vulnerability Discovery and Remediation
      1. Accelerate discovery of new vulnerabilities by maintainers and experts.
      2. Establish a corps of “volunteer firefighter” security experts to assist open source projects during critical times.
      3. Conduct third-party code reviews (and any necessary remediation work) of 200 of the most-critical open source software components yearly.
      4. Coordinate industry-wide data sharing to improve the research that helps determine the most critical open source software.
  • Shortening Ecosystem Patching Response Time
      1. Software Bill of Materials (SBOM) Everywhere: improve SBOM tooling and training to drive adoption.
      2. Enhance the 10 most critical open source build systems, package managers, and distribution systems with better supply chain security tools and best practices.

The 10-Point Plan Summarized (available in full here)

  1. Security Education: Deliver baseline secure software development education and certification to all.
  2. Risk Assessment: Establish a public, vendor-neutral, objective-metrics-based risk assessment dashboard for the top 10,000 (or more) OSS components.
  3. Digital Signatures: Accelerate the adoption of digital signatures on software releases.
  4. Memory Safety: Eliminate root causes of many vulnerabilities through replacement of non-memory-safe languages.
  5. Incident Response: Establish the OpenSSF Open Source Security Incident Response Team, security experts who can step in to assist open source projects during critical times when responding to a vulnerability.
  6. Better Scanning: Accelerate discovery of new vulnerabilities by maintainers and experts through advanced security tools and expert guidance.
  7. Code Audits: Conduct third-party code reviews (and any necessary remediation work) of up to 200 of the most-critical OSS components once per year.
  8. Data Sharing: Coordinate industry-wide data sharing to improve the research that helps determine the most critical OSS components.
  9. SBOMs Everywhere: Improve SBOM tooling and training to drive adoption.
  10. Improved Supply Chains: Enhance the 10 most critical OSS build systems, package managers, and distribution systems with better supply chain security tools and best practices.
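To make the SBOM items concrete, here is a minimal sketch of the kind of machine-readable inventory such tooling produces. The field names follow SPDX 2.x JSON conventions, but the package shown and its values are hypothetical, and a real SBOM would carry many more fields (checksums, relationships, file-level data):

```python
import json

# Minimal, illustrative SBOM fragment in the spirit of SPDX 2.x JSON.
# The package name and version below are hypothetical examples.
sbom = {
    "spdxVersion": "SPDX-2.3",
    "dataLicense": "CC0-1.0",
    "SPDXID": "SPDXRef-DOCUMENT",
    "name": "example-app-sbom",
    "packages": [
        {
            "SPDXID": "SPDXRef-Package-libexample",
            "name": "libexample",
            "versionInfo": "1.4.2",
            "downloadLocation": "NOASSERTION",
            "licenseConcluded": "Apache-2.0",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

Given such an inventory, a consumer can match name/version pairs against vulnerability databases to learn, for example, whether a deployed application embeds an affected component.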

Media Contact

Edward Cooper

Brian Behlendorf Testifying at a U.S. House Hearing

This post originally appeared on OpenSSF’s blog

On Wednesday, May 11, 2022, Brian Behlendorf, OpenSSF General Manager, testified to the United States House of Representatives Committee on Science, Space, and Technology. Brian’s testimony shares the work being done within the Open Source Security Foundation and broader open source software community to improve security and trustworthiness of open source software.

A copy of Brian’s written remarks is below and linked here (PDF). Visit the Committee’s website to view a recording of the hearing.

Also testifying at the hearing were:

May 9th, 2022 

The Honorable Eddie Bernice Johnson, Chairwoman
The Honorable Frank Lucas, Ranking Member
Committee on Science, Space, and Technology
2321 Rayburn House Office Building
Washington, DC 20515-6301 

Dear Chairwoman Johnson, Congressman Lucas, and distinguished members of the Committee on Science, Space and Technology, 

Thank you for your invitation to address you today, and the opportunity to share with you the work being done within the Open Source Security Foundation and the broader open source software community to raise the level of security and trustworthiness of open source software. 

  1. What are the consequences of insecure open-source software and what is industry as a whole, and the Open Source Security Foundation in particular, doing to tackle such vulnerabilities? 

Open source software (“OSS”) has become an integral part of the technology landscape, as inseparable from the digital machinery of modern society as bridges and highways are from the physical equivalent. According to one report, typically 70% to 90% of a modern application “stack” consists of pre-existing OSS, from the operating system to the cloud container to the cryptography and networking functions, sometimes up to the very application running your enterprise or website. Thanks to copyright licenses that encourage no-charge re-use, remixing, and redistribution, OSS encourages even the most dogged of competitors to work together to address common challenges, saving money by avoiding duplication of effort, moving faster to innovate upon new ideas and adopt emerging standards. 

However, this ubiquity and flexibility can come at a price. While OSS generally has an excellent reputation for security, the developer communities behind those works can vary significantly in their application of development practices and techniques that can reduce the risk of a defect in the code, or in responding quickly and safely when one is discovered by others. Often, developers trying to decide what OSS to use have difficulty determining which ones are more likely to be secure than others based on objective criteria. Enterprises often don’t have a well-managed inventory of the software assets they use, with enough granular detail, to know when or if they’re vulnerable to known defects, and when or how to upgrade. Even those enterprises who may be willing to invest in increasing the security of the OSS they use often don’t know where to make those investments, nor their urgency relative to other priorities. 

There are commercial solutions to some of these problems. There are vendors like GitLab or Red Hat who sell support services for specific open source software, or even entire aggregate distributions of OSS. There are other vendors, like Snyk and Sonatype, who sell tools to help enterprises track their use of OSS and flash an alert when there is a new critical vulnerability in software running deep inside an enterprise’s IT infrastructure.

However, fighting security issues at their upstream source – trying to catch them earlier in the development process, or even reduce the chances of their occurrence at all – remains a critical need. We are also seeing new kinds of attacks that focus less on vulnerabilities in code, and more on the supply chain itself – from rogue software that uses “typosquatting” on package names to insert itself unexpectedly into a developer’s dependency tree, to attacks on software build and distribution services, to developers turning their one-person projects into “protest-ware” with likely unintended consequences. 
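The typosquatting attack mentioned above can be illustrated with a simple heuristic: flag a package name that is very close to, but not exactly, a well-known name. This sketch uses Python's standard `difflib`; the popular-package list is a made-up sample, and real registries apply far more sophisticated checks:

```python
import difflib

# A tiny, illustrative sample of popular package names (not a real registry list).
POPULAR = ["requests", "numpy", "pandas", "urllib3", "cryptography"]

def typosquat_candidates(name, known=POPULAR, cutoff=0.85):
    """Return popular names that `name` suspiciously resembles."""
    if name in known:
        return []  # an exact match is the legitimate package, not a squat
    # get_close_matches ranks by similarity ratio, highest first
    return difflib.get_close_matches(name, known, n=3, cutoff=cutoff)

print(typosquat_candidates("reqeusts"))  # → ['requests']
print(typosquat_candidates("numpy"))    # → []
```

A registry could run such a check at publish time and hold near-miss names for human review, one small defense among the many the supply-chain efforts described here pursue.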

To address the urgent need for better security practices, tools, and techniques in the open source software ecosystem, a collection of organizations with deep investments into the OSS ecosystem came together in 2020 to form the Open Source Security Foundation, and chose to house that effort at the Linux Foundation. This public effort has grown to hundreds of active participants across dozens of different public initiatives housed under seven working groups, with funding and partnership from over 75 different organizations, and reaching millions of OSS developers. 

The OpenSSF’s seven working groups are: 

  1. Best Practices for Open Source Developers: This group works to provide open source developers with best practices recommendations, and easy ways to learn and apply them. Among other things, this group has developed courseware for teaching developers the fundamentals of secure software development, and implements the OpenSSF Best Practices Badge program. 
  2. Securing Critical Projects: This group exists to identify and help to allocate resources to secure the critical open source projects we all depend on. Among other things, this has led to a collaboration with Harvard Business School to develop a list of the most critical projects. 
  3. Supply Chain Integrity: This group is helping people understand and make decisions on the provenance of the code they maintain, produce and use. Among other things, this group has developed a specification and software called “SLSA”, for describing and tracking levels of confidence in a software supply chain. 
  4. Securing Software Repositories: This group provides a collaborative environment for aligning on the introduction of new tools and technologies to strengthen and secure software repositories, which are key points of leverage for security practices and the promotion to developers of more trustworthy software. 
  5. Identifying Security Threats in Open Source Projects: This group enables informed confidence in the security of OSS by collecting, curating, and communicating relevant metrics and metadata. For example, it is developing a database of all known security reviews of OSS. 
  6. Security Tooling: This group’s mission is to provide the best security tools for open source developers and make them universally accessible. Among other activities, this group has released code to better enable a security testing technique called “fuzzing” among open source projects. 
  7. Vulnerability Disclosures: This group is improving the overall security of the OSS ecosystem by helping advance vulnerability reporting and communication. For example, this group has produced a Guide to Coordinated Vulnerability Disclosure for OSS. 

There are also a series of special projects under the OpenSSF worthy of special mention: 

  • Project sigstore: an easy-to-use toolkit and service for signing software artifacts, ensuring that the software you are holding is the same as what the developer intended, addressing a wide array of supply chain attacks. 
  • The Alpha-Omega Project: an effort to systematically search for new vulnerabilities in open source code, and work with critical open source projects to improve their vulnerability handling and other security practices. 
  • The GNU Toolchain Initiative: this effort supports the build ecosystems for perhaps the most critical set of developer libraries and compilers in the world, the GNU Toolchain, as a means to ensure its safety and integrity. 
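The signing and integrity ideas behind sigstore build on a basic primitive: verifying that the bytes you downloaded match what the publisher released. The sketch below shows only that digest-comparison step in plain Python; sigstore itself goes much further, adding keyless signing and a public transparency log:

```python
import hashlib
import hmac

def sha256_digest(path: str) -> str:
    # Stream the file so large artifacts need not fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_artifact(path: str, expected_hex: str) -> bool:
    # compare_digest avoids leaking match information through timing.
    return hmac.compare_digest(sha256_digest(path), expected_hex)
```

A checksum alone only proves the file matches what a publisher's page claims; a signature, as sigstore provides, additionally proves who published it and makes that claim auditable.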

All the above efforts are public-facing and developed using the best practices of open source software communities. Funding from our corporate partners goes towards supporting the core staff and functions that enable this community, but all the substance comes from voluntary efforts. In some cases funds flow to assist with specific efforts – for example, recently the Alpha-Omega project decided to allocate funding towards the Node.js community to augment its security team with a part-time paid employee and to fund fixes for security issues. 

The Linux Foundation has also begun to adapt its “LFX” platform, a set of services designed to support the open source communities hosted by the Foundation, to incorporate security-related data such as vulnerability scans from Snyk and BluBracket, along with information from the OpenSSF Best Practices Badge program and the OpenSSF Security Scorecards initiative, to provide a unified view of the security risks in a particular collection of open source code, and what maintainers and contributors to those projects can do to improve those scores and reduce those risks. We expect to see more kinds of risk-related data coming into a unified view like this, helping developers and enterprises make better decisions about what open source components and frameworks to use, and how to reduce risk for those components they depend upon. 

Guiding all of this is a deep conviction among the OpenSSF community that while there are many different ways in which security issues manifest themselves in the OSS ecosystem, every one of them is addressable, and that there are lots of opportunities for investment and collective action that will pay a return many times over in the form of lower risk of a future major vulnerability in a widely-used package, and lesser disruption if one is discovered. 

Other efforts at the Linux Foundation include “Prossimo”, an effort focused on moving core Internet-related services to “memory-safe” languages like Rust, Go, or Java, which would eliminate an entire category of vulnerabilities that other languages allow too easily. Another is the SPDX standard for Software Bill of Materials (“SBOMs”), addressing the needs identified by White House Executive Order 14028 in a vendor-neutral and open way. 

This is by no means a comprehensive list of all such efforts in the OSS ecosystem to improve security. Every OSS foundation either has a security team in operation today or is scrambling to identify volunteers and funding to establish one. There is a greater emphasis today than I’ve seen in my 30 years of using and contributing to OSS (since before it was called OSS) on the importance of such efforts. Clear metrics for progress are elusive since we lack clear metrics for evaluating software risk; in fact developing ways to measure and represent that risk is a key priority for OpenSSF. We will never see a time when open source software is free from security defects, but we are getting better at determining the tools and techniques required to more comprehensively address the risk of vulnerabilities in open source code. Scaling up those tools and techniques to address the tens of thousands of widely used OSS components and to get them more quickly updated remains a challenge. 

  2. How can the Federal government improve collaboration with industry to help secure open-source software? 

I’ll focus here on principles and methods for collaboration that will lead to more secure OSS, and then for question 3 on specific opportunities to collaborate on. 

First, focus on resourcing long-term personal engagements with open source projects. 

Over the last few years, we have seen a healthy degree of engagement by the Federal government with OSS projects and stakeholders on the topic of improving security. The push established by Executive Order 14028 for the adoption of SBOMs aligned nicely with the standardization and growing adoption of the SPDX standard by a number of OSS projects, but it was aided substantially by the involvement of personnel from NIST, CISA, and other agencies engaging directly with SPDX community members. 

Often the real secret to a successful OSS effort is in the communities of different stakeholders that come together to create it – the software or specification is often just a useful byproduct. The Federal government, both through its massive use of open source code and the role that it traditionally performs in delivering and protecting critical infrastructure, should consider itself a stakeholder, and like other stakeholders prioritize engagement with upstream open source projects of all sizes. That engagement need not be so formal; most contributors to open source projects have no formal agreement covering that work aside from a grant of intellectual property in those contributions. But as they say, “history is made by those who show up.” If the IT staff of a Federal agency (or of a contractor under a Federal contract) were authorized and directed to contribute to the security team of a critical open source project, or to addressing known or potential security issues in important code, or to participating in an OpenSSF working group or project, that would almost certainly lead to identifying and prioritizing work that would result in enhanced security in the Federal government’s own use of open source code, and likely to upstream improvements that make OSS more secure for everyone else. 

Second, engage in OSS development and security work as a form of global capacity building, and in doing so, in global stability and resilience. OSS development is inherently international and has been since its earliest days. Our adversaries and global competitors use the same OSS that we do, by and large. When our operating systems, cloud containers, networking stacks and applications are made to be more secure, there are fewer chances for rogue actors to cause disruptions that make it harder to de-escalate tensions or protect the safety of innocent parties. Government agencies in France, Taiwan, and more have begun to establish funded offices focused on the adoption, development, and promotion of OSS, in many ways echoing the Open Source Program Offices being set up by companies like Home Depot and Walmart or intergovernmental agencies like the WHO. The State Department in recent years has funded the development of software like Tor to support the security needs of human rights workers and global activists. The Federal government could use its convening authority and statecraft to bring like-minded activities and investment together in a coordinated way more effectively than any of us in the private sector can. 

Third, many of the ideas for improving the security of OSS involve establishing services – services for issuing keys to developers as Project sigstore does, services for addressing the naming of software packages for SBOMs, services for collecting security reviews, or services providing a comprehensive view of the risk of open source packages. Wherever possible, the Federal government should avoid establishing such services itself when suitable instances are being built by the OSS community. Instead of owning or operating such services directly, the Federal government should provide grants or other resources to the operators of such services, as any major stakeholder would. Along similar lines, should the Federal government fund activities like third-party audits of an open source project, or fund fixes or improvements, it should ensure not only that such efforts don’t duplicate work already being done, but also that the results of that work are shared publicly and upstream, with a minimum of delay, so that everyone can benefit from that investment. 

These three approaches to collaboration would have an outsized impact on any of the specific efforts that the Federal government could undertake. 

  1. Where should Congress or the Administration focus efforts to best support and secure the open source software ecosystem as a whole? 

The private sector and the Federal government have a common cause in seeing broad improvements in the security of OSS. I’m happy to share where I see the private sector starting to invest in enhanced OSS security, in the hopes that this may inspire similar actions from others. 

  1. Education. Very few software developers ever receive a structured education in security fundamentals, and often must learn the hard way about how their work can be attacked. The OpenSSF’s Secure Software Fundamentals courses are well regarded and themselves licensed as open source software, which means educational institutions of all kinds could deliver the content. Enterprises could also start to require it of their own developers, especially those who touch or contribute to OSS. There must be other techniques for getting this content into more hands and certifications against it into more processes. 
  2. Metrics and benchmarks. There are plenty of efforts to determine suitably objective metrics for characterizing the risks of OSS packages. But running the cloud systems to perform that measurement across the top 100,000 or even 10,000 open source projects may cost more than any single company can provide for free, and may be fragile if provided by only a single vendor. Collective efforts funded by major stakeholders are being planned now, and governments would not be turned away as partners. 
  3. Digital signatures. There is a long history of U.S. Government standards for identity proofing, public key management, signature verification, and so on. These standards are very sophisticated, but in open source circles, often simplicity and support are more important. This is pulling the open source ecosystem towards Project sigstore for the signing of software artifacts. We would encourage organizations of all sorts to look at sigstore and consider it for their OSS needs, even if it may not be suitable for all identity use cases. 
  4. Research and development investments into memory-safe languages. As detailed above, there are opportunities to eliminate whole categories of defects for critical infrastructure software by investing in alternatives written in memory-safe languages. This work is being done, but grants and investments can help accelerate that work. 
  5. Fund third-party code reviews for top open source projects. Most OSS projects, even the most critical ones, never receive the benefit of a formal review by a team of security experts trained to look not only for small bugs that may lead to big compromises, but also for architectural issues and even problems with the features offered by the software. Such audits vary tremendously in cost based on the complexity of the code, but a review of an average-sized code base would run $150K to $250K. Covering the top 100 OSS projects with a review every other year, or even 200 every year, seems like a small price compared to the cost to US businesses of remedying or cleaning up after a breach caused by just one bug. 
  6. Invest into better supply chain security support in key build systems, package managers, and distribution sites. This is partly about seeing technologies like SBOMs, digital signatures, specifications like SLSA and others built into the most widely used dev tools so that they can be adopted and meaningfully used with a minimum of fuss. Any enterprise (including the Federal government) that has software certification processes based on the security attributes of software should consider how those tools could be enhanced with the above technologies, and automate many processes so that updates can be more frequent without sacrificing security. 

These activities, if done at sufficient scale, could dramatically lower the risks of future disruptive events like we have seen. As a portfolio of different investments and activities they are mutually reinforcing, and none of them in isolation is likely to have much of a positive impact. Further econometrics research could help quantify the specific reduction of risk from each activity. But I believe that each represents a very cost-effective target for enhancing security in OSS no matter who is writing the check. 

Thank you again for the opportunity to share these thoughts with you. I look forward to answering any questions you may have or providing you with further information. 


Brian Behlendorf
General Manager, Open Source Security Foundation
The Linux Foundation

I am old enough to remember when organizations developed software in-house – all of it. I also clearly remember my information systems college professor teaching that it is almost always less expensive and better to use code/programs already written and adapt them for your use than to reinvent the wheel from scratch. 

It is a different world now – software is built on a foundation of other programs, libraries, and code bases. Free and open source software (FOSS) is key to this because it is so easy to pick up, use, share, and create code. What an opportunity to speed development and focus innovation on the next thing rather than creating what already exists. This is part of the value of open source software – collaborate on the building blocks and innovate and differentiate on top of that. 

However, there are also challenges in this space, a good example being the question of how to address licensing. There are A LOT of types of licenses that can apply to a piece of software/code. Each license needs to be understood and tracked with each piece of software it is included in, for an organization to ensure nothing is missed. This can quickly multiply into a significant catalog that requires lots of manual work. On top of that, you also need to provide that license information to each of your customers, and each customer will have their own systems and/or processes for receiving that information and making sure it is up to date with each new version of the software. 

You can see where this can quickly consume valuable staff resources and open doors to mistakes. Imagine the possibility of a standard way to track and report the licenses so your teams don’t need to worry about all of the digital paperwork and can instead focus on innovation and adding value to you and your customers.
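To make the bookkeeping concrete, here is a minimal sketch (in Python, with entirely hypothetical product, component, and license names) of the kind of in-house inventory that teams maintain by hand today, and that a shared standard lets them exchange instead:

```python
# Illustrative sketch only: a naive in-house license inventory, the kind of
# manual bookkeeping a process standard like OpenChain ISO/IEC 5230 helps
# organizations manage consistently. All names below are hypothetical.

def license_report(products):
    """Flatten a {product: {component: license}} catalog into report rows."""
    rows = []
    for product, components in sorted(products.items()):
        for component, license_id in sorted(components.items()):
            rows.append((product, component, license_id))
    return rows

catalog = {
    "webapp-1.0": {"left-pad": "MIT", "libfoo": "Apache-2.0"},
    "webapp-1.1": {"left-pad": "MIT", "libfoo": "Apache-2.0",
                   "libbar": "GPL-2.0-only"},
}

# Every release repeats the exercise: two product versions already mean 5 rows,
# each of which must be kept accurate for every customer.
rows = license_report(catalog)
print(len(rows))  # 5
```

Multiply this by dozens of products, hundreds of components, and a customer base that each wants the data in its own format, and the appeal of one standardized process becomes obvious.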

This is exactly the problem a team of lawyers and governance experts sought to fix back in 2016 when they created the OpenChain Project. They asked: what are the key things for open source compliance that everyone needs, and how do we unify the systems and processes? They envisioned an internationally accepted standard to track and report all of the licenses applicable to a software project. The end result is a more trustworthy supply chain, where organizations don’t need to spend tons of time checking compliance again and again and then remediating. 

The result: an ISO standard (ISO/IEC 5230), approved in Q4 2020. The OpenChain Project also hosts a library of 1,000 different reference documents in a wide variety of languages – some are official and many more are community documents, like workflow examples, FAQs, etc.

How are organizations benefiting from OpenChain? I find it encouraging that Toyota is one of the leaders here. As anyone who has had at least one business class in college knows, Toyota has been a leader in manufacturing innovation for decades. In the 1970s they pioneered supply chain management techniques with the Toyota Production System (please tell me they had to do TPS reports) – adopted externally as Just in Time manufacturing. They are also known for adopting the philosophy of Kaizen, or continuous improvement. So, as they looked at how to manage software supply chains and all of the licensing, they adopted the OpenChain Specification. They implemented it, in part, with a governance structure and an official group to manage OSS risks and community contributions.

Toyota’s OSS governance structure

[Diagram: Toyota’s OSS governance structure – OSS Developer, Security Specialist, and IP Specialist roles within R&D, supporting the development of OSS culture and the handling of OSS risks]

They are also an active participant in the OpenChain Japan Working Group to help identify bottlenecks across the supply chain, and the group enabled Toyota to develop information sharing guidelines to address licensing challenges with Tier 1 suppliers. They now see reduced bottlenecks, more data for better decision making, and decreased patent and licensing risks. Read more.

PwC is a global auditing, assurance, tax, and consulting firm. As an auditor, much of their business revolves around building trust in society. They also develop software solutions for thousands of clients around the world and receive software from providers of all sizes and maturity levels, making OSS compliance difficult: verifying it was a tremendous effort that caused time delays for them and their clients. Now, PwC is able to provide clients with an Open Source Software compliance assessment based on the latest OpenChain specification. Their clients can share an internationally recognized PwC audit report to verify OSS compliance. Read more.

And just last month, SAP, a market leader in enterprise application software, announced they are adopting the OpenChain ISO/IEC 5230 standard. It marks the first time that an enterprise application software company has achieved whole-entity conformance. SAP’s reach across the global supply chain is massive – its customers are involved in almost 90% of global trade.

With the ISO/IEC standard complete, what is next for OpenChain? They are looking at security, export control, and more. 

If you or your organization are interested in learning more about OpenChain, adopting the standard, or getting involved in what is next, head over to the OpenChain Project website. We also host an online training course for when you are ready to dig in: Introduction to Open Source License Compliance Management

My hope is that you now spend less time on compliance and more time on innovation.

Key Elements of a Secure Software Supply Chain

Here at The Linux Foundation’s blog, we share content from our projects, such as this article from the Cloud Native Computing Foundation’s blog. The guest post was originally published on Contino Engineering’s blog by Dan Chernoff. 

Supply chain attacks rose by 42% in the first quarter of 2021 [1] and are becoming even more prevalent [2]. In response to software supply chain breaches like SolarWinds [3], Kaseya [4], and other less publicized compromises [5], the Biden administration issued an executive order that includes guidance designed to improve the federal government’s defense against cyber threats. With all of this comes the inevitable slew of blog posts that detail a software supply chain and how you would protect it. The Cloud Native Computing Foundation recently released a white paper regarding software supply chain security [7], an excellent summary of the current best practices for securing your software supply chain.

The genesis for the content in this article is work done to implement secure supply chain patterns and practices for a Contino customer. The core goals for the effort were to implement a pipeline-agnostic solution that ensures the security of the pipelines and enables secure delivery for the enterprise. We’ll talk a little about why we chose the tools we did in each section and how they supported the end goal.

As we start our journey, we’ll first touch on what a secure software supply chain is and why you should have one to set the context for the rest of the blog post. But let’s assume that you have already decided that your software supply chains need to be secure, and you want to implement the capability for your enterprise. So let’s get into it!

Anteing Up

Before you embark upon the quest of establishing provenance for your software at scale, there are some table stakes elements that teams should already have in place. We won’t delve deeply into any of them here other than to list and briefly describe them.

Centralized Source Control. Git is by far the most popular choice. This ensures a single source of truth for development teams. Beyond just having source control, teams should also implement the signing of their Git commits.

Static Code Analysis. This identifies possible vulnerabilities within ‘static’ (non-running) source code by using techniques such as Taint Analysis and Data Flow Analysis. Analysis and results need to be incorporated into the cadence of development.

Vulnerability Scanning. Implement automated tools that scan the applications and containers that are built to identify potential vulnerabilities in the compiled and sometimes running applications.

Linting. A linter is a tool that analyzes source code to flag programming errors, bugs, and stylistic errors. Linting is important to reduce errors and improve the overall code quality, which in turn accelerates development.

CI/CD Pipelines. New code changes are automatically built, tested, versioned, and delivered to an artifact repository. A pipeline then automatically deploys the updated applications into your environments (e.g. test, staging, production, etc.).

Artifact Repositories. Provide management of the artifacts built by your CI/CD systems. An artifact repository can help with the version and access control of your artifacts.

Infrastructure as Code (IaC) is the process of managing and provisioning infrastructure (e.g. virtual machines, databases, load balancers, etc.) through code. As with applications, IaC provides a single source of truth for what the infrastructure should look like. It also provides the ability to test before deploying to production.

Automated…well, everything. Human-in-the-loop systems are not deterministic. They are prone to error which can and will cause outages and security gaps. Manual systems also inhibit the ability of platforms to scale quickly.

What is a Secure Software Supply Chain

A software supply chain consists of anything that goes into the creation of your end software product and the mechanisms you use to deliver the product to customers. This includes things like your source code, your build systems, third-party libraries, deployment infrastructure, and delivery repositories. A secure software supply chain is one that:

  • Establishes Provenance — One part of establishing provenance is ensuring that any artifact that is created and accessed by the customer should be able to trace its lineage all the way back to the developer(s) that merged the latest commit. The other part is the ability to demonstrate (or attest) that for each step in the process, the software, components, and other materials that go into creating the final product are tamper-free.
  • Trust — Downstream systems and users need a mechanism to verify that the software that is being installed or deployed came from your systems and that the version being used is the correct version. This ensures that malicious artifacts have not been substituted or that older, vulnerable versions have not been relabeled as the current version.
  • Transparent — It should be easy to see the results and details for all steps that go into the creation of the final artifact. This can include things like test results, output from vulnerability scans, etc.
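As a rough illustration of the trust property, the sketch below checks a downloaded artifact's digest and version against a producer-signed manifest. It uses a shared-key HMAC from the Python standard library purely for brevity; real systems such as TUF/Notary (discussed later) use asymmetric keys and a full trust hierarchy, and every name here is illustrative.

```python
import hashlib, hmac, json

PRODUCER_KEY = b"demo-shared-key"  # toy stand-in for a real signing key

def sign_manifest(name, version, artifact: bytes):
    """Producer side: record the artifact digest and version, then sign it."""
    manifest = {"name": name, "version": version,
                "sha256": hashlib.sha256(artifact).hexdigest()}
    payload = json.dumps(manifest, sort_keys=True).encode()
    return manifest, hmac.new(PRODUCER_KEY, payload, "sha256").hexdigest()

def verify(manifest, signature, artifact: bytes, expected_version):
    """Consumer side: check signature, digest, and version before install."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    expected_sig = hmac.new(PRODUCER_KEY, payload, "sha256").hexdigest()
    return (hmac.compare_digest(signature, expected_sig)
            and manifest["sha256"] == hashlib.sha256(artifact).hexdigest()
            and manifest["version"] == expected_version)  # blocks relabeled old versions

artifact = b"container image bytes"
manifest, sig = sign_manifest("webapp", "1.1", artifact)
print(verify(manifest, sig, artifact, "1.1"))     # True
print(verify(manifest, sig, b"tampered", "1.1"))  # False: substituted artifact
print(verify(manifest, sig, artifact, "1.0"))     # False: wrong version
```

The version check in the last line is the interesting part: even a perfectly valid, signed older artifact is rejected when it is presented as the current version.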

Key Elements of a Secure Software Supply Chain

Let’s take a closer look at the things that need to be layered into your pipelines to establish provenance, enable transparency, and ensure tamper resistance.

Here is what a typical pipeline might look like that creates a containerized application. We’ll use this simple pipeline and add elements as we discuss them.


Establishing Provenance Using in-toto

The first step in our journey is to establish that artifacts built via a pipeline have not been tampered with and to do so in a reliable and repeatable way. As we mentioned earlier, part of this is creating evidence to use as part of the verification. in-toto is an open-source tool that creates a snapshot of the workspace where the pipeline step is running.

These snapshots (“link files” in in-toto terminology) verify the integrity of the pipeline. The core idea behind in-toto is the concept of materials and products and how they flow, just like in a factory. Each step in the process usually has some material from which it will create its product. An example of the flow of materials and products is the build step. The build step uses code as the material, and the built artifact (jar, war, etc.) is the product. A later step in the pipeline will use the built artifact as the material and produce another product. In this way, in-toto allows you to chain the materials and products together and identify whether a material has been tampered with during or between pipeline steps – for example, whether the artifact constructed during the build step changed before testing.
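The material/product chaining can be sketched in a few lines of Python. To be clear, this is not the real in-toto API, just an illustration of the idea: each step records content hashes of what it consumed and produced, and verification checks that one step's products match the next step's materials.

```python
import hashlib

def snapshot(files: dict) -> dict:
    """Hash each file's contents, the way an in-toto link records materials/products."""
    return {name: hashlib.sha256(data).hexdigest() for name, data in files.items()}

# Build step: source code is the material, the built artifact is the product.
build_link = {"materials": snapshot({"app.java": b"class App {}"}),
              "products": snapshot({"app.jar": b"jar-bytes"})}

# Test step: the built artifact is now the material.
test_link = {"materials": snapshot({"app.jar": b"jar-bytes"})}

# Verification chains the steps: the test step's material must match the
# build step's product, or the artifact was altered between build and test.
tampered = build_link["products"]["app.jar"] != test_link["materials"]["app.jar"]
print(tampered)  # False: the artifact was not altered between build and test
```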


At the end of the pipeline, in-toto evaluates the link data (the attestation created at each step) against an in-toto layout (think of it as a Jenkinsfile for attestation) and verifies that all the steps were done correctly and by the approved people or systems. This verification can run anytime the product of the pipeline (container, war, etc.) needs to be verified.

Critical takeaways for establishing provenance

in-toto runs at every step of the process. The attestation compares to an overarching layout during verification. This process enables consumers (users and/or deployment systems) to have confidence that the artifacts built were not altered from start to finish.

Establishing Trust using TUF

You can use in-toto verification to know that the artifact was delivered or downloaded without modification. To do that, you will need to download the artifact(s), the in-toto link files used during the build, the in-toto layout, and the public keys to verify it all. That is a lot of work. An easier way is to sign the artifacts produced with a system that enables centralized trust. The most mature framework for doing so is TUF (The Update Framework).

TUF is a framework that gives consumers of artifacts guarantees that the artifact downloaded or automatically installed came from your systems and is the correct version. The guts of how to accomplish this are outside the scope of this blog post. The functionality we are interested in is verifying that an artifact came from the producer we expected and that the version is the expected version.

Implementing TUF on your own is a fair bit of work. Fortunately, an “out of the box” implementation of TUF is available for use, Docker Content Trust (a.k.a. Notary). Notary enables the signing of regular files as well as containers. In our example pipeline, we sign the container image during build time. This signing allows any downstream system or user to verify the authenticity of the container.


Transparency: Centralized Data Storage

One gap in in-toto as a solution is that it provides no mechanism to persist the link data it creates. It is up to the team implementing in-toto to capture and store the link data somewhere. All the valuable metadata for each step can be captured and stored outside of the build system. The goal is twofold: the first is to store the link data outside the pipeline so that teams can retrieve it anytime verification needs to run on the artifacts produced from the pipeline. The second is to store the metadata around the build process outside the pipeline, enabling teams to implement visualizations, monitoring, metrics, and rules on the data produced from the pipeline without necessarily needing to keep it in the pipeline.

The Contino team created metadata capture tooling that is independent and agnostic of the pipeline. We chose to write a simple Python tool that captures the metadata and in-toto data and stores them in a database. If the CI/CD platform is reasonably standard, you can likely use built-in mechanisms to achieve the same results. For example, the Jenkins Logstash plugin can capture the output of a build step and persist the data to an Elastic datastore.
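A minimal version of such a capture step might look like the following. This is an illustrative sketch using Python's built-in sqlite3 as the datastore; the schema and names are hypothetical, not Contino's actual tool.

```python
import json, sqlite3, datetime

# Persist each pipeline step's metadata (including its in-toto link data)
# outside the CI system, so it survives the pipeline and can be queried later.
db = sqlite3.connect(":memory:")  # a real deployment would use a shared database
db.execute("""CREATE TABLE step_metadata (
                  pipeline TEXT, step TEXT, recorded_at TEXT, link_data TEXT)""")

def capture(pipeline, step, link_data: dict):
    """Record one step's link data with a timestamp, independent of the CI tool."""
    db.execute("INSERT INTO step_metadata VALUES (?, ?, ?, ?)",
               (pipeline, step,
                datetime.datetime.now(datetime.timezone.utc).isoformat(),
                json.dumps(link_data)))
    db.commit()

capture("webapp", "build", {"products": {"app.jar": "sha256:abc..."}})
row = db.execute("SELECT pipeline, step FROM step_metadata").fetchone()
print(row)  # ('webapp', 'build')
```

Because the store is outside the pipeline, the same rows can feed later artifact verification as well as dashboards and metrics.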


PGP and Signing Keys

A core component of both in-toto and Notary is the keys used to sign and verify link data and artifacts/containers. in-toto uses PGP private keys to sign the link data produced at each step. That signing ties the person or system that performed the action to the link data. It also ensures that any alteration or tampering of the link data can be easily detected.

Notary uses public and private keys generated using the Docker or Notary CLI. The public keys get stored in the notary database. The private keys sign the containers or other artifacts.

Scaling Up

For a small set of pipelines, manually implementing and managing secure software supply chain practices is straightforward. An enterprise with hundreds, if not thousands, of pipelines requires some additional automation.

Automate in-toto layout creation. As mentioned earlier, in-toto has a file akin to a Jenkins file that dictates what person or systems can complete a pipeline step, the material and product flow, and how to inspect/verify the final artifact(s). Embedded in this layout are the IDs for the PGP keys of the people or systems who can perform steps. Additionally, the layout is internally signed to ensure that any tampering can be detected once the layout gets created. To manage this at scale, the layouts need to be automatically created/re-created on demand. We approach this as a pipeline that automatically runs on changes to the code that creates layouts. The output of the pipeline is layouts, which are treated as artifacts themselves.

Treat in-toto layouts like artifacts. in-toto layouts are artifacts, just like containers, jars, etc. Layouts should be versioned, and the layout version linked to the version of the artifact. This versioning enables artifacts to be re-verified later with the layout, link files, and keys that were relevant at artifact creation time.

Automate the creation of the signing keys. Signing keys used by autonomous systems should be rotated frequently and through automation. Doing this limits the likelihood of compromise of the signing keys used by in-toto and Notary. For in-toto, this frequent rotation will require the automatic re-creation of the in-toto layouts. For Notary, cycling the signing keys will require revoking the old key when the new key is put in place.
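One way to sketch that rotation policy in code (illustrative bookkeeping only; this is not the in-toto or Notary key-management API, and real deployments would generate asymmetric key pairs rather than random tokens):

```python
import secrets, datetime

MAX_AGE = datetime.timedelta(days=30)  # hypothetical rotation interval

class KeyRing:
    """Track one active signing key, rotating and revoking on a schedule."""
    def __init__(self):
        self.active = None
        self.revoked = []
        self.rotate(datetime.datetime(2022, 1, 1))

    def rotate(self, now):
        if self.active:
            self.revoked.append(self.active["id"])  # old key must be revoked
        self.active = {"id": secrets.token_hex(8), "created": now}

    def signing_key(self, now):
        if now - self.active["created"] > MAX_AGE:  # rotate automatically
            self.rotate(now)
        return self.active["id"]

ring = KeyRing()
first = ring.signing_key(datetime.datetime(2022, 1, 15))  # within 30 days
later = ring.signing_key(datetime.datetime(2022, 3, 1))   # forces a rotation
print(first == later, len(ring.revoked))  # False 1
```

The point of the sketch is the invariant: any automated signer that outlives its interval gets a new key, and the old one lands on a revocation list that verifiers can consult.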

Store and use signing keys from a secret store. When generating signing keys for use by automated systems (e.g., Jenkins, GitLab CI, etc.), storing the keys in a secret management system like HashiCorp’s Vault is an important practice. The automated system can then retrieve the signing keys when needed. Centrally storing the signing keys combats “secrets sprawl” in an enterprise and enables easier management.

Pipelines should be roughly similar. A single in-toto layout can be used by many pipelines, as long as they operate in the same way. For example, pipelines that build a Java application and produce a WAR as the artifact probably operate in roughly the same way. These pipelines can all use the same layout if they are similar enough.

Wrapping it All Up

Using the technologies, patterns, and practices described here, the Contino team was able to deliver an MVP-grade solution for the enterprise. The design will be able to scale up to thousands of application pipelines and help ensure software supply chain security for the enterprise.

At its core, a secure software supply chain encompasses anything that goes into building and delivering an application to the end customer. It is built on the foundations of secure software development practices (e.g. following OWASP top 10, SAST, etc.). Any implementation of secure supply chain best practices needs to establish provenance about all aspects of the build process, provide transparency for all steps and create mechanisms that ensure trustworthy delivery.

Brian Behlendorf

As someone who has spent their entire career in open source software (OSS), the Log4Shell scramble (an industry-wide four-alarm fire to address a serious vulnerability in the Apache Log4j package) is a humbling reminder of just how far we still have to go. OSS is now central to the functioning of modern society, as critical as highway bridges, bank payment platforms, and cell phone networks, and it’s time OSS foundations started to act like it.

Organizations like the Apache Software Foundation, the Linux Foundation, the Python Foundation, and many more, provide legal, infrastructural, marketing and other services for their communities of OSS developers. In many cases the security efforts at these organizations are under-resourced and hamstrung in their ability to set standards and requirements that would mitigate the chances of major vulnerabilities, for fear of scaring off new contributors. Too many organizations have failed to apply raised funds or set process standards to improve their security practices, and have unwisely tilted in favor of quantity over quality of code.

What would “acting like it” look like? Here are a few things that OSS foundations can do to mitigate security risks:

  1. Set up an organization-wide security team to receive and triage vulnerability reports, as well as coordinate responses and disclosures to other affected projects and organizations.
  2. Perform frequent security scans, through CI tooling, for detecting unknown vulnerabilities in the software and recognizing known vulnerabilities in dependencies.
  3. Perform occasional outside security audits of critical code, particularly before new major releases.
  4. Require projects to use test frameworks, and ensure high code coverage, so that features without tests are discouraged and underused features are weeded out proactively.
  5. Require projects to remove deprecated or vulnerable dependencies. (Some Apache projects are not vulnerable to the Log4j v2 CVE, because they are still shipping with Log4j v1, which has known weaknesses and has not received an update since 2015!)
  6. Encourage, and then eventually require, the use of SBOM formats like SPDX to help everyone track dependencies more easily and quickly, so that vulnerabilities are easier to find and fix.
  7. Encourage, and then eventually require, maintainers to demonstrate familiarity with the basics of secure software development practices.
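To show how lightweight the starting point for item 6 can be, here is a sketch that emits a minimal SPDX-style tag-value fragment. The field names follow SPDX 2.x tag-value conventions, but the output is illustrative rather than a complete, validated SPDX document, and the package names are examples.

```python
def spdx_tag_value(doc_name, packages):
    """Emit a minimal SPDX 2.x tag-value fragment (illustrative, not validated)."""
    lines = ["SPDXVersion: SPDX-2.2",
             "DataLicense: CC0-1.0",
             f"DocumentName: {doc_name}"]
    for name, version, license_id in packages:
        lines += ["",
                  f"PackageName: {name}",
                  f"PackageVersion: {version}",
                  f"PackageLicenseDeclared: {license_id}"]
    return "\n".join(lines)

# A one-package SBOM: enough to answer "do we ship log4j-core, and which version?"
sbom = spdx_tag_value("webapp-1.1", [("log4j-core", "2.17.0", "Apache-2.0")])
print("PackageName: log4j-core" in sbom)  # True
```

Even a fragment this small is what makes the Log4Shell-style question ("which of our products contain this dependency?") answerable in minutes rather than weeks.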

Many of these are incorporated into the CII Best Practices badge, one of the first attempts to codify these into an objective comparable metric, and an effort that has now moved to OpenSSF. The OpenSSF has also published a free course for developers on how to develop secure software, and SPDX has recently been published as an ISO standard.

None of the above practices is about paying developers more, or channeling funds directly from users of software to developers. Don’t get me wrong, open source developers and the people who support them should be paid more and appreciated more in general. However, it would be an insult to most maintainers to suggest that if you’d just slipped more money into their pockets they would have written more secure code. At the same time, it’s fair to say a tragedy-of-the-commons hits when every downstream user assumes that these practices are in place, being done and paid for by someone else.

Applying these security practices and providing the resources required to address them is what foundations are increasingly expected to do for their community. Foundations should begin to establish security-related requirements for their hosted and mature projects. They should fundraise from stakeholders the resources required for regular paid audits for their most critical projects, scanning tools and CI for all their projects, and have at least a few paid staff members on a cross-project security team so that time-critical responses aren’t left to individual volunteers. In the long term, foundations should consider providing resources to move critical projects or segments of code to memory-safe languages, or fund bounties for more tests.

Let’s be clear: the Apache Software Foundation seems to have much of this right. Despite being notified just before the Thanksgiving holiday, their volunteer security team worked with the Log4j maintainers and responded quickly. Log4j also has almost 8,000 passing tests in its CI pipeline, but even all that testing didn’t catch the way this vulnerability could be exploited. And in general, Apache projects are not required to have test coverage at all, let alone run the kind of SAST security scans or host third-party audits that might have caught this.

Many other foundations, including those hosted at the Linux Foundation, also struggle to do all this — it is not easy to push these practices through the laissez-faire philosophy that many foundations have regarding code quality, and third-party code audits and tests don’t come cheap. But for the sake of sustainability, reducing the impact on the broader community, and being more resilient, we have got to do better. And we’ve got to do this together, as a crisis of confidence in OSS affects us all.

This is where OpenSSF comes in, and what pulled me to the project in the first place. In the new year you’ll see us announce a set of new initiatives that build on the work we’ve been doing to “raise the floor” for security in the open source community. The only way we do this effectively is to develop tools, guidance, and standards that make adoption by the open source community encouraged and practical rather than burdensome or bureaucratic. We will be working with and making grants to other open source projects and foundations to help them improve their security game. If you want to stay close to what we’re doing, follow us on Twitter or get involved in other ways. For a taste of where we’ve been to date, read our segment in the Linux Foundation Annual Report, or watch our most recent Town Hall.

Hoping for a 2022 with fewer four-alarm fires,


Brian Behlendorf is General Manager of the Linux Foundation’s Open Source Security Foundation (OpenSSF). He was a founding member of the Apache Group, which later became the Apache Software Foundation, and served as president of the foundation for three years.

Backed by many of the world’s largest companies for more than a decade, SPDX formally becomes an internationally recognized ISO/IEC JTC 1 standard during a transformational time for software and supply chain security

SAN FRANCISCO, September 9, 2021 – The Linux Foundation, the Joint Development Foundation, and the SPDX community today announced that the Software Package Data Exchange® (SPDX®) specification has been published as ISO/IEC 5962:2021 and is recognized as the international open standard for security, license compliance, and other software supply chain artifacts. ISO/IEC JTC 1 is an independent, non-governmental standards body.

Intel, Microsoft, Siemens, Sony, Synopsys, VMware, and Wind River are just a small sample of the companies already using SPDX to communicate Software Bill of Materials (SBOM) information in policies or tools to ensure compliant, secure development across global software supply chains.

“SPDX plays an important role in building more trust and transparency in how software is created, distributed, and consumed throughout supply chains. The transition from a de-facto industry standard to a formal ISO/IEC JTC 1 standard positions SPDX for dramatically increased adoption in the global arena,” said Jim Zemlin, executive director, the Linux Foundation. “SPDX is now perfectly positioned to support international requirements for software security and integrity across the supply chain.” 

Between eighty and ninety percent (80%-90%) of a modern application is assembled from open source software components. An SBOM accounts for the software components contained in an application — open source, proprietary, or third-party — and details their provenance, license, and security attributes. SBOMs are used as a part of a foundational practice to track and trace components across software supply chains. SBOMs also help to proactively identify software issues and risks and establish a starting point for their remediation.
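For illustration, a minimal SBOM entry in SPDX tag-value form might look like the following sketch (the package name, version, and supplier here are hypothetical):

```text
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
DocumentName: example-app-sbom
PackageName: example-lib
PackageVersion: 1.4.2
PackageSupplier: Organization: Example Corp
PackageLicenseConcluded: Apache-2.0
```

Each field answers a provenance, license, or identity question about one component in the application.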

SPDX results from ten years of collaboration from representatives across industries, including the leading Software Composition Analysis (SCA) vendors – making it the most robust, mature, and adopted SBOM standard. 

“As new use cases have emerged in the software supply chain over the last decade, the SPDX community has demonstrated its ability to evolve and extend the standard to meet the latest requirements. This really represents the power of collaboration on work that benefits all industries,” said Kate Stewart, SPDX tech team co-lead. “SPDX will continue to evolve with open community input, and we invite everyone, including those with new use cases, to participate in SPDX’s evolution and securing the software supply chain.”  

For more information on how to participate in and benefit from SPDX, please visit:

To learn more about how companies and open source projects are using SPDX, recordings from the “Building Cybersecurity into the Software Supply Chain” Town Hall that was held on August 18th are available and can be viewed at: 

ISO/IEC JTC 1 is an independent, non-governmental international organization based in Geneva, Switzerland. Its membership represents more than 165 national standards bodies with experts who share knowledge and develop voluntary, consensus-based, market-relevant international standards that support innovation and provide solutions to global challenges.

Supporting Comments


“Software security and trust are critical to our Industry’s success. Intel has been an early participant in the development of the SPDX specification and utilizes SPDX both internally and externally for a number of software use-cases,” said Melissa Evers, Vice President – Software and Advanced Technology Group, General Manager of Strategy to Execution, Intel.


“Microsoft has adopted SPDX as our SBOM format of choice for software we produce,” says Adrian Diglio, Principal Program Manager of Software Supply Chain Security at Microsoft. “SPDX SBOMs make it easy to produce U.S. Presidential Executive Order compliant SBOMs, and the direction that SPDX is taking with the design of their next gen schema will help further improve the security of the software supply chain.”


“With ISO/IEC 5962:2021 we have the first official standard for metadata of software packages. It’s natural that SPDX is that standard, as it’s been the de facto standard for a decade. This will make license compliance in the supply chain much easier, especially because several open source tools like FOSSology, ORT, scancode, and sw360 already support SPDX,” said Oliver Fendt, senior manager, open source at Siemens. 


“The Sony team uses various approaches to managing open source compliance and governance,” says Hisashi Tamai, Senior Vice President, Deputy President of R&D Center, Representative of the Software Strategy Committee, Sony Group Corporation. “An example is the use of an OSS management template sheet that is based on SPDX Lite, a compact subset of the SPDX standard. It is important for teams to be able to quickly review the type, version, and requirements of software, and using a clear standard is a key part of this process.”


“The Black Duck team from Synopsys has been involved with SPDX since its inception, and I personally had the pleasure of coordinating the activities of the project’s leadership for more than a decade. Representatives from scores of companies have contributed to the important work of developing a standard way of describing and communicating the content of a software package,” said Phil Odence, General Manager, Black Duck Audits.


“SPDX is the essential common thread among tools under the Automating Compliance Tooling (ACT) Umbrella. SPDX enables tools written in different languages and for different software targets to achieve coherence and interoperability around SBOM production and consumption. SPDX is not just for compliance, either; the well-defined and ever-evolving spec is also able to represent security and supply chain implications. This is incredibly important for the growing community of SBOM tools as they aim to thoroughly represent the intricacies of modern software,” said Rose Judge, ACT TAC Chair and open source engineer at VMware.

Wind River

“The SPDX format greatly facilitates the sharing of software component data across the supply chain. Wind River has been providing a Software Bill of Materials (SBOM) to its customers using the SPDX format for the past 8 years. Often customers will request SBOM data in a custom format. Standardizing on SPDX has enabled us to deliver a higher quality SBOM at a lower cost,” said Mark Gisi, Wind River Open Source Program Office Director and OpenChain Specification Chair.

About SPDX

SPDX is an open standard for communicating software bill of materials information, including provenance, license, security, and other related information. SPDX reduces redundant work by providing common formats for organizations and communities to share important data, thereby streamlining and improving compliance, security, and dependability. For more information, please visit us at


The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: Linux is a registered trademark of Linus Torvalds.

Media Contact

Jennifer Cloer

for the Linux Foundation


Open source software (OSS) is vitally important to the functioning of society today; it underpins much of the global economy. However, while some OSS is highly secure, other OSS is not as secure as it needs to be.

By its very nature, open source enables worldwide peer review, yet while its transparency has the potential for enhanced software security, that potential isn’t always realized. Many people are working to improve things where it’s needed. Most of that work is done by volunteers or organizations outside the Linux Foundation (LF) who directly pay people to do the work (typically as employees). Often those people work together within a foundation that’s part of the Linux Foundation. Sometimes, however, the LF or an LF foundation/project (e.g., a fund) directly funds people to do security work.

At the Linux Foundation (LF), I have the privilege of overseeing focused work to improve OSS security by people paid to do exactly that. This work is funded through various grants and foundations, thanks to organizations including Google, Microsoft, the Open Source Security Foundation (OpenSSF), the LF Public Health foundation, and the LF itself.

The LF and its foundations do much more that I don’t oversee, so I’ve only listed the ones I am personally involved with in the interest of brevity. I hope it will give you a sense of some of the things we’re doing that you might not know about otherwise.

The typical LF oversight process for this work is described in “Post-Approval LF Security Funding.” Generally, performers must provide a periodic summary of their work so they can get paid. Most of those summaries are public, and in those cases, it’s easy for others to learn about their interesting work!

Here’s a sample of the work I oversee:

  • Ariadne Conill is improving Alpine Linux security, including significant improvements to its vulnerability processing and making it reproducible. For example, as noted in the July 2021 report, this resulted in Alpine 3.14 being released with the lowest open vulnerability count at final release in a long time. Alpine Linux’s security is important because many containers use it. For more information, see “Bits relating to Alpine security initiatives in June” and “Bits relating to Alpine security initiatives in July.”
  • kpcyrd is doing a lot of reproducible build work on Linux distributions, especially Alpine Linux (including on the Raspberry Pi) and Arch Linux. Reproducible builds are a strong countermeasure against build system attacks (such as the devastating attack on SolarWinds Orion). More than half of the currently unreproducible packages in Arch Linux have now been reviewed and classified.
  • David Huseby has been working on modifying git to have a much more flexible cryptographic signing infrastructure. This will make it easier to verify the integrity of software source code; git is widely used to manage source code.
  • Theo de Raadt has also been receiving funding to secure the critical “plumbing” behind modern communications infrastructure:
    • This funding is being used to improve OpenSSH (a widely used tool whose security is critical). Improvements include various smaller fixes, an updated configuration file parser, and a transition to using the SFTP protocol rather than the older RCP protocol inside the scp(1) program.
    • It is also being used to improve rpki-client, implementing Resource Public Key Infrastructure (RPKI). RPKI is an important protocol for protecting the Internet’s routing protocols from attack. These improvements implement the RPKI Repository Delta Protocol (RRDP) data transfer protocol and fix various edge cases (e.g., through additional validation checks). The service is even using rpki-client behind the scenes.
  • Nathan Chancellor is improving the Linux kernel’s ability to be compiled with clang (instead of just gcc). This includes eliminating warning messages from clang (which helps to reduce kernel bugs even when gcc is used) and fixing/extending the clang compiler (which helps clang users when compiling code other than the Linux kernel). Unsurprisingly this involves changing both the Linux kernel and the clang/LLVM compiler infrastructure, and sometimes other software as well.
    • In the long run, eliminating warnings that by themselves aren’t bugs is important; developers will ignore warnings if there are many irrelevant ones, but if there are only a few warnings, they’ll examine them (making warnings more useful).
    • Especially notable for its security implications is clang support for Control-Flow Integrity (CFI); this can counter many attacks on arm64, and ongoing work will eventually enable x86_64 support.
  • I oversee some security audits conducted via the Open Source Technology Improvement Fund (OSTIF) when funded through the LF. We (the LF) often work with OSTIF to conduct security audits. We work with OSTIF to define the audit scope, and then OSTIF runs a bidding process where qualified security audit firms propose to do the work. We then work with OSTIF to select the winner (who isn’t always the cheapest — we want good work, not a box-check). OSTIF & I then oversee the process and review the final result. 
    • Note that we don’t just want to do audits; we also want to fix or mitigate any critical issues the audits identify, but the audits help us find the key problems. Subject-matter experts write the audit reports, and OSTIF’s primary focus is handling the bidding, so my main contribution is usually to help ensure these reports are clear to non-experts while still being accurate. Experts sometimes forget to explain their context and jargon, and it’s sometimes hard to fix that (you must know the terminology & technology to explain it).
    • This work included two security audits related to the Linux kernel, one for signing and key management policies and the other for vulnerability reporting and remediation. 
    • I’ve also overseen audits of the exposure notification applications COVID Shield and COVID Green: 
    • It’s not part of my oversight of OSTIF on behalf of the LF, but I also informally talk with OSTIF about other OSS they’re auditing (such as flux2, lodash, jackson-core, jackson-databind, httpcomponents-core, httpcomponents-client, laravel, and slf4j). A little coordination and advice-sharing among experts can make everything better.
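The core check behind the reproducible-build work described above can be sketched in a few lines: build the same inputs twice, independently, and compare cryptographic digests. The `build` function below is a deterministic stand-in for a real package build, not any actual distribution tooling:

```python
import hashlib

def build(source: bytes) -> bytes:
    # Stand-in for a real package build. A reproducible build must be a pure
    # function of its inputs: no embedded timestamps, build paths, or other
    # nondeterminism.
    return b"compiled:" + source

def digest(artifact: bytes) -> str:
    # SHA-256 digest of the built artifact, for bit-for-bit comparison.
    return hashlib.sha256(artifact).hexdigest()

# Two independent builds of identical inputs must match exactly, so anyone
# can rebuild a package and detect a compromised build system.
first = digest(build(b"pkg-1.0.tar"))
second = digest(build(b"pkg-1.0.tar"))
print("reproducible" if first == second else "NOT reproducible")
```

If any attacker-controlled step altered one build, the digests would diverge, which is exactly why reproducibility counters build-system attacks like the one on SolarWinds Orion.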

The future is hard to predict, but we anticipate that we will be doing more. In late July, the OpenSSF Technical Advisory Council (TAC) recommended approving funding for a security audit of (part of) Symfony, a widely-used web framework. The OpenSSF Governing Board (GB) approved this on 2021-08-05 and I expect OSTIF will soon take bids on it.

The OpenSSF is also taking steps to raise more money via membership dues (this was delayed due to COVID; starting a new foundation is harder during a pandemic). Once the OpenSSF has more money, we expect they’ll be funding a lot more work to identify critical projects, do security audits, fix problems, and improve or create projects to enhance OSS security. The future looks bright.

Please remember that this is only a small part of ongoing work to improve OSS security. Almost all LF projects need to be secure, so most foundations’ projects include security efforts not listed here. As noted earlier, most development work is done by volunteers or by non-LF organizations directly paying people to do the work (typically employees). 

The OpenSSF has several working groups and many projects where people are working together to improve OSS security. These include free courses on how to develop secure software and the CII Best Practices badge project. We (at the LF) also have many other projects working to improve OSS security. For example, sigstore is making cryptographic signatures much easier; sigstore’s “cosign” tool just released its version 1.0. Many organizations have recently become interested in software bills of materials (SBOMs), and we’ve been working on SBOMs for a long time.

If you or your organization would like to fund focused work on improving OSS security, please reach out! You can contribute to the OpenSSF (in general or as a directed fund); just contact them (e.g., Microsoft contributed to OpenSSF in December 2020). If you’d prefer, you can create a grant directly with the Linux Foundation itself — just email me at <> if you have questions. For smaller amounts, say to fund a specific project, you can also consider using the LFX crowdfunding tools to fund or request funding. Many people & organizations struggle to pay individual OSS developers because of the need to handle taxes and oversight. If that’s your concern, talk to us. The LF has experience & processes to do all that, letting experts focus on getting the work done.

My sincere thanks to all the performers for their important work and to all the funders for their confidence in us!

About the author: David A. Wheeler is Director of Open Source Supply Chain Security for The Linux Foundation.

The National Telecommunications and Information Administration (NTIA) recently asked for wide-ranging feedback to define a minimum Software Bill of Materials (SBOM). It was framed with a single, simple question (“What is an SBOM?”) and constituted an incredibly important step toward software security and a significant moment for open standards.

From NTIA’s SBOM FAQ: “A Software Bill of Materials (SBOM) is a complete, formally structured list of components, libraries, and modules that are required to build (i.e. compile and link) a given piece of software and the supply chain relationships between them. These components can be open source or proprietary, free or paid, and widely available or restricted access.” SBOMs that can be shared without friction between teams and companies will be a core part of software management for critical industries and digital infrastructure in the coming decades.

The ISO International Standard for open source license compliance (ISO/IEC 5230:2020 – Information technology — OpenChain Specification) requires a process for managing a bill of materials for supplied software. This aligns with the NTIA goals for increased software transparency and illustrates how the global industry is addressing challenges in this space. For example, it has become a best practice to include an SBOM for all components in supplied software, rather than isolating these materials to open source.

The open source community identified the need for, and began to address, the SBOM “list of ingredients” challenge over a decade ago. The de facto industry standard, and the most widely used approach today, is the Software Package Data Exchange (SPDX). All of the elements in the NTIA’s proposed minimum SBOM definition can be addressed by SPDX today, along with broader use cases.

SPDX evolved organically over the last decade to suit the software industry, covering issues like license compliance, security, and more. The community consists of hundreds of people from hundreds of companies, and the standard itself is the most robust, mature, and widely adopted SBOM standard in the market today.

The full SPDX specification is only one part of the picture. Optional components such as SPDX Lite, developed by Pioneer, Sony, Hitachi, Renesas, and Fujitsu, among others, provide a focused SBOM subset for smaller suppliers. The community approach behind SPDX allows practical use cases to be addressed as they arise.

In 2020, SPDX was submitted to ISO via the PAS Transposition process of Joint Technical Committee 1 (JTC1) in collaboration with the Joint Development Foundation. It is currently in the approval phase of the transposition process and can be reviewed on the ISO website as ISO/IEC PRF 5962.

The Linux Foundation has prepared a submission for NTIA highlighting knowledge and experience gained from practical deployment and usage of SBOM in the SPDX and OpenChain communities. These include isolating the utility of specific actions such as tracking timestamps and including data licenses in metadata. With the backing of many parties across the worldwide technology industry, the SPDX and OpenChain specifications are constantly evolving to support all stakeholders.

Industry Comments

The Sony team uses various approaches to managing open source compliance and governance… An example is using an OSS management template sheet based on SPDX Lite, a compact subset of the SPDX standard. Teams need to be able to review the type, version, and requirements of software quickly, and using a clear standard is a key part of this process.

Hisashi Tamai, SVP, Sony Group Corporation, Representative of the Software Strategy Committee

“Intel has been an early participant in the development of the SPDX specification and utilizes SPDX, as well as other approaches, both internally and externally for a number of open source software use-cases.”

Melissa Evers, Vice President – Intel Architecture, Graphics, Software / General Manager – Software Business Strategy

Scania corporate standard 4589 (STD 4589) was just made available to our suppliers and defines the expectations we have when Open Source is part of a delivery to Scania. So what is it we ask for in a relationship with our suppliers when it comes to Open Source? 

1) That suppliers conform to ISO/IEC 5230:2020 (OpenChain). If a supplier conforms to this specification, we feel confident that they have a professional management program for Open Source.  

2) If in the process of developing a solution for Scania, a supplier makes modifications to Open Source components, we would like to see those modifications contributed to the Open Source project. 

3) Supply a Bill of materials in ISO/IEC DIS 5962 (SPDX) format, plus the source code where there’s an obligation to offer the source code directly, so we don’t need to ask for it.

Jonas Öberg, Open Source Officer – Scania (Volkswagen Group)

The SPDX format greatly facilitates the sharing of software component data across the supply chain. Wind River has provided a Software Bill of Materials (SBOM) to its customers using the SPDX format for the past eight years. Often customers will request SBOM data in a custom format. Standardizing on SPDX has enabled us to deliver a higher quality SBOM at a lower cost.

Mark Gisi, Wind River Open Source Program Office Director and OpenChain Specification Chair

The Black Duck team from Synopsys has been involved with SPDX since its inception, and I had the pleasure of coordinating the activities of the project’s leadership for more than a decade. In addition, representatives from scores of companies have contributed to the important work of developing a standard way of describing and communicating the content of a software package.

Phil Odence, General Manager, Black Duck Audits, Synopsys

With the rapidly increasing interest in the types of supply chain risk that a Software Bill of Materials helps address, SPDX is gaining broader attention and urgency. FossID (now part of Snyk) has been using SPDX from the start as part of both software component analysis and for open source license audits. Snyk is stepping up its involvement too, already contributing to efforts to expand the use cases for SPDX by building tools to test out the draft work on vulnerability profiles in SPDX v3.0.

Gareth Rushgrove, Vice President of Products, Snyk

For more information on OpenChain:

For more information on SPDX:


Author: Kate Stewart, VP of Dependable Systems, The Linux Foundation

In a previous Linux Foundation blog, David A. Wheeler, director of LF Supply Chain Security, discussed how capabilities built by Linux Foundation communities can be used to address the software supply chain security requirements set by the US Executive Order on Cybersecurity. 

One of those capabilities, SPDX, fully addresses the Executive Order’s 4(e), 4(f), and 10(j) requirements for a Software Bill of Materials (SBOM). The SPDX specification is implemented as a file format that identifies the software components within a larger piece of computer software, along with metadata such as the licenses of those components.

SPDX is an open standard for communicating software bill of material (SBOM) information, including components, licenses, copyrights, and security references. It has a rich ecosystem of existing tools that provides a common format for companies and communities to share important data to streamline and improve the identification and monitoring of software.
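To give a feel for the format, here is a minimal sketch of reading SPDX tag-value data. The field values below are hypothetical, and real tooling (e.g., the SPDX-tools project) implements the full specification:

```python
# An illustrative SPDX tag-value fragment (hypothetical package data).
SPDX_FRAGMENT = """\
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
PackageName: example-lib
PackageVersion: 1.4.2
PackageLicenseConcluded: Apache-2.0
"""

def parse_tag_value(text: str) -> dict:
    """Split 'Tag: value' lines into a dictionary of fields."""
    fields = {}
    for line in text.splitlines():
        tag, sep, value = line.partition(":")
        if sep:  # keep only lines that actually contain a tag/value pair
            fields[tag.strip()] = value.strip()
    return fields

sbom = parse_tag_value(SPDX_FRAGMENT)
print(sbom["PackageName"], sbom["PackageVersion"], sbom["PackageLicenseConcluded"])
```

Because the format is this regular, tools written in different languages can exchange the same SBOM data without loss.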

SBOMs have numerous use cases. They have frequently been used in areas such as license compliance but are equally useful in security, export control, and broader processes such as mergers and acquisitions (M&A) or venture capital investments. SPDX maintains an active community to support various uses, modeling its governance and activity on the same approach that has successfully supported open source software projects over the past three decades.

The LF has been developing and refining SPDX for over ten years and has seen extensive uptake by companies and projects in the software industry. Notable recent examples are the contributions by companies such as Hitachi, Fujitsu, and Toshiba in furthering the standard via optional profiles like “SPDX Lite” in the SPDX 2.2 specification release, and in supporting SPDX SBOMs in proprietary and open source automation solutions.

This de facto standard has been submitted to ISO via the Joint Development Foundation using the PAS Transposition process of Joint Technical Committee 1 (JTC1). It is currently in the enquiry phase of the process and can be reviewed on the ISO website as ISO/IEC DIS 5962.

There is a wide range of open source tooling available today, as well as commercial options, with more emerging. Companies such as FossID and Synopsys have been working with the SPDX format for several years. Open source tools like FOSSology (source code analysis), OSS Review Toolkit (generation from CI and build infrastructure), Tern (container content analysis), Quartermaster (build extensions), and ScanCode (source code analysis), in addition to the SPDX-tools project, have standardized on SPDX for interchange and also participate in the Automated Compliance Tooling (ACT) project umbrella. ACT has been discussed as a home for community-driven solutions to software supply chain security remediation as part of our synopsis of the findings in the Vulnerabilities in the Core study, published by the Linux Foundation and Harvard University LISH in February 2020.

One thing is clear: A software bill of materials that can be shared without friction between different teams and companies will be a core part of software development and deployment in this coming decade. The sharing of software metadata will take different forms, including manual and automated reviews, but the core structures will remain the same. 

Standardization in this field, as in others, is the key to success. This domain has an advantage in that we benefit from an entire decade of prior work in SPDX. The process therefore becomes applying this standard to the various domains rather than creating, expanding, or refining new or budding approaches to the matter.

Start using the SPDX specification here: Development of the next revision is underway, so if there’s a use case you can’t represent with the current specification, open an issue; this is the right window for input.

To learn more about the many facets of the SPDX project see:

Our communities take security seriously and have been instrumental in creating the tools and standards that every organization needs to comply with the recent US Executive Order


The US White House recently released its Executive Order (EO) on Improving the Nation’s Cybersecurity (along with a press call) to counter “persistent and increasingly sophisticated malicious cyber campaigns that threaten the public sector, the private sector, and ultimately the American people’s security and privacy.”

In this post, we’ll show what the Linux Foundation’s communities have already built that support this EO and note some other ways to assist in the future. But first, let’s put things in context.

The Linux Foundation’s Open Source Security Initiatives In Context

We deeply care about security, including supply chain (SC) security. The Linux Foundation is home to some of the most important and widely-used OSS, including the Linux kernel and Kubernetes. The LF’s previous Core Infrastructure Initiative (CII) and its current Open Source Security Foundation (OpenSSF) have been working to secure OSS, both in general and in widely-used components. The OpenSSF, in particular, is a broad industry coalition “collaborating to secure the open source ecosystem.”

The Software Package Data Exchange (SPDX) project has been working for the last ten years to enable software transparency and the exchange of software bill of materials (SBOM) data necessary for security analysis. SPDX, recognized and implemented as ISO/IEC standard 5962:2021, is supported by global companies with massive supply chains, and has a large open and closed source tooling support ecosystem. SPDX already meets the requirements of the executive order for SBOMs.

Finally, several LF foundations have focused on the security of various verticals. For example, LF Public Health and LF Energy have worked on security in their respective sectors. Our cloud computing community, collaborating within the CNCF, has also produced a guide to supporting software supply chain best practices for cloud systems and applications.

Given that context, let’s look at some of the EO statements (in the order they are written) and how our communities have invested years in open collaboration to address these challenges.

Best Practices

EO sections 4(b) and 4(c) say that

The “Secretary of Commerce [acting through NIST] shall solicit input from the Federal Government, private sector, academia, and other appropriate actors to identify existing or develop new standards, tools, and best practices for complying with the standards, procedures, or criteria [including] criteria that can be used to evaluate software security, include criteria to evaluate the security practices of the developers and suppliers themselves, and identify innovative tools or methods to demonstrate conformance with secure practices [and guidelines] for enhancing software supply chain security.” Later in EO 4(e)(ix) it discusses “attesting to conformity with secure software development practices.”

The OpenSSF’s CII Best Practices badge project specifically identifies best practices for OSS, focusing on security and including criteria to evaluate the security practices of developers and suppliers (it has over 3,800 participating projects). The LF is also working with SLSA (currently in development) as potential additional guidance focused on further addressing supply chain issues.

Best practices are only useful if developers understand them, yet most software developers have never received education or training in developing secure software. The LF has developed and released its Secure Software Development Fundamentals set of courses available on edX to anyone at no cost. The OpenSSF Best Practices Working Group (WG) actively works to identify and promulgate best practices. We also provide a number of specific standards, tools, and best practices, as discussed below.

Encryption and Data Confidentiality

EO section 3(d) requires agencies to adopt “encryption for data at rest and in transit.” Encryption in transit is implemented on the web using the TLS (“https://”) protocol, and Let’s Encrypt is the world’s largest certificate authority for TLS certificates.

In addition, the LF Confidential Computing Consortium is dedicated to defining and accelerating the adoption of confidential computing. Confidential computing protects data in use (not just at rest and in transit) by performing computation in a hardware-based Trusted Execution Environment. These secure and isolated environments prevent unauthorized access or modification of applications and data while in use.

Supply Chain Integrity

EO section 4(e)(iii) states a requirement for

 “employing automated tools, or comparable processes, to maintain trusted source code supply chains, thereby ensuring the integrity of the code.” 

The LF has many projects that support SC integrity, in particular:

  • in-toto is a framework specifically designed to secure the integrity of software supply chains.
  • The Update Framework (TUF) helps developers maintain the security of software update systems, and is used in production by various tech companies and open source organizations.  
  • Uptane is a variant of TUF; it’s an open and secure software update system design which protects software delivered over-the-air to the computerized units of automobiles.
  • sigstore is a project to provide a public good / non-profit service to improve the open source software supply chain by easing the adoption of cryptographic software signing (of artifacts such as release files and container images) backed by transparency log technologies (which provide a tamper-resistant public log). 
  • OpenChain (ISO 5230) is the International Standard for open source license compliance. Application of OpenChain requires identification of OSS components. While OpenChain by itself focuses more on licenses, that identification is easily reused to analyze other aspects of those components once they’re identified (for example, to look for known vulnerabilities).
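
The core idea behind signing artifacts, as sigstore eases, can be sketched at a high level: what gets signed (and what a transparency log entry refers to) is a cryptographic digest of the release artifact, so any later tampering is detectable. The sketch below shows only the digest step in Python; the artifact bytes are hypothetical, and real sigstore tooling additionally handles keys, certificates, and transparency log submission:

```python
import hashlib

def artifact_digest(data: bytes) -> str:
    """Return the SHA-256 digest identifying a release artifact.

    In a signing flow, this digest is the value that gets signed and
    recorded, so any change to the artifact changes the digest and
    invalidates the signature.
    """
    return hashlib.sha256(data).hexdigest()

release = b"example release tarball contents"  # hypothetical artifact
digest = artifact_digest(release)
print(digest)

# Verification recomputes the digest and compares it to the signed value.
assert artifact_digest(release) == digest
```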

Software Bill of Materials (SBOMs) support supply chain integrity; our SBOM work is so extensive that we’ll discuss that separately.

Software Bill of Materials (SBOMs)

Many cyber risks come from using components with known vulnerabilities. Known vulnerabilities are especially concerning in key infrastructure industries, such as national fuel pipelines, telecommunications networks, utilities, and energy grids. The exploitation of those vulnerabilities could lead to interruption of supply lines and services, and in some cases, loss of life due to a cyberattack.

One-time reviews don’t help since these vulnerabilities are typically found after the component has been developed and incorporated. Instead, what is needed is visibility into the components of the software environments that run these key infrastructure systems, similar to how food ingredients are made visible.

A Software Bill of Materials (SBOM) is a nested inventory or a list of ingredients that make up the software components used in creating a device or system. This is especially critical as it relates to a national digital infrastructure used within government agencies and in key industries that present national security risks if penetrated. The use of SBOMs would improve understanding of the operational and cyber risks of those software components from their originating supply chain.

The EO has extensive text about requiring a software bill of materials (SBOM) and tasks that depend on SBOMs:

  • EO 4(e) requires providing a purchaser an SBOM “for each product directly or by publishing it on a public website” and “ensuring and attesting… the integrity and provenance of open source software used within any portion of a product.” 
  • It also requires tasks that typically require SBOMs, e.g., “employing automated tools, or comparable processes, that check for known and potential vulnerabilities and remediate them, which shall operate regularly….” and “maintaining accurate and up-to-date data, provenance (i.e., origin) of software code or components, and controls on internal and third-party software components, tools, and services present in software development processes, and performing audits and enforcement of these controls on a recurring basis.” 
  • EO 4(f) requires publishing “minimum elements for an SBOM,” and EO 10(j) formally defines an SBOM as a “formal record containing the details and supply chain relationships of various components used in building software…  The SBOM enumerates [assembled] components in a product… analogous to a list of ingredients on food packaging.”
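
As an illustration of the kind of automated known-vulnerability checking the EO describes, a query to a public vulnerability database can be expressed as a simple JSON payload. The example below targets the OSV.dev query API (one such public service, used here for illustration; it is not named in the EO), and the package name is hypothetical:

```python
import json

def osv_query(ecosystem: str, name: str, version: str) -> dict:
    """Build a query payload for the OSV.dev vulnerability database.

    POSTing this JSON to https://api.osv.dev/v1/query returns any
    known vulnerabilities affecting the given package version --
    exactly the check an automated tool would run against each
    component listed in an SBOM.
    """
    return {
        "package": {"ecosystem": ecosystem, "name": name},
        "version": version,
    }

payload = osv_query("PyPI", "example-package", "1.0.0")  # hypothetical package
print(json.dumps(payload, indent=2))
```

Running such a query for every component in an SBOM, on a schedule, is one concrete way to satisfy the "operate regularly" requirement.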

The LF has been developing and refining SPDX for over ten years; SPDX is used worldwide and is approved as ISO/IEC International Standard 5962:2021. SPDX is a file format that identifies the software components within a larger piece of computer software, along with metadata such as the licenses of those components. SPDX 2.2 already supports the current guidance from the National Telecommunications and Information Administration (NTIA) for minimum SBOM elements. Some ecosystems have ecosystem-specific conventions for SBOM information, but SPDX can provide information across arbitrary ecosystems.
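
For concreteness, a minimal SPDX 2.2 tag-value document describing a single package might look like the following sketch (all names, versions, and URLs are illustrative, not taken from a real project):

```text
SPDXVersion: SPDX-2.2
DataLicense: CC0-1.0
SPDXID: SPDXRef-DOCUMENT
DocumentName: example-sbom
DocumentNamespace: https://example.com/spdxdocs/example-sbom-1.0
Creator: Tool: example-sbom-generator
Created: 2022-05-12T00:00:00Z

PackageName: example-package
SPDXID: SPDXRef-Package-example
PackageVersion: 1.0.0
PackageDownloadLocation: https://example.com/example-package-1.0.0.tar.gz
FilesAnalyzed: false
PackageLicenseConcluded: MIT
PackageLicenseDeclared: MIT
PackageCopyrightText: NOASSERTION
```

Each package entry names a component, its version, where it came from, and its license, covering the supplier, component, version, and unique-identifier fields in the NTIA minimum elements.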

SPDX is real and in use today, with increased adoption expected in the future. For example:

  • An NTIA “plugfest” demonstrated ten different producers generating SPDX. SPDX supports acquiring data from different sources (e.g., source code analysis, executables from producers, and analysis from third parties). 
  • A corpus of some LF projects with SPDX source SBOMs is available. 
  • Various LF projects are working to generate binary SBOMs as part of their builds, including Yocto and Zephyr.
  • To assist with further SPDX adoption, the LF is paying to write SPDX plugins for major package managers.

Vulnerability Disclosure

No matter what, some vulnerabilities will be found later and need to be fixed. EO 4(e)(viii) requires “participating in a vulnerability disclosure program that includes a reporting and disclosure process.” That way, vulnerabilities that are found can be reported to the organizations that can fix them. 

The CII Best Practices badge passing criteria require that OSS projects specifically identify how to report vulnerabilities to them. More broadly, the OpenSSF Vulnerability Disclosures Working Group is working to help “mature and advocate well-managed vulnerability reporting and communication” for OSS. Most widely-used Linux distributions have a robust security response team, but the Alpine Linux distribution (widely used in container-based systems) did not. The Linux Foundation and Google funded various improvements to Alpine Linux, including a security response team.

We hope that the US will update its Vulnerabilities Equities Process (VEP) to work more cooperatively with commercial organizations, including OSS projects, to share more vulnerability information. Every vulnerability that the US fails to disclose is a vulnerability that can be found and exploited by attackers. We would welcome such discussions.

Critical Software

It’s especially important to focus on critical software — but what is critical software? EO 4(g) requires the executive branch to define “critical software,” and 4(h) requires the executive branch to “identify and make available to agencies a list of categories of software and software products… meeting the definition of critical software.”

The Linux Foundation and the Laboratory for Innovation Science at Harvard (LISH) developed the report “Vulnerabilities in the Core: A Preliminary Report and Census II of Open Source Software,” which analyzed the use of OSS to help identify critical software. The LF and LISH are in the process of updating that report. The CII identified many important projects and assisted them, including OpenSSL (after Heartbleed), OpenSSH, GnuPG, Frama-C, and the OWASP Zed Attack Proxy (ZAP). The OpenSSF Securing Critical Projects Working Group has been working to better identify critical OSS projects and to focus resources on critical OSS projects that need help. There is already a first-cut list of such projects, along with efforts to fund such aid.

Internet of Things (IoT)

Unfortunately, internet-of-things (IoT) devices often have notoriously bad security. It’s often been said that “the S in IoT stands for security.” 

EO 4(s) initiates a pilot program to “educate the public on the security capabilities of Internet-of-Things (IoT) devices and software development practices [based on existing consumer product labeling programs], and shall consider ways to incentivize manufacturers and developers to participate in these programs.” EO 4(t) states that such “IoT cybersecurity criteria” shall “reflect increasingly comprehensive levels of testing and assessment.”

The Linux Foundation develops and is home to many of the key components of IoT systems. These include:

  • The Linux kernel, used by many IoT devices. 
  • The Yocto Project, which creates custom Linux-based systems for IoT and embedded systems. Yocto supports fully reproducible builds. 
  • EdgeX Foundry, which is a flexible OSS framework that facilitates interoperability between devices and applications at the IoT edge, and has been downloaded millions of times. 
  • The Zephyr project, which provides a real-time operating system (RTOS) used by many for resource-constrained IoT devices and is able to generate SBOMs automatically during builds. Zephyr is one of the few open source projects that is a CVE Numbering Authority.
  • The seL4 microkernel, which is the most assured operating system kernel in the world; it’s notable for its comprehensive formal verification.

Security Labeling

EO 4(u) focuses on identifying:

“secure software development practices or criteria for a consumer software labeling program [that reflects] a baseline level of secure practices, and if practicable, shall reflect increasingly comprehensive levels of testing and assessment that a product may have undergone [and] identify, modify, or develop a recommended label or, if practicable, a tiered software security rating system.”

The OpenSSF’s CII Best Practices badge project (noted earlier) specifically identifies best practices for OSS development, and is already tiered (passing, silver, and gold). Over 3,800 projects currently participate.

A number of other LF projects also relate to measuring security and/or broader quality.

The Linux Foundation (LF) has long been working to help improve the security of open source software (OSS), which powers systems worldwide. We couldn’t do this without the many contributions of time, money, and other resources from numerous companies and individuals; we gratefully thank them all.  We are always delighted to work with anyone to improve the development and deployment of open source software, which is important to us all.

David A. Wheeler, Director of Open Source Supply Chain Security at the Linux Foundation