Project Lifecycle Transitions based on OpenSSF Scorecard

Update (November 7, 2024): The Task Force met on August 9, 2024 and concluded the following:

  • Given the availability of both the OpenSSF Scorecard and CLOMonitor, which between them cover all the project health criteria we could think of and more, there is no need to codify a separate set of badges for LFDT projects.
  • Various criteria (see below) will be categorized, and either fulfilment or a score threshold will be recommended for each as a health indicator.
  • Project maintainers will be required to report these criteria (and wherever applicable, associated scores) in their annual reports. The TAC will evaluate the health of the projects and, at its discretion, determine whether or not a project ought to be moved from one stage to another (either a promotion or a demotion). A graph of the project's lifecycle annotated with criteria will be provided strictly as a guideline for the TAC.
  • Recording:


The candidate list of badges proposed in Candidate List of Badges and Associated Project Lifecycle Stages is now somewhat obsolete, given the specific health indicators that can be tracked quantitatively using the OpenSSF Scorecard, as shown by David Enyeart in the May 16, 2024 meeting (also see the Meeting Minutes).


The need for extra badges, along with the additional maintenance overhead they would impose on maintainers as well as the TOC, is not apparent now. However, clear-cut criteria for lifecycle transitions (promotion or regression) of projects are still necessary.


Based on the new project lifecycle diagram created by the TOC in December 2023, here is an initial set of criteria for discussion. (A sketch of how the corresponding Scorecard scores could be checked automatically is given after the criteria list below.)


Transition Criteria Mapping to OpenSSF Scorecard Criteria

  • Legal
    • License (10 is pass, anything below is fail)
  • Diversity
    • Consider the "Contributors" metric (need to do research on what's an appropriate threshold, and also if we can tweak the parameters)
    • Also the diversity of maintainers (try a GitHub action to parse the MAINTAINERS.md file; as a fallback, the TOC will manually inspect; see the sketch after this list)
  • Release
    • Packaging (OpenSSF seems to give a 10 easily here, based on a single publish action, so perhaps we should consider this a soft criterion and mandate the highest score)
    • Also the timeliness of releases using major and minor version numbers (use a script/action that checks the time since the last release against a threshold; a sketch of such a check follows this list). (Consider adding such scripts to the TOC repo so any member can run them when required for reviews and evaluations.)
  • Testing and CI/CD
    • CI Tests (think about this one; if 10 is easily attainable, maybe mandate that; perhaps also require a code coverage action and pick a threshold to exceed, as in the coverage sketch after this list)
  • Security
    • Dangerous Workflow (require a 10)
    • Branch-Protection (check with Ry if the 2-org merge approval policy is set by default for all repos; require Tier 4, or 9/10 points)
    • Dependency-Update-Tool (require a 10, which can be obtained just by configuring a bot like Dependabot)
    • Fuzzing (check if it's easy to integrate tools like OSS-Fuzz, and if so, mandate a 10)
    • Pinned-Dependencies (investigate if this is a 0, but give the maintainers the benefit of the doubt if they can provide good reasons for that score)
    • SAST (check if it's easy to integrate tools like CodeQL, and if so, mandate a 10)
    • Security-Policy (at least a 9; also ensure that the project's SECURITY.md file builds on the default template in the HL TOC governing documents)
    • Signed-Releases (don't mandate anything until the Security Artifacts Task Force comes to a conclusion)
    • Token-Permissions (mandate a 10, ensuring least privilege use of tokens)
    • Vulnerabilities (investigate if this is less than 10, but give the maintainers the benefit of the doubt if they can provide a good reason for the score, such as a ready fix being unavailable for a critical feature)
  • Structure
    • Code Review (require a 10, as human maintainer review of every PR is a basic requirement)
  • Maintenance Activity (not in our original list of badges)
    • Maintenance (require a certain amount of activity to graduate, but don't require a high number for mature projects to retain graduated status; a dormancy check can be done by the TOC but will be subjective; treat this as a soft criterion without mandating a hard threshold)
  • Production
    • ?? (should we have an ADOPTERS.md file? Interesting to look at, but we shouldn't mandate this as a criterion for graduation)
  • Documentation
    • (subjective criterion, test for presence and publication; make a list of things that docs should cover, like setup/installation instructions and a basic tutorial)
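
To make the score thresholds above actionable during reviews, here is a minimal sketch (in Python) of pulling a repository's published Scorecard results and comparing them against the proposed minimums. It assumes the public Scorecard API at api.securityscorecards.dev and the standard Scorecard check names; the threshold values in the script are illustrative placeholders taken from the list above, not TOC-agreed numbers.

```python
"""Sketch: fetch a repo's OpenSSF Scorecard results and flag checks that fall
below the thresholds proposed above. Endpoint and check names should be
verified against the current Scorecard documentation."""
import json
import sys
import urllib.request

# Illustrative minimum scores per check, taken from the criteria list above.
THRESHOLDS = {
    "License": 10,
    "Dangerous-Workflow": 10,
    "Token-Permissions": 10,
    "Branch-Protection": 9,
    "Dependency-Update-Tool": 10,
    "Code-Review": 10,
    "Security-Policy": 9,
}


def fetch_scorecard(repo: str) -> dict:
    """Fetch Scorecard results for a GitHub repo given as "org/name"."""
    url = f"https://api.securityscorecards.dev/projects/github.com/{repo}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def evaluate(repo: str) -> None:
    """Print PASS/REVIEW for each check that has a proposed threshold."""
    data = fetch_scorecard(repo)
    scores = {check["name"]: check.get("score") for check in data.get("checks", [])}
    for name, minimum in THRESHOLDS.items():
        score = scores.get(name)
        status = "PASS" if score is not None and score >= minimum else "REVIEW"
        print(f"{name:25s} score={score} required>={minimum} -> {status}")


if __name__ == "__main__":
    evaluate(sys.argv[1] if len(sys.argv) > 1 else "hyperledger/fabric")
```

A check that comes back below its threshold would not automatically fail a project; per the notes above, several of these are soft criteria where maintainers can give the TOC a justification.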

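For the maintainer-diversity check, a parsing helper could look like the sketch below. MAINTAINERS.md has no standard layout, so the assumption here is a Markdown table whose third column holds the maintainer's organization; a real GitHub action would need per-project tuning, with manual TOC inspection as the fallback noted above.

```python
"""Sketch: count maintainers per organization from MAINTAINERS.md, assuming a
Markdown table layout (an assumption; real files vary widely)."""
from collections import Counter
from pathlib import Path


def maintainer_orgs(path: str = "MAINTAINERS.md", org_column: int = 2) -> Counter:
    """Count maintainers per organization; org_column is the assumed
    zero-based index of the affiliation column in the table."""
    counts: Counter = Counter()
    header_skipped = False
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        stripped = line.strip()
        # Skip non-table lines and the |---|---| header separator row.
        if not stripped.startswith("|") or set(stripped) <= set("|-: "):
            continue
        if not header_skipped:
            header_skipped = True  # first table row is the column header
            continue
        cells = [c.strip() for c in stripped.strip("|").split("|")]
        if len(cells) > org_column and cells[org_column]:
            counts[cells[org_column].lower()] += 1
    return counts


if __name__ == "__main__":
    orgs = maintainer_orgs()
    print(f"{len(orgs)} distinct organizations: {dict(orgs)}")
```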

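For release timeliness, a small check against the GitHub Releases API could be a starting point. The 365-day figure below is a placeholder (the actual threshold still has to be chosen), and projects that publish releases outside GitHub Releases would need a different data source.

```python
"""Sketch: report how long ago the latest GitHub release was published and
compare it to a placeholder staleness threshold."""
import json
import sys
import urllib.request
from datetime import datetime, timezone

MAX_DAYS_SINCE_RELEASE = 365  # placeholder, not a TOC-agreed value


def days_since_latest_release(repo: str) -> tuple:
    """Return (tag, days since publication) for the latest release of org/name."""
    url = f"https://api.github.com/repos/{repo}/releases/latest"
    req = urllib.request.Request(url, headers={"Accept": "application/vnd.github+json"})
    with urllib.request.urlopen(req) as resp:  # unauthenticated, so rate-limited
        release = json.load(resp)
    published = datetime.fromisoformat(release["published_at"].replace("Z", "+00:00"))
    age_days = (datetime.now(timezone.utc) - published).total_seconds() / 86400
    return release["tag_name"], age_days


if __name__ == "__main__":
    repo = sys.argv[1] if len(sys.argv) > 1 else "hyperledger/fabric"
    tag, days = days_since_latest_release(repo)
    verdict = "ok" if days <= MAX_DAYS_SINCE_RELEASE else "stale"
    print(f"{repo}: latest release {tag} is {days:.0f} days old -> {verdict}")
```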
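
For the code-coverage idea under Testing and CI/CD, the gate could be as simple as reading the coverage report the CI action already produces. This sketch assumes a Cobertura-style coverage.xml (what coverage.py and many other tools emit), and the 80% figure is a placeholder rather than a chosen threshold.

```python
"""Sketch: fail the build if overall line coverage in a Cobertura-format
coverage.xml drops below a placeholder threshold."""
import sys
import xml.etree.ElementTree as ET

COVERAGE_THRESHOLD = 0.80  # placeholder; the real threshold is still to be picked


def line_coverage(report_path: str = "coverage.xml") -> float:
    """Read the overall line-rate attribute from a Cobertura coverage report."""
    root = ET.parse(report_path).getroot()
    return float(root.attrib["line-rate"])


if __name__ == "__main__":
    rate = line_coverage(sys.argv[1] if len(sys.argv) > 1 else "coverage.xml")
    print(f"line coverage: {rate:.1%} (required: {COVERAGE_THRESHOLD:.0%})")
    sys.exit(0 if rate >= COVERAGE_THRESHOLD else 1)
```
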
How do we enforce a particular version of the scorecard?

  • Perhaps mention on the maintainers' Discord channel whenever an upgrade is due, and expect that the maintainers of each project will submit a PR to upgrade the Scorecard GitHub action (a sketch of a check for the pinned action version is given below).
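
As a lightweight aid for that, the sketch below scans a checked-out repository's workflow files and reports which ref of ossf/scorecard-action each one pins, so reviewers can see at a glance whether an upgrade PR is needed. It only greps the workflows; mapping a pinned commit SHA back to a release version would need an extra lookup against the action's repository.

```python
"""Sketch: report the ossf/scorecard-action ref pinned in each workflow file
of a checked-out repository."""
import re
from pathlib import Path

ACTION_REF = re.compile(r"uses:\s*ossf/scorecard-action@(\S+)")


def scorecard_action_refs(repo_root: str = ".") -> dict:
    """Map each workflow file name to the scorecard-action ref it pins, if any."""
    refs = {}
    workflows = Path(repo_root, ".github", "workflows")
    if not workflows.is_dir():
        return refs
    for wf in sorted(workflows.glob("*.y*ml")):
        match = ACTION_REF.search(wf.read_text(encoding="utf-8"))
        if match:
            refs[wf.name] = match.group(1)
    return refs


if __name__ == "__main__":
    for name, ref in scorecard_action_refs().items():
        print(f"{name}: ossf/scorecard-action pinned at {ref}")
```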


Meeting Recordings:

The lifecycle diagram above was discussed in the Task Force meetings from which the above criteria mapping was drawn. Links to the meeting recordings are given below.

June 7, 2024

July 19, 2024