I am the technical lead of the CII best practices badge, so ask me anything!<p>I'm not sure what the point of the poster was. You can see the projects with "passing" or better here, and there are over 200:
<a href="https://bestpractices.coreinfrastructure.org/en/projects?gteq=100" rel="nofollow">https://bestpractices.coreinfrastructure.org/en/projects?gte...</a><p>There are very few projects with gold, sure, but gold is a more recent badge level and it's pretty hard to get. In addition, we've been emphasizing helping projects get the "passing" badge, and we have so far spent less time encouraging projects to work on the higher-level badges (silver or gold). In any case, most projects pursuing a higher-level badge tend to work toward it gradually rather than trying to get everything at once.<p>I think a more telling chart is the steady increase in the number of participating projects:
<a href="https://bestpractices.coreinfrastructure.org/en/project_stats" rel="nofollow">https://bestpractices.coreinfrastructure.org/en/project_stat...</a><p>If you are involved in an open source project, I would encourage you to get a badge. Go here for more:
<a href="https://bestpractices.coreinfrastructure.org/" rel="nofollow">https://bestpractices.coreinfrastructure.org/</a>
Looks like the Linux kernel would have a gold rating if kernel.org set X-Content-Type-Options to "nosniff".<p><a href="https://bestpractices.coreinfrastructure.org/en/projects/34?criteria_level=2#security" rel="nofollow">https://bestpractices.coreinfrastructure.org/en/projects/34?...</a>
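For anyone curious what that header looks like in practice, here's a minimal sketch of serving "nosniff" from a Python WSGI app. This is purely illustrative; kernel.org's actual server configuration is different, and in production you'd normally set this header at the web server or CDN layer:

```python
# Hypothetical WSGI app that sends the X-Content-Type-Options header.
def app(environ, start_response):
    headers = [
        ("Content-Type", "text/plain"),
        # "nosniff" tells browsers not to MIME-sniff the response body
        # and to trust the declared Content-Type instead -- the HTTP
        # hardening header the badge criterion checks for.
        ("X-Content-Type-Options", "nosniff"),
    ]
    start_response("200 OK", headers)
    return [b"ok"]
```

You could run this locally with `wsgiref.simple_server.make_server("", 8000, app)` to inspect the header with a browser or curl.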
What's the point being made?<p>This is more useful: <a href="https://github.com/coreinfrastructure/best-practices-badge" rel="nofollow">https://github.com/coreinfrastructure/best-practices-badge</a>
You can read an article I wrote about the badging project on LWN.net: "Core Infrastructure Initiative best-practices badge"
<a href="https://lwn.net/Articles/690169/" rel="nofollow">https://lwn.net/Articles/690169/</a><p>Things have happened since then, but it's still a good overview.
I just found and read through the criteria list. It's mind-bogglingly exhaustive, but in a very good way, and an excellent catalyst for maintainable, secure software.<p>I'd regard it as universally applicable to any and all code.<p>This list gets you in the door: <a href="https://github.com/coreinfrastructure/best-practices-badge/blob/master/doc/criteria.md" rel="nofollow">https://github.com/coreinfrastructure/best-practices-badge/b...</a><p>And this is the list that gets you a gold rating: <a href="https://github.com/coreinfrastructure/best-practices-badge/blob/master/doc/other.md" rel="nofollow">https://github.com/coreinfrastructure/best-practices-badge/b...</a><p>Prepare to have your eyes glaze over.<p>---<p>I had a go at summarizing the criteria. This is not an exact reduction of the two lists, but it comes close (I have no reason to misrepresent the criteria); I've paraphrased here and there where I thought it was editorially safe to do so (including slightly rearranging/renaming the major headings and swapping the ordering of some points).<p>The following is NOT a substitute for reading the criteria. 
I wrote this to give a good idea of why you'd want to take the time to click the above links.<p>>> <i>The following are the criteria to get a _minimally_ passing score.</i> <<<p>Website(s)<p>- <Must> succinctly describe the project with a minimum of domain-specific language or siloed knowledge<p>- <Must> explain how to obtain it, provide feedback, and contribute<p>- <Must> use HTTPS+TLS<p>Project<p>- <Must> use an OSS license (OSI, FSF, or liked by Debian or Fedora); <may> be dual-commercially-licensed<p>- <Must> provide basic documentation in 99% of circumstances<p>- <Must> use a URL-addressable, searchable discussion system for unilateral participation (IRC or mailing lists with URL-accessible logging are acceptable)<p>- <Should> use English<p>Source version control<p>- URL-addressable version control <must> be used<p>- System <must> track who changed what, and when<p>- <Must> list all interim releases, for accountability<p>- Unique version numbers <must> be used for all user-facing releases; commit IDs are listed as acceptable version "numbers", as is use of SemVer, and use of other version representations<p>- Release notes <must> accompany all releases<p>Bug reporting<p>- Project <must> provide a bug-reporting process; use of an issue tracker is recommended<p>- Project <must> at least respond to (if not fix) issues submitted within the last 2-12 months, and <should> acknowledge more than half of submitted enhancement requests in the same timeframe<p>Vulnerability reporting<p>- Project <must> provide a vulnerability reporting mechanism; HTTPS or email (possibly with PGP) are suggested<p>- All reports submitted within the last 6 months <must> be responded to within 14 days<p>Building<p>- IF the project must be built in order to function, an automated build mechanism <must> be provided<p>- Automated OSS test suite <must> be used<p>- Tests for new functionality <must> be added to the automated test suite<p>Code warnings<p>- Project <must> use compiler warnings, language strict/safe modes, or linting 
tools<p>Security<p>- Project <must> have at least one developer who knows how to design secure software (the 8 principles from Saltzer and Schroeder are noted)<p>- At least one developer <must> have domain knowledge of common vulnerability-introducing errors relevant to the software, and how to mitigate them<p>- Project <must> use established and reviewed security algorithms/protocols<p>- Project <must> use key lengths that meet minimum NIST requirements through 2030 (certain key lengths are noted)<p>- Project <must not> use known-broken crypto<p>- Security mechanisms <should> use perfect forward secrecy<p>- Stored passwords <must> be iterated/key-stretched hashes with a per-user salt<p>- Keys and nonces <must> come from a cryptographically secure RNG<p>- Project delivery mechanism <must> be MITM-resilient (HTTPS and SSH+SCP are listed as acceptable)<p>- Checksums <must not> be delivered over HTTP<p>- Medium- to high-severity vulnerabilities <must> be fixed if they have been public knowledge for more than 60 days<p>- Project <must not> leak security credentials (sample credentials are exempted)<p>Code analysis<p>- Project <must> use a static code analyzer, if one exists for the given project language(s) and is OSS; it is <suggested> that the tool be configured/used to look for vulnerabilities, if possible; it is also <suggested> that code analysis be rerun on every commit or at least daily<p>- Project <must> fix medium- or higher-severity exploitable issues found by static analysis in a timely manner<p>- The use of a dynamic code analyzer on major production releases is <suggested>, if an OSS tool exists for the project language(s)<p>- If the software is written in C/C++ or another memory-unsafe language, it is <suggested> that a dynamic analyzer (fuzzer, web application scanner) be used to look for things like buffer overflows/overwrites<p>- It is <suggested> that the project make heavy use of run-time assertions that are checked by dynamic 
analysis tools
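To make two of the security criteria above concrete (iterated/salted password hashing, and keys/nonces from a cryptographically secure RNG), here's a minimal Python sketch. The choice of PBKDF2-HMAC-SHA256, the salt size, and the iteration count are my own illustrative picks; the criteria only require key stretching with a per-user salt, not any specific algorithm or parameters:

```python
import hashlib
import secrets

ITERATIONS = 100_000  # illustrative; tune to your threat model and hardware

def hash_password(password):
    # Per-user salt from a CSPRNG (secrets), as the criteria require.
    salt = secrets.token_bytes(16)
    # Iterated (key-stretched) hash rather than a single fast hash pass.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison to avoid timing side channels.
    return secrets.compare_digest(candidate, digest)
```

Store the salt and digest together per user; the salt is not secret, it just ensures identical passwords produce different hashes.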
...showing just how useful "best practices" actually are.<p>I associate that phrase with a dogmatic, cargo-cult "don't think, just follow" mentality and the horrible results thereof. Lots of self-proclaimed "experts" love to say "do X and Y and Z and you will be successful because these are <i>best</i> practices", but it's all a bunch of snake oil.<p><a href="https://agilepainrelief.com/notesfromatooluser/2010/03/there-are-no-best-practices.html" rel="nofollow">https://agilepainrelief.com/notesfromatooluser/2010/03/there...</a><p><a href="http://www.satisfice.com/blog/archives/27" rel="nofollow">http://www.satisfice.com/blog/archives/27</a><p><a href="http://www.satisfice.com/presentations/nobest.pdf" rel="nofollow">http://www.satisfice.com/presentations/nobest.pdf</a><p>"Best practices are best not practiced."