In case people are wondering, these guidelines can end up functioning as strict requirements in real life. For example, a partner or customer company might require you to follow them, even if they are only labeled “best practices”, as part of a contract with your company. Or your company’s insurer might require it. Or your auditing firm might refuse to sign off unless such guidelines are implemented.
My point is that even something advertised as “best practices” can become more like a “forced requirement”, in many different and surprising ways.

In the case of AI, you can see how a president’s executive order (https://www.whitehouse.gov/briefing-room/presidential-actions/2023/10/30/executive-order-on-the-safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence/), which is what pushed NIST to create this draft, can ultimately force private companies to adhere to a president’s whim. That is especially dangerous when you consider that these guidelines cover misinformation, disinformation, and “hateful” content: it starts to look like restricting speech at one remove, behind the flimsy claim that the guidelines are merely voluntary, which raises First Amendment concerns.