Ask HN: Need help drafting a profession's policy for AI governance

8 points · by killjoywashere · over 1 year ago
I've been asked to contribute to a fairly significant policy document for my profession, and I am seeking your help, particularly from the cryptography, ML, and legal folks, in drafting this particular concept around the chain of custody for AI inferences:

==========
Inferences made by an AI marketed for use in decision-making (e.g. decision support) should be cryptographically signed using a certificate on the vendor's machine; the vendor's certificates should be managed in a Public Key Infrastructure program, so the inferences are immutable and their provenance is traceable, and those signed inferences should be retained as part of the record.

Additionally, any verification or validation procedure performed by a person should result in the machine's signing certificate being countersigned by the person performing the verification or validation, such that this procedure is also captured in the signed inference.
==========

Is that a sensible way to ensure inferences are admissible as evidence? Does it cover causal interventions? What am I missing? Critique *most* welcome.
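
A minimal Python sketch of the sign-then-countersign flow described above, using the cryptography package. It only illustrates the idea: bare Ed25519 keys stand in for the X.509 certificates a real PKI would issue, and every field name (model, reviewer, timestamps) is hypothetical.

    # Sketch only: bare Ed25519 keys stand in for PKI-issued certificates, and a
    # real system would use a standard signature container (e.g. CMS or JWS).
    import json
    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    vendor_key = Ed25519PrivateKey.generate()    # key behind the vendor's certificate
    reviewer_key = Ed25519PrivateKey.generate()  # key behind the reviewer's certificate

    def canonical(record: dict) -> bytes:
        # Deterministic serialization so a signature can be re-verified later.
        return json.dumps(record, sort_keys=True, separators=(",", ":")).encode()

    # 1. The vendor's machine signs the inference at the moment it is produced.
    inference = {
        "model": "acme-dx",                     # hypothetical product name
        "model_version": "2.3.1",
        "input_sha256": hashlib.sha256(b"case data").hexdigest(),
        "output": "decision-support result",
        "timestamp": "2023-09-29T14:05:00Z",
    }
    vendor_sig = vendor_key.sign(canonical(inference))

    # 2. The person who verifies/validates the result countersigns the record
    #    together with the vendor's signature, so the review is captured too.
    countersigned = {
        "inference": inference,
        "vendor_signature": vendor_sig.hex(),
        "reviewer": "jdoe",
        "review_timestamp": "2023-09-29T15:12:00Z",
    }
    reviewer_sig = reviewer_key.sign(canonical(countersigned))

    # 3. Anyone holding the public keys (or certificates) can later check both
    #    signatures; verify() raises InvalidSignature if anything was altered.
    vendor_key.public_key().verify(vendor_sig, canonical(inference))
    reviewer_key.public_key().verify(reviewer_sig, canonical(countersigned))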

2 comments

bobbiechen · over 1 year ago
It's somewhat implied in your phrasing, but I think it is important to be explicit that you should also record the software/model version of the AI used (as part of the signature/metadata). Otherwise it's not very meaningful, since there's no way to know whether the inference was generated by a malicious program or not. There's a corresponding challenge for the vendor to do proper versioning of the system as they develop and improve on it.

Depending on your needs, it may be useful to know about all inferences which happened (even if they were not used). Some kind of append-only ledger which records metadata could be used.

Happy to chat more on the topic, as I am actively working on something similar - bobbie.chen@anjuna.io
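
A rough sketch of the append-only idea in Python, with purely illustrative field names: each entry commits to the hash of the previous one, so recorded inference metadata cannot be silently dropped or reordered.

    # Hash-chained log of inference metadata; a sketch, not a hardened ledger.
    import json
    import hashlib

    class InferenceLedger:
        def __init__(self):
            self.entries = []

        def append(self, metadata: dict) -> dict:
            prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
            body = {"prev_hash": prev, "metadata": metadata}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            entry = {**body, "entry_hash": digest}
            self.entries.append(entry)
            return entry

        def verify(self) -> bool:
            # Recompute the chain; tampering with any entry breaks a link.
            prev = "0" * 64
            for e in self.entries:
                body = {"prev_hash": e["prev_hash"], "metadata": e["metadata"]}
                digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
                if e["prev_hash"] != prev or e["entry_hash"] != digest:
                    return False
                prev = e["entry_hash"]
            return True

    ledger = InferenceLedger()
    ledger.append({"model_version": "2.3.1", "used_in_decision": False})
    ledger.append({"model_version": "2.3.1", "used_in_decision": True})
    assert ledger.verify()
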
wsh · over 1 year ago
On your specific text:

- What if the AI is supplied by a vendor but runs on a computer system controlled by the user or by a third party? This could happen if the user doesn't want, or isn't allowed, to disclose the inputs or their derivatives.

- Assuming there's a need to mandate cryptographic digital signatures at all, why require certificates and PKI? Wouldn't it suffice for the signer to announce a public key and, if necessary, its revocation?

- Cryptographic signatures are still overwhelmingly the exception, not the rule, in legal evidence. Courts routinely admit ordinary paper and electronic business records, authenticated, when necessary, by their creators or custodians. (See, for example, Rules 901 and 902 in the Federal Rules of Evidence.) Digital signatures might not make this easier; consider the potential for conflicting expert testimony about signing and key management schemes and their weaknesses.

More generally:

As a professional engineer who uses my own and others' software, I don't think an AI model is fundamentally different from a spreadsheet, a card deck with a FORTRAN program, or a table or formula in a printed handbook. If I'm relying on something for my work, it's my professional responsibility to assess its validity, suitability for purpose, and limitations; to know how to use it properly; and to interpret and evaluate its output.

The standard of care with which I do those things, the nature and extent of any documentation I might produce, and the arrangements for the retention, protection, and future authentication of those materials in case of a dispute will vary with the circumstances, including the potential for harm to the client or to the public and my own organization's appetite for risk.

Perhaps your context is different, but I hesitate to endorse a highly prescriptive approach. Engineering regulators use very broad language; for example, Florida's rule says only, "The engineer shall be responsible for the results generated by any computer software and hardware that he or she uses in providing engineering services" [1], and Professional Engineers Ontario has guidelines [2] but not specific standards.

[1] Florida Administrative Code, Rule 61G15-30.008

[2] "Professional Engineers Using Software-Based Engineering Tools," April 2011, https://www.peo.on.ca/sites/default/files/2019-07/Professional%20Engineers%20Using%20Software-Based%20Engineering%20Tools.pdf
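
A small sketch of the lighter-weight option raised in the second bullet above (no certificates or PKI, just a published public key plus an optional revocation notice), assuming Python and the cryptography package; the revocation list here is purely illustrative.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    # Key fingerprints the vendor has publicly announced as revoked (illustrative).
    REVOKED_KEYS: set[str] = set()

    def verify_inference(pub_key_bytes: bytes, signature: bytes, record: bytes) -> bool:
        if pub_key_bytes.hex() in REVOKED_KEYS:
            return False  # vendor has withdrawn this key
        try:
            Ed25519PublicKey.from_public_bytes(pub_key_bytes).verify(signature, record)
            return True
        except InvalidSignature:
            return False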