科技回声

A tech news platform built with Next.js, providing global technology news and discussion.


Show HN: Remove personal identifiable data from a ChatGPT prompt

2 points · by l1am0 · 9 months ago
Most companies we are talking to (in Germany) have the problem that they don't want their employees to enter sensitive data into ChatGPT (or other LLMs).

Mostly this is "ensured" through workshops and guidelines. But removing PII from a prompt is super annoying, especially when you copy-paste a lot of data back and forth.

For this use case we built PromptSecure: a web app that automatically removes all PII from a prompt, fully in the browser. NO DATA IS EVER SENT TO US!

We hope that by making the barrier to entry super low, employees will actually use it.

Hope you find this useful and discover a few bugs in it ;)

Monetization: this widget/in-browser tool will stay free forever! If you want to embed the technology into your company-internal chat system, talk to us :D
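The post doesn't say how PromptSecure detects PII, so as a rough illustration only, here is a minimal regex-based redaction sketch. The pattern set, labels, and `redact` helper are all hypothetical and far simpler than what a production tool would need (no names, addresses, IDs, etc.):

```python
import re

# Hypothetical, illustrative patterns -- NOT PromptSecure's actual logic.
# Real PII detection needs far more than two regexes.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]*\w"),
    "PHONE": re.compile(r"\+?\d[\d\s/-]{7,}\d"),
}

def redact(prompt: str) -> str:
    """Replace each detected PII match with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

print(redact("Contact max.mustermann@example.de or +49 30 1234567."))
# prints: Contact [EMAIL] or [PHONE].
```

Running entirely client-side (as the post promises) would mean the same idea in browser JavaScript, so the prompt text never leaves the user's machine.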

No comments yet.