科技回声

How crawlers impact the operations of the Wikimedia projects

7 points | by panic | about 1 month ago

1 comment

xacky | about 1 month ago
Using AI with wiki content will eventually be seen as being as bad as vandalizing wikis. Wikimedia Commons has over 100 million files, and most of them will never be seen by humans; they will just be scraped by AI bots, which drains donor funds, wastes bandwidth, and puts stress on CPU and storage resources. If I ran Wikimedia, I would enforce a strict human-generated-content and human-access-only policy to protect the project.
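A "human access only" policy like the one xacky proposes is usually approximated in practice by filtering requests on their User-Agent header. The sketch below is hypothetical (not how Wikimedia actually does it); the crawler tokens listed are real, publicly documented ones (OpenAI's GPTBot, Common Crawl's CCBot, Anthropic's ClaudeBot, Google-Extended, ByteDance's Bytespider), but note that a User-Agent string is self-reported and trivially spoofed, so this is best-effort only:

```python
# Hypothetical sketch of a User-Agent blocklist for known AI-training
# crawlers. The tokens are real published crawler names; the function
# name and structure are illustrative, not any project's actual code.

AI_CRAWLER_TOKENS = (
    "GPTBot",           # OpenAI
    "CCBot",            # Common Crawl
    "ClaudeBot",        # Anthropic
    "Google-Extended",  # Google AI training opt-out token
    "Bytespider",       # ByteDance
)

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known AI-training crawler."""
    ua = (user_agent or "").lower()
    return any(token.lower() in ua for token in AI_CRAWLER_TOKENS)

if __name__ == "__main__":
    print(is_ai_crawler("Mozilla/5.0 AppleWebKit/537.36; GPTBot/1.1"))   # True
    print(is_ai_crawler("Mozilla/5.0 (Windows NT 10.0) Firefox/125.0"))  # False
```

Well-behaved crawlers can also be told to stay away declaratively via `robots.txt` (RFC 9309), but compliance with that file is voluntary, which is exactly why the resource drain the comment describes is hard to stop.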