
TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

How crawlers impact the operations of the Wikimedia projects

7 points by panic, about 1 month ago

1 comment

xacky, about 1 month ago
Using AI with wiki content will eventually be seen as being as bad as vandalizing wikis. Wikimedia Commons has over 100 million files, and most of them will never be seen by humans; instead they will just be scraped by AI bots, which drains donor funds, wastes bandwidth, and strains CPU and storage resources. If I ran Wikimedia, I would enforce a strict human-generated-content and human-access-only policy to protect the project.
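For context on what a "human access only" policy might look like in practice, here is a minimal sketch of user-agent-based crawler filtering. The token list is illustrative only (these are published crawler User-Agent tokens, but not an exhaustive or authoritative registry), and the function names are hypothetical, not anything Wikimedia actually runs.

```python
# Illustrative sketch: flag requests whose User-Agent matches a known
# AI-crawler token. Token list is an assumption for the example, not a
# complete registry of AI bots.
AI_CRAWLER_TOKENS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header contains a listed crawler token."""
    ua = (user_agent or "").lower()
    return any(token.lower() in ua for token in AI_CRAWLER_TOKENS)

def handle_request(user_agent: str) -> int:
    """Return an HTTP status code: 403 for flagged crawlers, 200 otherwise."""
    return 403 if is_ai_crawler(user_agent) else 200
```

In practice, User-Agent strings are trivially spoofed, so real deployments layer this with rate limiting and behavioral detection; this only shows the first, cheapest filter.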