
Removing your files from S3 can cost thousands of dollars

46 points by petewarden about 15 years ago

3 comments

cperciva about 15 years ago
This guy must have an incredibly large number of files if 2 XL instances running deletes for a month only cuts his S3 usage by 10%. I routinely see 700 DELETEs per second from a small EC2 instance; if performance scales linearly with CPU speed, he should be able to do 700 * 8 * 2 = 11200 DELETEs per second, or 29 billion DELETEs per month; if that's 10% of his objects, he must have 290 billion objects stored.

Except that, oops, S3 only passed 100 billion objects a couple of months ago, and at its current rate of growth is probably still under 150 billion, never mind 200 or 290 billion.

My guess is that the "FAIL" here is whatever process he's using for deleting files -- not in S3 itself.
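For anyone who wants to check cperciva's back-of-envelope numbers, here is a minimal sketch that reproduces them; the 700 DELETEs/sec rate, the 8x CPU scaling factor, and the 10% figure are the comment's own assumptions, not independent measurements:

```python
# Reproduce the estimate from the comment above. All inputs are the
# commenter's assumptions, not measured values.

deletes_per_sec_small = 700         # observed on a small EC2 instance
cpu_scale_xl = 8                    # assumed linear scaling to an XL
instances = 2                       # two XL instances running deletes
seconds_per_month = 30 * 24 * 3600  # ~2.59 million seconds

rate = deletes_per_sec_small * cpu_scale_xl * instances  # 11,200/sec
per_month = rate * seconds_per_month                     # ~29.0 billion

# If a month of that only cleared 10% of the bucket, the implied
# object count is ~290 billion -- above S3's total at the time.
implied_objects = per_month / 0.10

print(f"{rate:,} DELETEs/sec -> {per_month:,} DELETEs/month")
print(f"implied total objects: {implied_objects:,.0f}")
```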
danielrhodes about 15 years ago
I have also had problems deleting large amounts of data from S3.

The problem underlying the author's problem is that S3 is quite slow and you can't do batch deletes. On top of that, they limit the number of API requests you can make per second. In some cases it has been faster to download an object and re-upload it than to do a copy.
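A minimal sketch of the one-DELETE-per-object pattern the comment describes, written against today's boto3 client rather than the 2010-era API; the bucket name and prefix are hypothetical placeholders. Each object costs a full API round trip, which is why large deletions crawl:

```python
# Sketch: deleting S3 objects one request at a time, as described
# above. Requires boto3 and configured AWS credentials. The bucket
# and prefix below are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
bucket = "my-bucket"  # placeholder

deleted = 0
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix="logs/"):
    for obj in page.get("Contents", []):
        # One HTTP round trip per object -- the bottleneck at scale,
        # and each request also counts against the API rate limit.
        s3.delete_object(Bucket=bucket, Key=obj["Key"])
        deleted += 1

print(f"deleted {deleted} objects")
```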
ck2 about 15 years ago
$1500 would buy a really nice server or two on a colo rack.