
Amazon S3 Batch Operations

75 points by jeffbarr about 6 years ago

5 comments

zacharyozer about 6 years ago
Batch delete, batch delete, wherefore art thou batch delete?
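For what it's worth, plain batch delete has been available outside of Batch Operations via S3's multi-object DeleteObjects API (up to 1,000 keys per request). A minimal sketch, assuming boto3 and valid AWS credentials for the actual call; `delete_payloads` and `delete_keys` are illustrative helper names:

```python
def delete_payloads(keys, batch_size=1000):
    """Build DeleteObjects request bodies; S3 caps each call at 1,000 keys."""
    return [
        {"Objects": [{"Key": k} for k in keys[i:i + batch_size]], "Quiet": True}
        for i in range(0, len(keys), batch_size)
    ]

def delete_keys(bucket, keys):
    """Issue one DeleteObjects call per 1,000-key batch (needs AWS credentials)."""
    import boto3  # imported here so the payload helper stays dependency-free
    s3 = boto3.client("s3")
    for payload in delete_payloads(keys):
        s3.delete_objects(Bucket=bucket, Delete=payload)
```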
usr1106 about 6 years ago
Symptomatic of this business that cost is not mentioned anywhere in the announcement. I am getting more and more skeptical of all things serverless because the cost is really difficult to estimate, plan, and manage. Of course, used right, some of these services can be cost-efficient. But in real life not all software is done right...

If you buy a server and run a poorly architected system on it, you notice that it does not perform and need to make changes.

If you use serverless and run a poorly architected system on it, you pay and you need to make changes (after someone notices the bill). Yes, there are cost reports, but they are not easy to use and understand. With a performance bottleneck, the system limits you while you are trying to understand the performance measurements. In the cloud case, you are paying while trying to understand what is wrong.

Of course, in a big corporation money does not matter to a software developer. But in a small company the bill paid to the cloud provider might have a direct impact on the company being able to pay your salary in the near future.
social_quotient about 6 years ago
“Invoking AWS Lambda Functions ... I can invoke a Lambda function for each object, and that Lambda function can programmatically analyze and manipulate each object.”

Wow thanks!
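The per-object Lambda invocation quoted above uses a specific event/response shape: Batch Operations calls the function once per task and expects a result code back. A minimal handler sketch, assuming the documented S3 Batch Operations invocation schema; the actual per-object processing is left as a placeholder:

```python
from urllib.parse import unquote_plus

def handler(event, context):
    """Lambda handler invoked by S3 Batch Operations, one task per invocation."""
    task = event["tasks"][0]
    bucket = task["s3BucketArn"].split(":::")[-1]
    key = unquote_plus(task["s3Key"])  # object keys arrive URL-encoded
    # ... analyze or manipulate s3://{bucket}/{key} here ...
    return {
        "invocationSchemaVersion": event["invocationSchemaVersion"],
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": [{
            "taskId": task["taskId"],
            "resultCode": "Succeeded",
            "resultString": f"processed s3://{bucket}/{key}",
        }],
    }
```

The `resultCode` (`Succeeded`, `TemporaryFailure`, or `PermanentFailure`) is what shows up per object in the job's completion report.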
seancoleman about 6 years ago
A few months back, I designed a small background system requiring a flat key/value store for tracking large amounts of data (>10GB/day). I was hoping to use S3 as a cheap key/value store, but the lack of batch operations, requiring individual puts, made it performance-prohibitive, so I went with DynamoDB. It's worked out great but I'll always wonder what could have been with S3 if I had batch operations back then.
moes_dev about 6 years ago
I was hoping to use this for moving large video files to a different prefix, but just spotted a limitation of the PUT Object Copy - "Objects to be copied can be up to 5 GB in size."

Cool feature otherwise.
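The 5 GB ceiling applies to the single-request CopyObject; larger objects can still be copied with multipart upload plus UploadPartCopy, where each part references a byte range of the source. A sketch, assuming boto3 (whose managed `client.copy()` does the multipart switch automatically above a size threshold); `copy_part_ranges` is an illustrative helper:

```python
def copy_part_ranges(size, part_size=5 * 1024**3):
    """Byte-range headers for UploadPartCopy; every part but the last is part_size."""
    ranges, start = [], 0
    while start < size:
        end = min(start + part_size, size) - 1
        ranges.append(f"bytes={start}-{end}")
        start = end + 1
    return ranges

def copy_large_object(src_bucket, src_key, dst_bucket, dst_key):
    """Copy an object of any size (needs AWS credentials)."""
    import boto3  # imported here so the range helper stays dependency-free
    s3 = boto3.client("s3")
    # boto3's managed copy uses multipart (UploadPartCopy) for large objects,
    # so this works past the 5 GB single-request CopyObject limit
    s3.copy({"Bucket": src_bucket, "Key": src_key}, dst_bucket, dst_key)
```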