TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.
Ask HN: S3 but with append?

2 points by dhbradshaw over 6 years ago

Ideally, append-only.

Basically, what do you do if you want to pool a bunch of streams into a (potentially very large) log file?

3 comments

abd12 over 6 years ago

You can use Kinesis Firehose to stream data to S3. It'll buffer data for a while -- you set thresholds based on size of data or time, whichever is hit first -- then it will save the data to S3.

It won't be a single large file, but they'll all have the same prefix based on date. Most data processing tools will let you suck up an entire prefix and treat it like a single file.
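The Firehose approach above can be sketched in a few lines. This is a minimal sketch assuming boto3 and an already-provisioned delivery stream (hypothetically named "my-log-stream") whose S3 destination and buffering thresholds were configured when the stream was created; the code only appends records, Firehose does the batching.

```python
def send_line(firehose, stream_name: str, line: str) -> None:
    """Append one log line to a Kinesis Firehose delivery stream.

    Firehose concatenates records as-is when it flushes to S3, so we
    add the trailing newline ourselves to keep the resulting objects
    line-delimited.
    """
    firehose.put_record(
        DeliveryStreamName=stream_name,
        Record={"Data": line.rstrip("\n").encode("utf-8") + b"\n"},
    )

if __name__ == "__main__":
    import boto3  # requires AWS credentials; stream name is a placeholder
    client = boto3.client("firehose")
    send_line(client, "my-log-stream", "event=login user=42")
```

The client is passed in as a parameter, which keeps the function testable without AWS access.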
idunno246 over 6 years ago

If you know the size ahead of time, you can use multipart uploads. Otherwise you would have to buffer to disk. You could also consider Kinesis Firehose, which has dumping to S3 built in.

The Google Cloud Storage API has a mode where you can stream bytes, and the object doesn't become visible until you close it (and then it can't be modified, like S3). And unlike S3, it requires a delete permission to be able to overwrite, though with S3 you can turn on versioning and not grant DeleteObjectVersion.
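The multipart-upload route can be sketched as below: parts are uploaded as they are produced and stitched into one object at the end. This is a sketch assuming boto3; the bucket and key are placeholders, and note that S3 requires every part except the last to be at least 5 MiB.

```python
def multipart_upload(s3, bucket: str, key: str, parts):
    """Upload an iterable of byte chunks as a single S3 object.

    Each chunk becomes one part; S3 rejects non-final parts smaller
    than 5 MiB, so the caller must size chunks accordingly.
    """
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
    upload_id = upload["UploadId"]
    etags = []
    for number, chunk in enumerate(parts, start=1):
        resp = s3.upload_part(
            Bucket=bucket, Key=key, UploadId=upload_id,
            PartNumber=number, Body=chunk,
        )
        # S3 needs the part number + ETag pairs to assemble the object.
        etags.append({"PartNumber": number, "ETag": resp["ETag"]})
    s3.complete_multipart_upload(
        Bucket=bucket, Key=key, UploadId=upload_id,
        MultipartUpload={"Parts": etags},
    )
    return etags
```

In practice you would also call `abort_multipart_upload` on failure so unfinished parts don't accrue storage charges.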
nikonyrh over 6 years ago

You could always partition them into chunks and upload them to S3 as individual objects.