You can use Kinesis Firehose to stream data to S3. It'll buffer data for a while -- you set thresholds on data size and elapsed time, and it flushes when whichever is hit first -- then it saves the buffered data to S3.

The output won't be a single large file, but the files will all share a date-based prefix. Most data processing tools will let you suck up an entire prefix and treat it like a single file.
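For concreteness, here's a minimal boto3 sketch of that setup. The stream name, ARNs, prefix, and thresholds are all placeholders:

```python
import boto3

firehose = boto3.client("firehose")

# Create a delivery stream that buffers and writes to S3.
# All names and ARNs here are hypothetical.
firehose.create_delivery_stream(
    DeliveryStreamName="events-to-s3",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-s3-role",
        "BucketARN": "arn:aws:s3:::my-bucket",
        "Prefix": "events/",  # objects land under a date path below this prefix
        "BufferingHints": {
            "SizeInMBs": 64,           # flush once the buffer hits 64 MB...
            "IntervalInSeconds": 300,  # ...or after 5 minutes, whichever first
        },
    },
)

# Producers then just push records at the stream:
firehose.put_record(
    DeliveryStreamName="events-to-s3",
    Record={"Data": b'{"event": "example"}\n'},
)
```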
If you know the size ahead of time, you can use multipart uploads; otherwise you'd have to buffer to disk first (sketch below). You could also consider Kinesis Firehose, which has dumping to S3 built in.

The Google Cloud Storage API has a mode where you can stream bytes, and the object doesn't become visible until you close it (and after that it can't be modified, like S3). And unlike S3, GCS requires a delete permission to overwrite an existing object, though with S3 you can turn on versioning and withhold s3:DeleteObjectVersion, so overwrites just stack new versions instead of destroying old data.
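Here's a rough sketch of the multipart route with boto3, buffering one part at a time in memory rather than the whole object. Bucket, key, and part size are placeholders; every part except the last must be at least 5 MiB:

```python
import boto3

s3 = boto3.client("s3")

BUCKET = "my-bucket"          # hypothetical bucket
KEY = "stream/output.bin"     # hypothetical key
PART_SIZE = 8 * 1024 * 1024   # parts other than the last must be >= 5 MiB

def stream_to_s3(chunks):
    """Upload an iterable of byte chunks without knowing the total size."""
    upload = s3.create_multipart_upload(Bucket=BUCKET, Key=KEY)
    upload_id = upload["UploadId"]
    parts, buf, part_no = [], bytearray(), 1

    def send(data):
        nonlocal part_no
        resp = s3.upload_part(
            Bucket=BUCKET, Key=KEY, UploadId=upload_id,
            PartNumber=part_no, Body=bytes(data),
        )
        parts.append({"ETag": resp["ETag"], "PartNumber": part_no})
        part_no += 1

    try:
        for chunk in chunks:
            buf.extend(chunk)
            if len(buf) >= PART_SIZE:
                send(buf)
                buf.clear()
        if buf:  # the final part is allowed to be under 5 MiB
            send(buf)
        # The object only appears in the bucket once this call succeeds.
        s3.complete_multipart_upload(
            Bucket=BUCKET, Key=KEY, UploadId=upload_id,
            MultipartUpload={"Parts": parts},
        )
    except Exception:
        s3.abort_multipart_upload(Bucket=BUCKET, Key=KEY, UploadId=upload_id)
        raise
```

And the GCS streaming mode, using the google-cloud-storage client's file-like writer -- nothing shows up in the bucket until the writer is closed and the upload finalizes. Bucket and blob names are again placeholders, and produce_chunks is a stand-in data source:

```python
from google.cloud import storage

def produce_chunks():
    # Stand-in for a real data source.
    for i in range(4):
        yield f"record {i}\n".encode()

client = storage.Client()
blob = client.bucket("my-bucket").blob("stream/output.bin")

# Bytes go out in chunks as a resumable upload; the object only
# becomes visible once the writer closes and the upload finalizes.
with blob.open("wb", chunk_size=8 * 1024 * 1024) as f:
    for chunk in produce_chunks():
        f.write(chunk)
```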