I was considering something like this as a local (or locally networked) cache store for S3. I am doing 99.9% reads, so it could be nice to put the cache at the S3 layer.

Does anyone know if that exists?
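To make concrete what I mean by caching at the S3 layer, here is a rough sketch of a read-through cache, assuming boto3 and a local directory as the cache store (the bucket, key, cache path, and function name are all hypothetical, not from any existing tool):

```python
import hashlib
import os

import boto3

CACHE_DIR = "/var/cache/s3"  # hypothetical local cache location
s3 = boto3.client("s3")

def cached_get(bucket: str, key: str) -> bytes:
    """Return object bytes, serving repeat reads from the local cache."""
    cache_path = os.path.join(
        CACHE_DIR, hashlib.sha256(f"{bucket}/{key}".encode()).hexdigest()
    )
    if os.path.exists(cache_path):  # cache hit: skip the S3 round trip
        with open(cache_path, "rb") as f:
            return f.read()
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()  # cache miss
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(cache_path, "wb") as f:  # populate the cache for next time
        f.write(body)
    return body
```

The appeal of doing this behind the S3 API itself, rather than in each client like the sketch above, is that existing clients would not have to change at all.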
How does this work when the object stores don't support all the features of S3?

For instance, Atmos doesn't have public buckets, DNS names for buckets, named keys implemented like S3's, regions, etc.

But it does have features that S3 doesn't, like byte-range updates, erasure coding, etc.
A little help, please.

We have a process that uses S3Express (i.e., it speaks the S3 API) to copy files from the local filesystem to S3.

Would S3Proxy allow that process to copy files to [another_server, atmos, azure]?
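My tentative understanding from the README is that yes, you run S3Proxy in front of the other store and point the existing S3 client at the proxy's endpoint instead of s3.amazonaws.com. Something like this properties-file sketch for an Azure backend, going by S3Proxy's jclouds-based configuration (the account name, keys, and port below are placeholders):

```
# s3proxy.conf (sketch): listen locally, accept S3-style credentials
s3proxy.endpoint=http://127.0.0.1:8080
s3proxy.authorization=aws-v2
s3proxy.identity=local-access-key
s3proxy.credential=local-secret-key

# Backend: Azure Blob Storage via the jclouds azureblob provider
jclouds.provider=azureblob
jclouds.identity=yourstorageaccount
jclouds.credential=your-azure-access-key
```

The existing S3Express job would then keep its S3 commands but target http://127.0.0.1:8080 with the local keys above, and S3Proxy would translate the calls to the backend. Is that right?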
This is great timing for me: I'm currently trying to iron out some bugs in a project that uses S3, and this helps me eliminate one line of inquiry. Nice project!