Many tools, libraries, and frameworks (Docker, Chef, Vagrant, etc.) download files from S3 on the fly. The download links all use the (otherwise awesome) https scheme, but a normal proxy can't cache https downloads. Any idea how to tackle this?

For example, when using Vagrant and Chef, each VM runs some cookbooks, and it's quite common for a cookbook to trigger a wget download in every VM. And when you tear down the whole environment and start from scratch? Painful.

A possible solution: a wget alternative that sends the file URL to a cache server explicitly, plus a customized cache server that reads that parameter and streams the file either from its local cache or from the internet (rough sketch at the end of this post).

Any ideas? Thanks in advance.
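To make the idea concrete, here is a minimal sketch of such a cache server in Python, using only the standard library. Everything here is an assumption of mine, not an existing tool: the /fetch?url= endpoint, the CACHE_DIR location, and the port are all made up for illustration.

    import hashlib
    import os
    import shutil
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs, urlparse

    CACHE_DIR = "/var/cache/urlcache"  # assumed location; any writable dir works

    class CacheHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expect requests like: GET /fetch?url=<percent-encoded URL>
            query = parse_qs(urlparse(self.path).query)
            target = query.get("url", [None])[0]
            if not target:
                self.send_error(400, "missing ?url= parameter")
                return

            # Key the on-disk cache by a hash of the full URL. Because the
            # server itself makes the https request to the origin, TLS is
            # no obstacle to caching here.
            key = hashlib.sha256(target.encode()).hexdigest()
            path = os.path.join(CACHE_DIR, key)

            if not os.path.exists(path):
                # Cache miss: download once and store on disk.
                tmp = path + ".tmp"
                with urllib.request.urlopen(target) as resp, open(tmp, "wb") as out:
                    shutil.copyfileobj(resp, out)
                os.rename(tmp, path)  # publish only after a complete download

            # Hit (or freshly filled): stream the local copy back to the client.
            self.send_response(200)
            self.send_header("Content-Length", str(os.path.getsize(path)))
            self.end_headers()
            with open(path, "rb") as f:
                shutil.copyfileobj(f, self.wfile)

    if __name__ == "__main__":
        os.makedirs(CACHE_DIR, exist_ok=True)
        HTTPServer(("0.0.0.0", 8123), CacheHandler).serve_forever()

The "wget alternative" on the VM side could then be as simple as a curl one-liner against a hypothetical cache host (bucket and filename made up for the example):

    curl -o file.tar.gz 'http://cachehost:8123/fetch?url=https%3A%2F%2Fs3.amazonaws.com%2Fmybucket%2Ffile.tar.gz'

The point of the design is that TLS to S3 is terminated exactly once, on the cache server, so https stops defeating the cache. It's nowhere near production-ready (no error handling, no eviction, no locking against concurrent downloads of the same URL), just an illustration of the scheme.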