I'm fairly certain this has been a feature of S3 since its launch in 2006. Here's the earliest capture I could find on archive.org[0]: it's from April 21, 2006, and the URL indicates the documentation was published in March 2006.<p>[0] <a href="http://web.archive.org/web/20060421112025/http://docs.amazonwebservices.com/AmazonS3/2006-03-01/" rel="nofollow">http://web.archive.org/web/20060421112025/http://docs.amazon...</a>
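For reference, S3 exposes the torrent as a sub-resource: appending ?torrent to a publicly readable object's GET URL returns its .torrent file. A minimal sketch using boto3's get_object_torrent (bucket and key names here are hypothetical):<p><pre><code>import boto3

# Fetch the .torrent metadata for an S3 object.
# Bucket/key are hypothetical; the object must be publicly
# readable for peers to retrieve pieces from S3's seeder.
s3 = boto3.client("s3")
resp = s3.get_object_torrent(Bucket="example-bucket",
                             Key="big/dataset.tar.gz")

with open("dataset.tar.gz.torrent", "wb") as f:
    f.write(resp["Body"].read())
</code></pre><p>The same file is also available over plain HTTP at https://example-bucket.s3.amazonaws.com/big/dataset.tar.gz?torrent.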
Anyone downloading software or other content over a slow, flaky network would question why people don't use torrents for distribution, with some "permanent" seeder.<p>I had a horrible experience some time back while downloading a piece of software of around 2 GB: the network would die and Chrome would discard the partial download.
Just moved a bunch of our company's large downloads over to S3 recently (mostly BSP releases and drivers), and this would be awesome for the 100 MB - 4 GB package size range they cover!<p>Tried it with a few clients: Transmission does not seem to work (both the web client and GTK): "Tracker gave HTTP response code 404..."; rtorrent seems to be able to download (but apparently no upload?); Deluge downloads well and also kickstarts the Transmission clients.<p>I wonder what it is about the Amazon infrastructure that Transmission can't handle... And the 80 kb/s max seed rate mentioned in the comments might be a showstopper already.
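One way to debug those tracker 404s is to look at the announce URL S3 bakes into the torrent. A rough sketch using the bencode.py package (the file name is hypothetical):<p><pre><code>import bencodepy  # pip install bencode.py

# Decode the .torrent S3 returned and print the tracker announce
# URL -- the endpoint Transmission is reportedly getting 404s from.
meta = bencodepy.decode(open("dataset.tar.gz.torrent", "rb").read())
print("announce:", meta[b"announce"].decode())
print("piece length:", meta[b"info"][b"piece length"])
</code></pre>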
I'm currently using this to build a decentralized distribution system to ship relatively big database files to iOS clients without having to go through S3 all the time. It works quite well, but some providers throttle BitTorrent traffic (or worse, throw in RST packets), so I'm not sure how well this will work in practice. Also note that the S3 default seeders only upload at around 80 kb/s, so you'll always need at least one external seed to get good performance.
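For that external seed, libtorrent's Python bindings can run a long-lived seeder alongside the S3 ones. A minimal sketch, assuming the complete payload already sits in ./data so the client hash-checks it and goes straight to seeding (paths are hypothetical):<p><pre><code>import time
import libtorrent as lt

# Long-running external seeder to supplement S3's ~80 kb/s seeders.
ses = lt.session({"listen_interfaces": "0.0.0.0:6881"})
info = lt.torrent_info("db-snapshot.torrent")

# If ./data already contains the complete payload, libtorrent
# verifies the pieces and starts seeding instead of downloading.
h = ses.add_torrent({"ti": info, "save_path": "./data"})

while True:
    s = h.status()
    print(f"state={s.state} up={s.upload_rate / 1000:.1f} kB/s")
    time.sleep(10)
</code></pre>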
A bit OT but...<p><i>Javascript is disabled or is unavailable in your browser.
To use the AWS Documentation, Javascript must be enabled.</i><p>...really? Their documentation is perfectly readable without it. All the links are real, bookmarkable links; even the buttons for the PDF, forums, and Kindle version work.
Are there easy-to-use multiplatform BT libs around that would let you use this just for high-bandwidth, nearby peers? E.g., filter peers by latency as a first pass.
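One cheap first pass is to time a TCP handshake to each peer and keep only the fast ones; a hand-rolled sketch using just the stdlib (the peer list is hypothetical, e.g. parsed from a tracker announce response):<p><pre><code>import socket
import time

def connect_latency(host, port, timeout=2.0):
    """Rough RTT proxy: time a TCP handshake to the peer."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None  # treat unreachable peers as filtered out

# Hypothetical peers, e.g. from a tracker announce response.
peers = [("192.0.2.10", 6881), ("198.51.100.7", 51413)]
nearby = [p for p in peers
          if (lat := connect_latency(*p)) is not None and lat < 0.05]
print(nearby)
</code></pre>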
I've used this in the past to distribute scrapes of black markets (e.g. <a href="https://www.reddit.com/r/DarkNetMarkets/comments/2zps7q/evolution_forums_mirrorscrapes_torrent_released/" rel="nofollow">https://www.reddit.com/r/DarkNetMarkets/comments/2zps7q/evol...</a> & <a href="https://www.reddit.com/r/DarkNetMarkets/comments/2zllmv/evolution_market_mirrorscrapes_torrent_released/" rel="nofollow">https://www.reddit.com/r/DarkNetMarkets/comments/2zllmv/evol...</a>). It saves a lot of bandwidth (which is not all <i>that</i> cheap on S3), and so far the <5 GB restriction hasn't been an issue.
Wow! This may lower the barrier to entry for a new movie/TV streaming company. They could cut their costs by offloading bandwidth to users. Legal torrent streaming might have just gotten the boost it needs.
I once read an article or email about why web browsers can't/won't implement BitTorrent clients -- one of the biggest barriers IMO to making BitTorrent more ubiquitous for file download/upload is the fact that I can't just click on a link and have my browser handle the rest, like I can with an HTTP/FTP link.<p>Does anyone have a link that explains why browsers won't do this, or at least a brief explanation?
Apart from me, is anyone else hoping to see <i>downloading</i> straight into an S3 bucket? (Seeding from a bucket has been around for some time; you can imagine my disappointment when clicking the link.)
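In the meantime, the closest approximation is to run the torrent client somewhere yourself (e.g. an EC2 box) and push the finished payload into the bucket; a minimal sketch with boto3 (bucket, key, and path are hypothetical):<p><pre><code>import boto3

# After a torrent download completes (e.g. via the libtorrent loop
# above), upload the payload; bucket/key/path are hypothetical.
s3 = boto3.client("s3")
s3.upload_file("./data/db-snapshot.sqlite",
               "example-bucket", "snapshots/db-snapshot.sqlite")
</code></pre>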