TechEcho

Ask HN: AWS S3 billed us over $1k. What cheaper alternatives do you recommend?

15 points by zeeshanm over 10 years ago
I run an image sharing website and AWS recently billed us over $1k for bandwidth costs: http://i.imgur.com/sGrYboT.png

What cheaper alternatives do you recommend?
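A quick back-of-envelope check, assuming S3 egress cost roughly $0.09/GB at the time (an assumed rate; the billing screenshot isn't reproduced here): a ~$1k bandwidth bill implies on the order of 11 TB of monthly transfer, which lines up with the figure mentioned in the comments.

```python
# Back-of-envelope: transfer volume implied by a ~$1,000 S3 bandwidth bill.
# The $0.09/GB egress rate is an assumption (approximate 2015-era pricing);
# check your own bill for the actual per-GB charge.
S3_EGRESS_PER_GB = 0.09

def implied_transfer_tb(monthly_bill_usd: float,
                        price_per_gb: float = S3_EGRESS_PER_GB) -> float:
    """Approximate monthly transfer (decimal TB) implied by a bandwidth bill."""
    return monthly_bill_usd / price_per_gb / 1000

print(round(implied_transfer_tb(1000), 1))  # -> 11.1 (about 11 TB/month)
```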

17 comments

davismwfl over 10 years ago
While you might be able to reduce costs through dedicated services, it also means you are going to have to write more code, maintain it, and maintain the machines, etc. Don't underestimate the significant cost of doing this reliably and maintaining it over time as things grow. If you do need to go that route, I might suggest at least at first using csync2 with lsync on Linux to manage the distribution of the files across the machines. We use this even in AWS for a site; it works reliably, is quick, and was easier than managing a distributed FS.

But before going that route I'd first try setting up a proper CDN for the images and take advantage of the caching settings so you can reduce your S3 bandwidth charges.

Also, if you don't already have it, I'd set up some monitoring on your AWS account to give you a weekly update on account charges & usage so you can see what is going on.
willejs over 10 years ago
Set up a cheap CDN (MaxCDN is a good bet, as someone else has mentioned); this will roughly halve your monthly costs. This site will help: http://www.cdncalc.com/

I would stick with S3 as the origin to start with; setting up a bunch of servers to reliably store and serve data is a pain, and one you should avoid if you can. On the CDN, enable origin shielding, and set the TTL of the images to never expire if you can. This way each image is served from S3 only once. When users upload, link to the CDN'd asset; it will pre-warm the CDN for subsequent requests.

Without knowing the trends in your traffic, how much data you are storing, etc., it's hard to give really good advice.
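To put rough numbers on the "halve your monthly costs" claim, here is a sketch using assumed per-GB rates (roughly 2015-era list prices, illustration only; plug in real quotes from cdncalc.com for your actual traffic):

```python
# Compare serving ~11 TB/month directly from S3 vs. through a budget CDN.
# Both per-GB rates below are assumptions for illustration only.
MONTHLY_TRANSFER_GB = 11_000
S3_EGRESS_PER_GB = 0.09   # assumed S3 bandwidth rate
CDN_PER_GB = 0.045        # assumed budget-CDN rate (e.g. a MaxCDN-class tier)

s3_cost = MONTHLY_TRANSFER_GB * S3_EGRESS_PER_GB
cdn_cost = MONTHLY_TRANSFER_GB * CDN_PER_GB
print(f"S3 direct: ${s3_cost:.0f}/mo, via CDN: ${cdn_cost:.0f}/mo")
# With long TTLs and origin shielding, each object is fetched from S3 roughly
# once, so origin egress becomes negligible and the CDN rate dominates:
# about half the direct-from-S3 cost in this example.
```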
(Comment #8801278 not loaded)
radq over 10 years ago
What we do is run Varnish in front of S3 to cache files on machines with cheaper bandwidth: https://github.com/hummingbird-me/ansible-deploy/blob/master/roles/varnish/templates/default.vcl

It is very little work to set up, and you don't need to modify your code or migrate your files to a different service. Setting up a CDN might have been easier, but this works out cheaper for us.
vitovito over 10 years ago
Host it yourself with dedicated servers in a colocation center.

S3 is expensive for use cases like an image sharing service. Running your own servers with dedicated, unmetered bandwidth (or at least metered bandwidth in the 20TB+ range) is cheaper.

If 11TB of network transfer is spread out evenly over the month, a 100mbit uplink would handle it with plenty of room to spare. (Your traffic is probably not evenly distributed; it's probably very bursty.)
nodesocket over 10 years ago
Set up an origin box (or a few behind a load balancer) running nginx. Then use a CDN (we love MaxCDN) to pull from the origin. Make sure you set up the cache headers right in nginx. Something like:

    location ~* \.(?:ico|js|css|gif|jpe?g|png|xml)$ {
        expires 30d;
        add_header Pragma public;
        add_header Cache-Control "public, must-revalidate, proxy-revalidate";
    }
andsmi2 over 10 years ago
This is a lot of data to transfer. As others have suggested, use caching and CloudFront. And start figuring out how to pay for it (advertising or charging customers). I would think that with that much data being transferred there should already be a revenue stream to cover $1k a month, or perhaps it isn't scalable.
olalonde over 10 years ago
Another option that was not mentioned is to make sure your HTTP server has caching set up properly.
JonM over 10 years ago
Perhaps consider an enterprise agreement with AWS? We've saved $large without needing to switch providers, just by committing to a minimum monthly volume of data transfer.

Worth seeing the numbers before you invest in dev & operations.
iSloth over 10 years ago
Go buy some cheap dedicated servers from places like OVH and create your own fairly simple CDN/hosting. You could easily cut that cost by 70%+.

Or is there a specific S3 feature you need replicated?
(Comment #8800514 not loaded)
bhaumik over 10 years ago
Not exactly an alternative, but you can get $1000 off by finishing two entrepreneurship courses on edX: https://www.edx.org/AWS-activate
(Comment #8802082 not loaded)
zeeshanm over 10 years ago
Thanks for all the ideas. For now we are going to route traffic via CloudFront and set the cache to almost never expire. We've also compressed images and it looks good so far.
Terretta over 10 years ago
Other suggestions here are good. In the meantime, start saving money instantly by fronting S3 with AWS CloudFront instead of serving images directly from S3.
rememberlenny over 10 years ago
People have mentioned a CDN or Varnish. You need something that will cache the images at the CDN level to reduce traffic. Cloudflare, Fastly, etc.
x3sphere over 10 years ago
I'd recommend 100TB.com; you can even get a box in SoftLayer's datacenter from there for $50 extra a month.
hubot over 10 years ago
CDN would help a lot in this case and is a lot cheaper.
infinitone over 10 years ago
I find DigitalOcean pretty cheap.
ddorian43 over 10 years ago
RunAbove (a cheap S3 alternative) or SoYouStart (cheap 200mbit servers).