We are looking for someone who likes building distributed systems to crawl the web to help people make smarter purchase decisions.

Crawling & indexing millions of pages per day is not an easy task, but you're good at it. You enjoy optimizing systems, making them perform faster, and appreciate git commits where more code is removed than added.

Our backend stack currently consists of Postgres, Celery, Django, and ElasticSearch. Last week, we had a company hackathon to see who could build the fastest autocompleter to search a database of 2MM products. The fastest returned results in 31ms.

We ship code quickly. It breaks sometimes, but we're very quick to fix it. See the following examples:

* http://news.ycombinator.com/item?id=3930043

* http://news.ycombinator.com/item?id=3384686

* http://news.ycombinator.com/item?id=3384401

If you enjoy early-stage companies (you'll have a huge impact) and hard scalability problems, send us a message at omar@priceonomics.com. We'd be thrilled to hear from you. Or check out our jobs page: http://priceonomics.com/jobs

tl;dr: write fast code, acquire glory: omar@priceonomics.com.