
Show HN: Ingest data from your customers (Prequel YC W21)

40 points, by ctc24, about 2 years ago
Hey HN! Charles here from Prequel (https://prequel.co). We just launched the ability for companies to import data from their customers' data warehouses or databases, and we wanted to share a little bit more about it with the community.

If you just want to see how it works, here's a demo of the product that Conor recorded: https://www.loom.com/share/4724fb62583e41a9ba1a636fc8ea92f1.

Quick background on us: we help companies integrate with their customers' data warehouses or databases. We've been busy helping companies export data to their customers – we're currently syncing over 40bn rows per month on behalf of companies. But folks kept asking us if we could help them import data from their customers too. They wanted to offer a first-party reverse ETL to their customers, similar to the first-party ETL capability we already helped them offer. So we built that product, and here we are.

Why would people want to import data? There are plenty of use cases. Imagine a usage-based billing company that needs a daily pull of all the billing events that happened on its customers' side, so that it can generate the relevant invoices. Or a fraud detection company that needs the latest transaction data from its customers so it can flag fraudulent transactions.

There's no great way to import customer data today; people typically solve it in one of two ways. The first is importing data via CSV. This works well enough, but it requires ongoing work on the part of the customer: they need to put a CSV together and upload it to the right place on a daily/weekly/monthly basis. That is painful and time-consuming, especially for data that needs to be imported continuously. The second is making the customer write custom code to feed data to the company's API. This requires the customer to do a bunch of solutions-engineering work just to get started with the product – a suboptimal onboarding experience.

So instead, we let the customer connect their database or data warehouse and we pull data directly from there, on an ongoing basis. They select which tables to import (and potentially map some columns to required fields), and that's it. The setup takes only 5 minutes and requires no ongoing work. We feel that's the kind of experience every company should provide when onboarding a new customer.

Importing all this data continuously is non-trivial, but thankfully we can reuse 95% of the infrastructure we built for data exports. Our core transfer logic stays pretty much exactly the same; all we had to do was ship new CRUD endpoints in our API layer to let users configure their source/destination. As a brief reminder about our stack, we run a Go backend and a TypeScript/React frontend on k8s.

In terms of technical design, the most challenging decisions were around making databases' type systems play nicely with each other (an evergreen problem, really). For imports, we let the data recipient specify whether they want to receive the data as a JSON blob or as a nicely typed table. If they choose the latter, they specify exactly which columns they expect, as well as the type guarantees those columns should uphold. We're also working on the ability to feed that data directly into an API endpoint, and on post-ingestion validation logic.
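To make the typed-table option concrete, here's a minimal sketch of what such an import spec could look like. The struct names, fields, and type strings below are illustrative assumptions, not Prequel's actual API.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ColumnSpec is a hypothetical declaration of one expected column and the
// type guarantee it should uphold.
type ColumnSpec struct {
	Name     string `json:"name"`
	Type     string `json:"type"`     // e.g. "string", "timestamp", "bigint"
	Nullable bool   `json:"nullable"` // whether NULL values are acceptable
}

// ImportSpec is a hypothetical import configuration: receive the data as a
// raw JSON blob, or as a typed table with explicit column guarantees.
type ImportSpec struct {
	Table   string       `json:"table"`
	Mode    string       `json:"mode"` // "json_blob" or "typed_table"
	Columns []ColumnSpec `json:"columns,omitempty"`
}

func main() {
	spec := ImportSpec{
		Table: "billing_events",
		Mode:  "typed_table",
		Columns: []ColumnSpec{
			{Name: "event_id", Type: "string", Nullable: false},
			{Name: "occurred_at", Type: "timestamp", Nullable: false},
			{Name: "amount_cents", Type: "bigint", Nullable: false},
		},
	}

	out, err := json.MarshalIndent(spec, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```

Declaring the expected columns and their nullability up front is what allows type guarantees to be enforced at ingestion time rather than surfacing as downstream breakage.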
We've mentioned this before, but it bears repeating: we know that security and privacy are paramount here. We're SOC 2 Type II certified, and we go through annual white-box pentests to make sure all our code is up to snuff. We never store any of the data on our servers. Finally, we offer on-prem deployments, so the data never even has to touch our servers if our customers don't want it to.

We're really stoked to be sharing this with the community. We'll be hanging out here for most of the day, but you can also reach us at hn (at) prequel.co if you have any questions!
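For a concrete picture of the "connect a database, pick tables, map columns" setup described above, here's a minimal sketch of registering a source against a hypothetical CRUD endpoint. The URL, payload shape, and auth header are assumptions made for illustration only.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Hypothetical source-configuration payload: connection details plus the
	// tables the customer has chosen to share, with an optional mapping from
	// their column names to the fields the recipient requires.
	payload := map[string]interface{}{
		"vendor":   "postgres",
		"host":     "db.customer.example.com",
		"port":     5432,
		"database": "analytics",
		"username": "prequel_reader",
		"tables": []map[string]interface{}{
			{
				"name":           "billing_events",
				"column_mapping": map[string]string{"evt_ts": "occurred_at"},
			},
		},
	}
	body, err := json.Marshal(payload)
	if err != nil {
		panic(err)
	}

	// Illustrative endpoint; not Prequel's real API.
	req, err := http.NewRequest("POST", "https://api.example.com/v1/sources", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer <api-key>")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```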

6 comments

alexpetralia, about 2 years ago
So is this like Fivetran, except between clients as opposed to vendor-client?

If so, any idea why most data integration tools have not done this (or have they)? What is so tricky that they could not extend their tools to cover a customer's Postgres database?
gregw2, about 2 years ago
Is it a full refresh from the source each time, or is it incremental, and if incremental, what assumptions do you make or not make about keys, dupes, etc?
mdaniel, about 2 years ago
So is the value-add that the customers of Company-A (who is *your* customer) entrust *you* with credentials to their databases versus entrusting Company-A with them directly?
ttpphd, about 2 years ago
If one side of your business is ingesting data, is the other side excreting it?
lisasays, about 2 years ago
*The setup only takes 5 minutes*

Nothing ever takes 5 minutes. Remember, these are engineers you're talking to here.
conormccarter, about 2 years ago
Hey HN -- Conor here, aka the guy from the demo. Happy to answer any questions you have!