My cofounder and I used to work at Robinhood, where we shipped the company’s first OAuth integrations, so we know a lot about how data moves between companies.<p>For example, we know that the pain of building new API integrations scales with the level of fragmentation and the number of competing "standards". Right now we see this pain with a lot of AI startups, who invariably need to connect to their customers’ data but have to support 50+ integrations before they even scale to 50+ customers.<p>This is the process for an AI startup to add a new integration for a customer:<p>- Pore over the API docs for each source application and write a connector for each<p>- Play email tag to find the right stakeholders and get them to share sensitive API keys, or to authorize your OAuth app. It can take 6+ weeks for some platforms to review new OAuth apps<p>- Normalize data that arrives in different formats from each source (HTML, XML, text dumps, 3 different flavors of markdown, JSON, etc.)<p>- Figure out what data should be vectorized, what should be stored as SQL, and what should be discarded<p>- Detect when data has been updated and synchronize it<p>- Monitor when pipelines break so data doesn’t go stale<p>This is a LOT of work for something that doesn’t move the needle on product quality.<p>That’s why we built Psychic.dev to be the fastest and most secure way for startups to connect to their customers’ data. You integrate once with our universal APIs and get N integrations with CRMs, knowledge bases, ticketing systems and more, with no incremental engineering effort.<p>We abstract away the quirks of each data source into Document and Conversation data models, and try to strike a balance that allows for deep integrations while maintaining broad utility. Since it’s open source, we encourage founders to fork and extend our data models to fit their needs as they evolve, even if it means migrating off our paid version.<p>To see an example in action, check out our demo repo here: <a href="https://github.com/psychic-api/psychic-langchain-tutorial/">https://github.com/psychic-api/psychic-langchain-tutorial/</a><p>We’re open to contributions: learn more at docs.psychic.dev or email us at founders@psychic.dev!
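To give a flavor of what that looks like in practice, here is a simplified, illustrative Document shape and how it could feed a LangChain vector pipeline like the one in the demo repo. The field names below are approximate, not our exact schema; see docs.psychic.dev for the real data models.

```python
# Illustrative only: an approximate, simplified Document shape. Every connector
# (Notion, Zendesk, Salesforce, ...) emits this same structure, so downstream
# code never needs to branch on which source the data came from.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Document:
    id: str                                       # stable ID from the source system
    connector: str                                # e.g. "notion", "zendesk", "confluence"
    title: str
    content: str                                  # body normalized to plain text / markdown
    uri: Optional[str] = None                     # deep link back to the source record
    metadata: dict = field(default_factory=dict)  # source-specific extras


def to_langchain_docs(docs: list[Document]):
    """Convert normalized documents into LangChain Documents for a vector pipeline."""
    from langchain.schema import Document as LCDocument

    return [
        LCDocument(
            page_content=d.content,
            metadata={"title": d.title, "uri": d.uri, "connector": d.connector, **d.metadata},
        )
        for d in docs
    ]
```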
This looks like a promising idea, and potentially solves a problem I’ve faced recently.<p>It’s been a challenge getting my SaaS app connected to fragmented APIs belonging to many of my customers, each with their own use cases.<p>One of the biggest hurdles I faced was Asana’s API. A customer wanted us to hook into an Asana webhook: when a task was added to their project, they needed the task data pushed to their account on our platform (and vice versa).<p>But because Asana is so “flexible” (ha!), all the field names in their API responses were UUIDs. It was a total nightmare to figure out which key/value pairs were the ones we wanted. I’m not sure if/how Psychic can figure this out.<p>Secondly, maybe it’s just how your landing page is phrased — but this feels like “IFTTT <i>for</i> AI tooling”, rather than “IFTTT <i>powered by</i> AI”.<p>I see a lot more commercial value in the latter direction. To most prospective customers, your headline “Easy to set up” doesn’t mean a React hook and Python SDK. Just give us a REST API! :)
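For illustration, one way to make those opaque keys usable is to fetch the full task when the webhook fires and re-key its custom fields by name. The endpoint and field paths below are from memory of Asana's REST API and should be checked against the current docs:

```python
# Sketch of one workaround: when the webhook fires with a task gid, fetch the
# task and re-key its custom fields by their human-readable names. Endpoint and
# field paths (opt_fields, display_value) may need verifying against Asana's docs.
import os
import requests

ASANA_TOKEN = os.environ["ASANA_TOKEN"]


def readable_custom_fields(task_gid: str) -> dict:
    resp = requests.get(
        f"https://app.asana.com/api/1.0/tasks/{task_gid}",
        headers={"Authorization": f"Bearer {ASANA_TOKEN}"},
        params={"opt_fields": "name,custom_fields.name,custom_fields.display_value"},
    )
    resp.raise_for_status()
    task = resp.json()["data"]
    # Key by field name instead of the opaque gid.
    return {cf["name"]: cf.get("display_value") for cf in task.get("custom_fields", [])}
```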
I have just built a Notion integration that pulls pages into our statically built API documentation website, and it was, frankly, horrible. While the end result works (the team can write docs in the tool they know, the site is built and released from the structure there automatically), it was a lot of pain even to discern children from their parent pages or parse attributes, let alone get databases right.<p>Considering I’ll probably need to get other data in there soon, I’m in the market for Psychic. The question I have, though, is: can you really reconcile the schemas of several apps into one without settling for the lowest common denominator? What do you do about platforms like Notion that don’t even provide webhooks? We settled on polling, but obviously that won’t scale.
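For reference, a polling loop like the one described could look roughly like this, using the official notion-client package; the last_sync handling is an assumption to adapt to however you persist state between builds:

```python
# A minimal sketch of the polling approach, using the official notion-client
# package. last_sync bookkeeping is an assumption; persist it however you
# already track state between builds.
import os
from datetime import datetime

from notion_client import Client

notion = Client(auth=os.environ["NOTION_TOKEN"])


def pages_edited_since(last_sync: datetime) -> list[dict]:
    """Return pages edited after last_sync (timezone-aware UTC), newest first."""
    changed, cursor = [], None
    while True:
        kwargs = {
            "filter": {"property": "object", "value": "page"},
            "sort": {"direction": "descending", "timestamp": "last_edited_time"},
        }
        if cursor:
            kwargs["start_cursor"] = cursor
        resp = notion.search(**kwargs)
        for page in resp["results"]:
            edited = datetime.fromisoformat(page["last_edited_time"].replace("Z", "+00:00"))
            if edited <= last_sync:
                return changed  # results are sorted, so we can stop early
            changed.append(page)
        if not resp.get("has_more"):
            return changed
        cursor = resp["next_cursor"]
```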
The reason to use the Pro hosted plan is for support and the convenience of not needing to self-host? Or is there actual functionality you don't get by self-hosting?
Congrats on the launch! I am curious how you see apps evolving to provide natural language interfaces on top of existing APIs. Also, do you plan on strictly remaining the data layer (between a startup and its API integrations) or do you plan on dogfooding your platform for a particular killer use case?