Ask HN: If I were to build an LLM app today...

3 points by thatxliner over 1 year ago
There are so many libraries and frameworks out there, from LangChain to Ollama. While I could just use OpenAI's Assistants API, I want to compose my LLM responses in a way that's unique to my app; I don't want to build yet another thin wrapper around ChatGPT, where the only selling point is the proprietary prompt behind the app. Additionally, I would like to be able to switch to a different LLM provider (e.g., in case OpenAI fails on us or local LLMs have finally caught up to the same quality).

What are your recommendations for AI libraries? Tell me about your experience.

4 comments

bbsz over 1 year ago
There's no reason to think you can't craft a unique "prompt flow" without any of the mentioned libraries. The standard OpenAI library, "requests", and a long think about what exactly you need from prompt engineering in your app is time better spent than learning the high-level abstractions and minutiae of pretty big libraries. Remember that those will also obscure a lot of useful knowledge from you in the process, and prompt engineering today is not a big enough field to benefit from them (whereas I wouldn't want to build neural networks without helper libraries).

My experience with building around LLMs is that before you reach the prompt engineering phase and begin to craft fancy chain-of-reasoning flows over your data (*if you don't discover along the way that there are easier ways than an LLM*), you'll be pulling your hair out for days just trying to define *what data, and in what format*, to feed into the prompt so the output even falls within your base expectations.
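As a concrete illustration of the "just call the API directly" approach (not from the comment itself; the model name, system prompt, and sample record below are made up for this sketch, assuming the official openai Python package, v1-style client):

```python
import json
from openai import OpenAI  # pip install openai (v1+ client)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The hard part described above: deciding what data goes into the prompt and in what shape.
record = {"title": "Q3 report", "revenue": 1.2e6, "notes": ["churn up", "costs flat"]}

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You summarize structured business records."},
        {"role": "user", "content": f"Summarize this record:\n{json.dumps(record, indent=2)}"},
    ],
)
print(response.choices[0].message.content)
```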
meiraleal over 1 year ago
The OpenAI API is becoming the default API. In my opinion it's easier to create a wrapper for each LLM than to use LangChain, which is an unnecessary abstraction. The software around those wrappers is what matters; using LangChain wouldn't change that.
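A minimal sketch of what such per-provider wrappers might look like, assuming the official openai v1 Python client and a local OpenAI-compatible server (Ollama's default /v1 endpoint is assumed here); names like LLMClient and summarize are hypothetical, not from the comment:

```python
from typing import Protocol

from openai import OpenAI


class LLMClient(Protocol):
    """The only interface the rest of the app depends on."""

    def complete(self, prompt: str) -> str: ...


class OpenAIChat:
    def __init__(self, model: str = "gpt-4o-mini"):  # placeholder model name
        self.client = OpenAI()  # reads OPENAI_API_KEY from the environment
        self.model = model

    def complete(self, prompt: str) -> str:
        resp = self.client.chat.completions.create(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content or ""


class LocalChat:
    """Any OpenAI-compatible server (e.g. Ollama's /v1 endpoint) can reuse the same client."""

    def __init__(self, model: str = "llama3", base_url: str = "http://localhost:11434/v1"):
        self.client = OpenAI(base_url=base_url, api_key="unused")  # key is ignored locally
        self.model = model

    def complete(self, prompt: str) -> str:
        resp = self.client.chat.completions.create(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content or ""


def summarize(llm: LLMClient, text: str) -> str:
    # App logic depends only on LLMClient, so providers can be swapped without touching it.
    return llm.complete(f"Summarize in two sentences:\n\n{text}")
```

Swapping providers then means constructing a different wrapper (`summarize(OpenAIChat(), text)` vs. `summarize(LocalChat(), text)`) while the surrounding software stays the same.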
lgrammel over 1 year ago
In case you're using JS/TS, I've been working on an abstraction layer for integrating AI models into applications, unifying the API for common operations such as text streaming, object generation, and tool usage: https://github.com/lgrammel/modelfusion
altdataseller over 1 year ago
Use the tool that best solves your problem. What's your problem? I would focus on that first.