
Ask HN: How to train a LLM against a knowledge base

3 points | by njsubedi | 10 months ago
I understand that this might be a bit late to ask this question, and I don't know a lot about AI/ML in general. I have to train/tune a pre-trained model for a specific context. By context, I mean a knowledge base, a product's documentation, a user manual, or in my case, an inventory of electronic items in our warehouse.

I tried dumping the inventory information and basic BOM content as a system message for the GPT-4o model using their platform playground, and asking questions like "we are making 1000 power banks this month, so which components should I pre-order so that we don't run out of them?". This works as expected, but it took me a while to realize that each query used more than 30K tokens! That's a quick way to lose money.

I am looking for a solution from someone who has trained/tuned a decent LLM on custom data, and I'm pretty sure a lot of other small business owners are looking for something like this. Thank you!
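To see why stuffing the whole inventory into every query adds up, here's a quick back-of-the-envelope calculation. The per-token price and query volume below are hypothetical placeholders, not real quotes; only the 30K-tokens-per-query figure comes from the post:

```python
# Back-of-the-envelope cost of sending the full knowledge base with every query.
TOKENS_PER_QUERY = 30_000    # observed context size per question (from the post)
PRICE_PER_M_INPUT = 5.00     # hypothetical $ per 1M input tokens (placeholder)
QUERIES_PER_MONTH = 2_000    # hypothetical volume, e.g. ~100 questions/day

cost_per_query = TOKENS_PER_QUERY / 1_000_000 * PRICE_PER_M_INPUT
monthly_cost = cost_per_query * QUERIES_PER_MONTH
print(f"${cost_per_query:.3f} per query, ${monthly_cost:.2f} per month")
```

Under those assumptions the context alone costs $0.15 per query, so the bill scales linearly with how often people ask, which is exactly the problem retrieval or fine-tuning is meant to avoid.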

1 comment

PaulHoule | 10 months ago
I do most of my work so far with BERT models, but if I was trying to fine-tune a generative model I think I'd try a T5 model.

https://huggingface.co/docs/transformers/en/model_doc/t5

https://medium.com/nlplanet/a-full-guide-to-finetuning-t5-for-text2text-and-building-a-demo-with-streamlit-c72009631887

Specifically, you can show a T5 model input and output texts and it will try to learn the transformation between them. People tell me T5 models are relatively easy to train and they perform well on many tasks.

Note another approach to your problem is RAG:

https://www.promptingguide.ai/techniques/rag

If you have some specific documentation on your topic you could use the embedding to find some text that is relevant to the query. In fact this stacks great with the fine-tuning, because you could train the model to, given a question and a relevant document, give an answer. T5 is good at that kind of basically summarization task.
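The retrieval step of RAG described above can be sketched without heavy dependencies. Below, a toy bag-of-words vector stands in for a real embedding (in practice you'd use a sentence-embedding model, but the retrieve-then-prompt flow is the same), and the inventory chunks are made-up examples:

```python
import math
import re
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words Counter over lowercase tokens.
    A real system would call a sentence-embedding model here; the
    retrieval logic below is unchanged either way."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical knowledge base: one chunk per inventory line / doc section.
chunks = [
    "power bank BOM: battery cell 18650, charging IC TP4056, USB-C port",
    "warehouse stock: 1200 battery cells, 300 TP4056 charging ICs",
    "return policy: items can be returned within 30 days of purchase",
]

def retrieve(query, k=2):
    """Rank chunks by similarity to the query and keep the top k."""
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

# Only the top-k relevant chunks go into the prompt, not the whole KB,
# which is what keeps the per-query token count small.
top = retrieve("how many TP4056 charging ICs do we have in stock?")
prompt = "Answer using this context:\n" + "\n".join(top)
```

This is also where the fine-tuning idea stacks on top: the (question, retrieved chunk, answer) triples this produces are exactly the kind of input/output text pairs a T5 model can be trained on.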