Ask HN: How should I burn $8k for Google Gemini 1.5 Pro?

12 points by hoerzu 11 months ago
Was looking into applying LLMs to entity extraction in earnings calls (around 400 MB of text data). 1 million tokens is around $1-3.

Open for ideas and experiments.
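A quick back-of-envelope check on those numbers (a rough sketch only; the ~4 characters per token figure is a rule of thumb and varies by tokenizer):

    # Rough cost estimate built from the figures quoted in the post.
    corpus_bytes = 400 * 1024 * 1024          # ~400 MB of transcript text
    chars_per_token = 4                       # rule-of-thumb assumption, varies by tokenizer
    tokens = corpus_bytes / chars_per_token   # ~105M tokens
    price_per_million = 3.0                   # upper end of the $1-3 per 1M tokens quoted above
    print(f"one full pass: ~${tokens / 1e6 * price_per_million:,.0f}")  # roughly $300

By these figures a single pass over the corpus costs a few hundred dollars, not $8k, so the budget mainly matters if you plan many iterations or long prompts per chunk.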

7 comments

reneberlin 11 months ago
Just be careful with the "temperature" when dealing with numbers and financial data as well. You should check samples of the retrieved data by hand.

Setting the temperature lower than 0.9 means reducing the "creativity" and making it less prone to hallucinate.
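For concreteness, a minimal sketch of that with the google-generativeai Python client (the prompt, the chunk variable, and the temperature value are illustrative assumptions, not a recommendation):

    import google.generativeai as genai

    genai.configure(api_key="YOUR_API_KEY")
    model = genai.GenerativeModel("gemini-1.5-pro")

    chunk = "..."  # one transcript excerpt
    # A low temperature favors deterministic, literal extraction over "creative" output.
    response = model.generate_content(
        "Extract every reported revenue figure from this transcript:\n" + chunk,
        generation_config={"temperature": 0.1},
    )
    print(response.text)

As the comment says, spot-check the extracted numbers by hand regardless of the temperature setting.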
reneberlin 11 months ago
That's not how you do it. You use a vector database and a retriever. This way, not all tokens of a document are sent with the prompt; only the relevant parts end up in the conversation / prompts. This way you save a lot of money and you are NOT limited to using GoogleAI - you can use whatever AI you want.
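A minimal retrieve-then-prompt sketch of that pattern, using chromadb as a stand-in for whatever vector database and model you pick (the collection name, chunks, query, and the call_llm placeholder are hypothetical):

    import chromadb

    client = chromadb.Client()
    collection = client.create_collection("earnings-calls")

    # Index the transcripts once, split into small passages.
    chunks = ["...passage 1...", "...passage 2..."]
    collection.add(documents=chunks, ids=[f"chunk-{i}" for i in range(len(chunks))])

    # At query time only the top-k relevant passages go into the prompt,
    # instead of the whole 400 MB corpus.
    hits = collection.query(query_texts=["Q3 revenue guidance"], n_results=5)
    context = "\n\n".join(hits["documents"][0])
    prompt = f"Using only the context below, extract the financial entities.\n\n{context}"
    # call_llm(prompt)  # placeholder for whichever provider you end up using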
reneberlin 11 months ago
Use Flowise and a vector DB like Upstash or Pinecone.

Flowise tutorial: https://www.youtube.com/watch?v=V7uBy3VQJAc
langcss 11 months ago
Can't you pre-filter it with a vector lookup or something cheap first (GPT-3.5?)? Or even simply filter first by other things you are considering for the trading decision (price-to-earnings, for example). That might save you a lot of cost.
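A rough sketch of that kind of cheap pre-filter (the keyword list and the price-to-earnings cutoff are illustrative assumptions; a vector lookup or a cheap model could replace the keyword pass):

    transcript = "..."  # full text of one earnings call
    KEYWORDS = ("revenue", "guidance", "margin", "eps")

    def cheap_prefilter(paragraphs, company, max_pe=25.0):
        # Stage 1: skip companies that fail the screening criterion (e.g. price-to-earnings).
        if company.get("pe_ratio", float("inf")) > max_pe:
            return []
        # Stage 2: keep only paragraphs that mention terms relevant to the extraction task.
        return [p for p in paragraphs if any(k in p.lower() for k in KEYWORDS)]

    # Only the surviving paragraphs are sent to the expensive model.
    candidates = cheap_prefilter(transcript.split("\n\n"), {"pe_ratio": 18.4})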
BOOSTERHIDROGEN 11 months ago
Could someone from a hedge fund kindly share their current technology stack, if they utilize LLMs?
dissahc 11 months ago
Are you sure you need a powerful model for this? llama3-8b is at least 10 times cheaper and might suffice for something like this.
infecto 11 months ago
What’s the goal?