TechEcho

Show HN: ColBERT Build from Sentence Transformers

66 points by raphaelty over 1 year ago

6 comments

ramoz over 1 year ago
Anecdote: neural-cherche seems useful as I have analysts creating positive & negative feedback data (basically thumbs-up/down signals) that we will use to fine-tune retrieval models.

Assuming not much effort is required to make this work for similar models? (i.e. BGE)
Comment #38319521 not loaded
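The thumbs-up/down signals described above map naturally onto the (query, positive, negative) triplets that ColBERT-style fine-tuning expects. A minimal sketch in plain Python, assuming a hypothetical feedback-record format of `(query, document, thumbs_up)` tuples (not neural-cherche's actual API):

```python
from collections import defaultdict

def feedback_to_triplets(records):
    """Group hypothetical (query, document, thumbs_up) feedback rows
    into (query, positive, negative) training triplets by pairing
    every thumbs-up document with every thumbs-down document
    for the same query."""
    by_query = defaultdict(lambda: ([], []))  # query -> (positives, negatives)
    for query, doc, thumbs_up in records:
        by_query[query][0 if thumbs_up else 1].append(doc)
    triplets = []
    for query, (positives, negatives) in by_query.items():
        for pos in positives:
            for neg in negatives:
                triplets.append((query, pos, neg))
    return triplets
```

Whether this format transfers directly to similar bi-encoder models such as BGE depends on the trainer's expected input, but the triplet structure itself is model-agnostic.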
tinyhouse over 1 year ago
Looks cool. A couple of questions:

1. Does it support fine-tuning with different losses? For example, where you don't need to provide negatives and it uses the other examples in the batch as negatives.
2. Can you share inference speed info? I know that ColBERT should be slow since it creates many embeddings per passage.
Comment #38318703 not loaded
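Question 1 above describes in-batch negatives: each query's positive is scored against every other positive in the batch, which act as implicit negatives, so no explicit negatives are required. A minimal PyTorch sketch of that loss (the function name, dimensions, and temperature value are illustrative, not the library's API):

```python
import torch
import torch.nn.functional as F

def in_batch_negatives_loss(query_emb, doc_emb, temperature=0.05):
    """Contrastive loss with in-batch negatives.

    query_emb, doc_emb: (batch, dim) tensors where doc_emb[i] is the
    positive for query_emb[i]; all other rows serve as negatives.
    """
    query_emb = F.normalize(query_emb, dim=-1)
    doc_emb = F.normalize(doc_emb, dim=-1)
    scores = query_emb @ doc_emb.T            # (batch, batch) similarity matrix
    labels = torch.arange(scores.size(0))     # diagonal entries are positives
    return F.cross_entropy(scores / temperature, labels)
```

On question 2: ColBERT's late interaction stores one embedding per token rather than one per passage, which is exactly why indexing and scoring cost more than with single-vector models.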
kamranjon over 1 year ago
What sort of high-level, user-facing feature could you build with this?
Comment #38323235 not loaded
espadrine over 1 year ago
I like the inclusion of both positive and negative examples!

Do you have advice for how to measure the quality of the fine-tuning beyond seeing the loss drop?
Comment #38318734 not loaded
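One common answer to the question above is to track a ranking metric such as recall@k on a held-out query set before and after fine-tuning, rather than relying on the loss alone. A minimal sketch, where the ranked-results format is a hypothetical list of document IDs per query:

```python
def recall_at_k(ranked_ids, relevant_ids, k=10):
    """Fraction of relevant documents that appear in the top-k results."""
    if not relevant_ids:
        return 0.0
    hits = sum(1 for doc_id in ranked_ids[:k] if doc_id in relevant_ids)
    return hits / len(relevant_ids)

def mean_recall_at_k(runs, k=10):
    """Average recall@k over (ranked_ids, relevant_ids) pairs,
    one pair per held-out query."""
    scores = [recall_at_k(ranked, relevant, k) for ranked, relevant in runs]
    return sum(scores) / len(scores)
```

Comparing this number on the same held-out queries before and after fine-tuning gives a direct read on retrieval quality, independent of the training loss.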
barefeg over 1 year ago
Do you need to have the same number of positives and negatives? Is there any meaning to pairing a positive and a negative in the triplet?
Comment #38319541 not loaded
vorticalbox over 1 year ago
Is a negative document one that doesn't match the query?
Comment #38318887 not loaded
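Typically yes: a negative is a document that should rank below the positive for the given query, whether it is plainly irrelevant or a "hard" negative that merely looks relevant. An illustrative triplet (the texts here are invented for the example, not from any dataset):

```python
# A (query, positive, negative) triplet: the positive answers the query,
# the negative does not. Contents are invented for illustration.
triplet = (
    "how does colbert score documents",                        # query
    "ColBERT scores a passage by matching each query token "
    "against its most similar passage token embedding.",       # positive
    "The restaurant opens at noon on weekdays.",               # negative
)
query, positive, negative = triplet
```

The pairing itself usually carries no special meaning; trainers commonly form triplets by sampling negatives per positive, so the counts need not match.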