
Show HN: ColBERT Build from Sentence Transformers

66 points by raphaelty, over 1 year ago

6 comments

ramoz, over 1 year ago
Anecdote: neural-cherche seems useful as I have analysts creating positive & negative feedback data (basically thumbs-up/down signals) that we will use to fine-tune retrieval models.

Assuming not much effort is required to make this work for similar models? (i.e. BGE)
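(A minimal sketch of what turning thumbs-up/down feedback into training triplets could look like; the `feedback` records and field names below are hypothetical placeholders, not neural-cherche's API.)

```python
from collections import defaultdict

# Hypothetical analyst feedback: each record is a query, a document,
# and a thumbs-up (1) / thumbs-down (0) label.
feedback = [
    {"query": "reset 2fa", "doc": "How to reset two-factor auth", "label": 1},
    {"query": "reset 2fa", "doc": "Pricing and billing overview", "label": 0},
    {"query": "export csv", "doc": "Exporting reports to CSV", "label": 1},
    {"query": "export csv", "doc": "Keyboard shortcuts list", "label": 0},
]

# Bucket documents per query by label, then cross positives with negatives.
positives, negatives = defaultdict(list), defaultdict(list)
for record in feedback:
    bucket = positives if record["label"] else negatives
    bucket[record["query"]].append(record["doc"])

triplets = [
    (query, pos, neg)
    for query in positives
    for pos in positives[query]
    for neg in negatives.get(query, [])
]
print(triplets)  # [(query, positive doc, negative doc), ...]
```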
tinyhouse, over 1 year ago
Looks cool. A couple of questions:

1. Does it support fine-tuning with different losses? For example, one where you don't need to provide negatives and the other examples in the batch are used as negatives.
2. Can you share inference speed info? I know ColBERT should be slow since it creates many embeddings per passage.
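(On question 1: whether neural-cherche itself ships such a loss is for the author to confirm, but the in-batch-negatives setup the question describes is what plain sentence-transformers calls `MultipleNegativesRankingLoss`. A minimal sketch with placeholder pairs:)

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Placeholder (query, positive passage) pairs; no explicit negatives needed.
train_examples = [
    InputExample(texts=["how do I reset 2fa", "How to reset two-factor auth"]),
    InputExample(texts=["export report as csv", "Exporting reports to CSV"]),
]

model = SentenceTransformer("all-MiniLM-L6-v2")
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)

# With this loss, every other positive in the batch acts as a negative
# for a given query, so larger batches give a harder training signal.
train_loss = losses.MultipleNegativesRankingLoss(model)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    warmup_steps=0,
)
```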
kamranjon, over 1 year ago
What sort of high-level, user-facing feature could you build with this?
espadrine, over 1 year ago
I like the inclusion of both positive and negative examples!

Do you have advice for how to measure the quality of the fine-tuning beyond seeing the loss drop?
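(One common answer, sketched below: hold out some (query, relevant document) pairs and track ranking metrics such as recall@k and MRR as you fine-tune. The `score` function is a stand-in for whatever relevance score the model produces, e.g. ColBERT's late-interaction score; nothing here is neural-cherche-specific.)

```python
# `eval_set` is a hypothetical list of (query, relevant_doc) pairs and
# `corpus` a list of candidate documents that contains each relevant_doc.
# `score(query, doc)` stands in for the model's relevance score.

def evaluate(eval_set, corpus, score, k=10):
    hits, reciprocal_ranks = 0, 0.0
    for query, relevant_doc in eval_set:
        ranked = sorted(corpus, key=lambda doc: score(query, doc), reverse=True)
        rank = ranked.index(relevant_doc) + 1  # 1-based rank of the true doc
        hits += rank <= k
        reciprocal_ranks += 1.0 / rank
    n = len(eval_set)
    return {"recall@k": hits / n, "mrr": reciprocal_ranks / n}
```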
barefeg, over 1 year ago
Do you need to have the same number of positives and negatives? Is there any meaning to pairing a positive and a negative in the triplet?
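(For a standard triplet loss, the pairing inside a triplet carries no semantic link between that particular positive and negative; the loss only pushes the positive above the negative for the same query. So counts need not match, since a positive can be reused with several sampled negatives. An illustrative sketch with made-up documents:)

```python
import random

# Made-up data: one query, one known positive, several unrelated negatives.
query = "what is late interaction in ColBERT"
positive = "ColBERT compares query and document token embeddings directly ..."
negatives = [
    "A beginner's guide to sourdough baking ...",
    "Quarterly earnings summary for FY2023 ...",
    "Trail map for the northern ridge hike ...",
]

# The positive is reused across triplets and a negative is sampled for each,
# so the positive/negative counts do not have to be equal.
triplets = [(query, positive, random.choice(negatives)) for _ in range(3)]
```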
vorticalbox, over 1 year ago
Is a negative document one that doesn't match the query?