科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


Dynamic Examples LLM optimization (2023)

1 point | by ashiban | over 1 year ago

1 comment

ashiban | over 1 year ago
TLDR:

"The primary reason to introduce a new example is when the LLM is incorrectly identifying technology as ABSTRACT vs. not, missing connections, or even missing resources entirely. However, we don't have unlimited tokens for every example we might need. An optimization we came up with here is Dynamic Examples using pre-filtering. Instead of providing examples that would generalize to everything, we focus the examples on what we can guess is in the queries.

We extract a list of technologies using word lists, which is easier than extracting their intents, and if we don't find many matches, we assume that more ABSTRACT resources are present. Once extracted, we can then create a custom prompt by selecting examples specific to the technologies mentioned in the query, plus a set of bedrock examples including baseline rules for the different actions that expand language understanding."
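The pre-filtering idea in the comment can be sketched in a few lines. This is a minimal, hypothetical illustration, not the post author's code: the example tables (`BEDROCK_EXAMPLES`, `TECH_EXAMPLES`, `ABSTRACT_EXAMPLES`), the word-list matcher, and the `min_matches` threshold are all assumptions made for the sake of the sketch.

```python
# Baseline ("bedrock") examples that are always included: rules for the
# different actions that expand language understanding.
BEDROCK_EXAMPLES = [
    "Q: migrate our service -> ACTION: migrate, RESOURCE: ABSTRACT",
    "Q: set up monitoring -> ACTION: configure, RESOURCE: ABSTRACT",
]

# Per-technology examples, keyed by the word-list terms used for matching.
# (Hypothetical entries for illustration only.)
TECH_EXAMPLES = {
    "postgres": ["Q: tune postgres indexes -> RESOURCE: postgres"],
    "kafka": ["Q: kafka consumer lag is growing -> RESOURCE: kafka"],
    "react": ["Q: react state not updating -> RESOURCE: react"],
}

# Fallback examples used when few concrete technologies are detected,
# on the assumption that more ABSTRACT resources are present.
ABSTRACT_EXAMPLES = [
    "Q: speed up the pipeline -> RESOURCE: ABSTRACT",
]


def extract_technologies(query: str) -> list[str]:
    """Cheap word-list match -- easier than extracting intents."""
    words = set(query.lower().split())
    return [tech for tech in TECH_EXAMPLES if tech in words]


def build_prompt(query: str, min_matches: int = 1) -> str:
    """Assemble a custom prompt: bedrock examples, plus examples specific
    to the technologies found in the query, plus ABSTRACT examples when
    too few technologies matched."""
    techs = extract_technologies(query)
    examples = list(BEDROCK_EXAMPLES)
    for tech in techs:
        examples.extend(TECH_EXAMPLES[tech])
    if len(techs) < min_matches:
        examples.extend(ABSTRACT_EXAMPLES)
    return "\n".join(examples) + f"\nQ: {query} ->"


print(build_prompt("why is kafka consumer lag growing"))
```

The token savings come from the `for` loop: instead of shipping every technology's examples on every call, only the examples matching the query's word list are spent from the prompt budget.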