Gradient Descent in Activation Space: A Tale of Two Papers

4 points by kofejnik about 2 years ago

1 comment

PaulHoule, about 2 years ago
It seems to me that the examples in the prompt supply context that affects the conditional probability distribution over what gets generated. They are going to activate parts of the network that give better answers. It doesn't seem any more mysterious to me than the fact that you can ask the network "who wrote this?" and have it pick up on the conditional probability distribution of the language in the sample.
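
A minimal sketch of the effect the comment describes: prepending worked examples to a prompt conditions a causal language model and shifts its next-token distribution toward the expected kind of answer. The model name ("gpt2"), the prompts, and the candidate answer token are illustrative assumptions, not anything from the linked papers.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def next_token_prob(prompt: str, continuation: str) -> float:
    """Probability the model assigns to `continuation` as the very next token after `prompt`."""
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
    target_id = tokenizer(continuation, add_special_tokens=False).input_ids[0]
    with torch.no_grad():
        logits = model(prompt_ids).logits[0, -1]  # logits over the next token
    return torch.softmax(logits, dim=-1)[target_id].item()

# Zero-shot: the bare question.
zero_shot = "Q: What is the capital of France?\nA:"

# Few-shot: the same question preceded by worked examples,
# which condition the distribution toward short factual answers.
few_shot = (
    "Q: What is the capital of Spain?\nA: Madrid\n"
    "Q: What is the capital of Italy?\nA: Rome\n"
    "Q: What is the capital of France?\nA:"
)

print("P(' Paris' | zero-shot) =", next_token_prob(zero_shot, " Paris"))
print("P(' Paris' | few-shot)  =", next_token_prob(few_shot, " Paris"))
```

Comparing the two printed probabilities shows the in-context examples changing the conditional distribution without any weight updates, which is the point being made above.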