
Tighter bounds on the expressivity of transformer encoders

76 points · by bmc7505 · about 2 years ago

3 comments

sharemywin · about 2 years ago
"We connect and strengthen these results by identifying a variant of first-order logic with counting quantifiers that is simultaneously an upper bound for fixed-precision transformer encoders and a lower bound for transformer encoders."

Not sure what a fixed-precision transformer is?
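One way to picture the "fixed precision" in the question above: every intermediate activation is rounded to a fixed number of bits, so the network can only ever produce finitely many distinct values, which is roughly what lets a logic with counting quantifiers bound its expressivity. Below is a minimal NumPy sketch of this idea; the `quantize` scheme, the clipping range, and the bit-width are illustrative assumptions, not the paper's exact model:

```python
import numpy as np

def quantize(x, bits=4):
    # Illustrative quantizer (not from the paper): clip to [-1, 1],
    # then round to the nearest of 2**bits evenly spaced levels.
    levels = 2 ** bits - 1
    x = np.clip(x, -1.0, 1.0)
    return np.round((x + 1) / 2 * levels) / levels * 2 - 1

def fixed_precision_attention(Q, K, V, bits=4):
    # Standard scaled dot-product attention, but with every
    # intermediate quantity rounded to the fixed-bit grid.
    d = Q.shape[-1]
    scores = quantize(Q @ K.T / np.sqrt(d), bits)
    weights = np.exp(scores)
    # After quantization the weights need not sum exactly to 1;
    # a sketch like this just accepts that rounding error.
    weights = quantize(weights / weights.sum(-1, keepdims=True), bits)
    return quantize(weights @ V, bits)
```

Because every value the layer can emit lies on a finite grid, the map computed by a stack of such layers is a function over a finite value domain, which is the kind of object the logical upper bounds in the paper can describe.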
meltyness · about 2 years ago
Does this mean that transformers are most akin to rule-based expert systems with opaque (currently) statistically-devised rules?
transformi · about 2 years ago
Is it possible to solve NP-hard problems with transformers/LLMs?