Google is improving searches by using BERT and understanding language context

12 points by bratao over 5 years ago

2 comments

rkagerer over 5 years ago
<i>The way BERT recognizes that it should pay attention to those words is basically by self-learning on a titanic game of Mad Libs. Google takes a corpus of English sentences and randomly removes 15 percent of the words, then BERT is set to the task of figuring out what those words ought to be. Over time, that kind of training turns out to be remarkably effective at making a NLP model “understand” context</i><p>Great example of &quot;explain it to your grandpa&quot; ability.
nailer over 5 years ago
> In one example Google discussed at a briefing with journalists yesterday, its search algorithm was able to parse the meaning of the following phrase: "Can you get medicine for someone pharmacy?"

Does that phrase have meaning?