
Machine Learning using Quantum Algorithms

72 points, posted by vikram360 over 13 years ago

7 comments

esk, over 13 years ago
Could someone please give a layman's explanation for this bombshell?

> Assume I hide a ball in a cabinet with a million drawers. How many drawers do you have to open to find the ball? Sometimes you may get lucky and find the ball in the first few drawers, but at other times you have to inspect almost all of them. So on average it will take you 500,000 peeks to find the ball. *Now a quantum computer can perform such a search looking only into 1000 drawers.* This mind-boggling feat is known as Grover's algorithm.

If 999,000 drawers are left unopened, how can the algorithm guarantee that the ball will be found?
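The arithmetic behind the quoted claim: a classical search of N = 1,000,000 drawers takes about N/2 = 500,000 peeks on average, while Grover's algorithm needs roughly (π/4)·√N ≈ 785 quantum queries, which is where the "about 1000 drawers" figure comes from. A minimal back-of-the-envelope sketch of that calculation in Python:

    import math

    N = 1_000_000  # drawers in the cabinet

    # Classical search: on average you open about half the drawers.
    classical_avg = N / 2

    # Grover search: roughly (pi/4) * sqrt(N) quantum queries find the
    # marked item with high probability.
    grover_queries = math.ceil(math.pi / 4 * math.sqrt(N))

    print(f"classical average peeks: {classical_avg:.0f}")  # 500000
    print(f"Grover queries (approx): {grover_queries}")     # 786

As for the "guarantee": the algorithm does not open 1000 specific drawers and ignore the rest. Each quantum query acts on a superposition over all drawers, and after about √N iterations a measurement returns the marked drawer with high probability, not with certainty.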
benhamner, over 13 years ago
This is an old article (from 2009). Hartmut Neven provided an update at ICML 2011 in the latter part of his keynote talk: http://techtalks.tv/talks/54457/
Wilya, over 13 years ago
Non-JS-heavy version, for those who need/want it: http://googleresearch.blogspot.com/2009/12/machine-learning-with-quantum.html?v=0
temphn, over 13 years ago
Someone should modify the post to note that this is from Dec. 2009. Not sure what to think here. Google is endorsing, but IEEE is slamming. D-Wave actually has a reasonable-looking dev kit on this page, but Scott Aaronson is quite critical.

http://spectrum.ieee.org/computing/hardware/loser-dwave-does-not-quantum-compute

http://www.dwavesys.com/en/dev-tutorials.html

http://www.scottaaronson.com/blog/?p=306

http://www.scottaaronson.com/blog/?p=291

Would be nice if a real expert weighed in on this thread.
rfurlan, over 13 years ago
D-Wave's computer is basically a hardware solver for Ising/QUBO models. It is not programmable in the traditional sense, and you need to find a way to express the problem you want to solve so that it maps well onto the hardware.
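Concretely, a QUBO asks for a binary vector x that minimizes x^T Q x; the modelling work described above is choosing Q so that low-energy assignments encode answers to your problem. A minimal sketch in Python, where the toy "pick exactly one of three items" instance and the brute-force solver are purely illustrative assumptions, not D-Wave's actual toolchain:

    import itertools

    # Toy QUBO: minimize x^T Q x over binary x.
    # This Q encodes "pick exactly one of three items" via the penalty
    # (x0 + x1 + x2 - 1)^2, expanded into QUBO coefficients
    # (the constant term is dropped).
    Q = {
        (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # linear terms (diagonal)
        (0, 1): 2.0, (0, 2): 2.0, (1, 2): 2.0,     # pairwise penalties
    }

    def qubo_energy(x, Q):
        """Energy of binary assignment x under QUBO coefficients Q."""
        return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

    # Brute force over all 2^3 assignments; an annealer would instead
    # sample low-energy states of the equivalent Ising model.
    best = min(itertools.product([0, 1], repeat=3), key=lambda x: qubo_energy(x, Q))
    print(best, qubo_energy(best, Q))  # (0, 0, 1) with energy -1.0

On real annealing hardware the same Q would additionally have to be embedded into the chip's limited qubit connectivity, which is part of what makes the mapping non-trivial.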
zeratul, over 13 years ago
In my line of work I see RAM limitations rather than CPU clock-speed limitations. Will this technology also solve the big-data problem?
viscanti, over 13 years ago
Quantum algorithms (run on quantum computers) are the future for machine learning. This article was from 2009, though, and I haven't seen a whole lot of progress in the field since. Artificial neural networks that can model every possible value of every node at the same time would be very powerful.