
Artificial Intelligence for Quantum Computing

66 points by jimminyx 6 months ago

11 comments

Escapado 6 months ago
My master's thesis was on using machine learning techniques to synthesise quantum circuits. Since any operation on a QC can be represented as a unitary matrix, my research topic was: given a set of gates, how many of them, and in what arrangement, could generate or at least approximate a target matrix, using ML. Another aspect was whether, given a unitary matrix, a neural network could predict the number of gates needed to simulate that matrix as a QC, and thereby give us a measure of complexity. It was a lot of fun to test different algorithms, from genetic algorithms to neural network architectures. Back then NNs were a lot smaller and I trained them mostly on one GPU, and since the matrices get exponentially bigger with the number of qubits in the circuit it was only possible for me to investigate small circuits with fewer than a dozen qubits, but it was still nice to see that in principle this worked quite well.
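(A minimal sketch of the idea described above, not the commenter's actual thesis code: search over sequences drawn from a small fixed gate set for one whose product approximates a target single-qubit unitary. The gate set, the random-search strategy, and the fidelity measure are illustrative assumptions.)

```python
import numpy as np

# Fixed single-qubit gate set: Hadamard, T, and S.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
T = np.diag([1, np.exp(1j * np.pi / 4)])
S = np.diag([1, 1j])
GATES = {"H": H, "T": T, "S": S}

def compose(sequence):
    """Multiply the gates in order to get the unitary the sequence implements."""
    U = np.eye(2, dtype=complex)
    for name in sequence:
        U = GATES[name] @ U
    return U

def fidelity(U, V):
    """Global-phase-insensitive overlap between two unitaries (1.0 means identical)."""
    return abs(np.trace(U.conj().T @ V)) / U.shape[0]

def random_search(target, max_len=12, trials=20000, seed=0):
    """Brute-force random search over gate sequences; returns the best one found."""
    rng = np.random.default_rng(seed)
    names = list(GATES)
    best_seq, best_fid = [], fidelity(np.eye(2), target)
    for _ in range(trials):
        seq = list(rng.choice(names, size=rng.integers(1, max_len + 1)))
        fid = fidelity(compose(seq), target)
        if fid > best_fid:
            best_seq, best_fid = seq, fid
    return best_seq, best_fid

# Example: approximate a small X rotation with H/T/S sequences.
theta = 0.3
target = np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                   [-1j * np.sin(theta / 2), np.cos(theta / 2)]])
seq, fid = random_search(target)
print("best sequence:", seq, "fidelity: %.4f" % fid)
```

A genetic algorithm or a trained network, as in the thesis, would replace the blind random search, but the objective (matching a target unitary with as few gates as possible) is the same.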
aithrowawaycomm 6 months ago
Outside of a few odd citations in the intro (which is innocuous) I don't see anything shady or dishonest in this paper after a cursory skimming. But I do worry about 60% of the authors being NVIDIA employees, where the corresponding author is not a staff scientist, but rather a "technical marketing engineer" whose job "focuses on inspiring developers by sharing examples of how NVIDIA quantum technologies can accelerate research."[1] So I am wondering about what articles *didn't* make it to this review, where ANN/GPU-accelerated QC research didn't work very well or was too expensive to justify. It makes me suspect I am being advertised to, and that this paper must be read with an unusually high degree of scrutiny.

Like I said: there's nothing obviously shady or dishonest here. But this is also an arXiv paper, so no need for authors to disclose conflicts of interest. (OpenAI has pioneered the technique of using the arXiv to distribute academic infomercials with zero scientific value.) And I worry about the distortion this stuff has on honest scientific work. Bell Labs was always an independent subsidiary: Bell's marketing staff did not take ownership of Bell Labs' research. The fact that NVIDIA's marketer has a PhD and an "engineer" title doesn't actually mitigate any of this.

[1] https://developer.nvidia.com/blog/author/mawolf/
dr_dshiv 6 months ago
The most remarkable thing is using frontier models to write and then run code on quantum computers (eg IBM’s). It’s amazing — you can even teach with a “creative coding” approach (without the multiple courses in quantum physics).

It’s not even the code that is the hardest part. In general, it is very difficult to frame a given problem domain (ie, any other field of science or optimization problem) as a problem addressable with a quantum computer. Quantum computer scientists have not made this easy—with LLMs, it still isn’t trivial, but it is a huge leap in accessibility.

Warning: LLMs hallucinate a ton within this area. Many things can go wrong. But the fact that sometimes they are correct is amazing to me.

We’ve run some studies showing the importance of expertise in this area: participants with quantum background and coding skills were much more effective at solving quantum problems with LLMs than novices, for instance.
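(For context, a minimal sketch of the kind of circuit code being described. The comment mentions IBM's machines but names no specific SDK, so Qiskit and a local statevector simulation are assumptions here; the circuit prepares a Bell state rather than any particular research workload.)

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT from 0 to 1.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# Simulate locally (no hardware access needed) and inspect the outcome probabilities.
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expected: roughly {'00': 0.5, '11': 0.5}
```

The hard part the comment points to is not this boilerplate but deciding how to cast a real problem into such a circuit in the first place.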
medo-bear 6 months ago
My next paper will be titled:

Artificial Intelligence for Quantum Computing with Applications to Blockchain ... in Rust
bastloing 6 months ago
Throw in some crypto and you've got a recipe for some great word salad!
JanisErdmanis 6 months ago
One thing that comes to mind when looking through the examples is the fragility of existing implementation processes for quantum computers. Pulse optimisation, frequency allocation to minimise cross-talk between qubits, and the development of custom error correction codes tailored for specific quantum computers, circuits, or computations are unpredictable in their successes and, hence, inherently unscalable. It looks like the primary use for AI here is to optimise benchmark results for publishing rather than to enable practical quantum computations.
itchyjunk 6 months ago
Petition to add AI to HN so it lists out the gist of papers and articles posted xD. I also always vaguely thought AI would overlap with quantum annealing in some way, because both are optimization processes. But maybe this is where QC helps AI and not the other way around.
vivzkestrel 6 months ago
Me and my friend used to joke about how to get VC funding a few years ago, and at this rate it's about to come true soon.

VC: what are you building?
Me: Decentralized deep learning powered vertical farming using quantum computing on the blockchain, for onions.
mg 6 months ago
Is there a metric which lets us track how far/close we are to doing something useful with quantum computers?

And when we look at that metric and extrapolate its progression over the last few years, when do we expect QCs to hit datacenters?
Jabbs 6 months ago
Aliens for Time Travel (is how I read this)
hulitu 6 months ago
> Artificial Intelligence for Quantum Computing

They forgot the blockchain. /s