TechEcho
From models of galaxies to atoms, simple AI shortcuts speed up simulations

85 points by DarkContinent over 5 years ago

9 comments

dukoid over 5 years ago
For some reason this reminds me of the famous Xerox copier where the compression algorithm would swap out digits: https://news.ycombinator.com/item?id=6156238
choeger over 5 years ago
I doubt it.

Simulations based on PDEs, ODEs, or DAEs have a particular mathematical foundation. Within the bounds of their solvers, they deliver precise results and can actually *forecast* physical behavior.

If such an "emulator" is much better in many cases but completely wrong in just one, it is basically useless, as presumably the verification of a solution takes as long as a classical simulation.
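The worst-case concern above can be made concrete with a toy sketch (entirely illustrative; the `simulate`/`emulate` names and the polynomial stand-in are my own assumptions, not anything from the article): an emulator fitted to solver output can look excellent on average yet be badly wrong outside the regime it was fitted on.

```python
import numpy as np

# Reference "simulation": exact solution of a damped oscillator
def simulate(t):
    return np.exp(-0.2 * t) * np.cos(t)

# Cheap "emulator": a polynomial fitted to solver output on [0, 10]
t_train = np.linspace(0, 10, 200)
coeffs = np.polyfit(t_train, simulate(t_train), deg=9)
emulate = np.poly1d(coeffs)

# Inside the training range, the average error looks excellent...
t_in = np.linspace(0, 10, 1000)
err_in = np.abs(emulate(t_in) - simulate(t_in))

# ...but just outside it, the emulator's predictions fall apart.
t_out = np.linspace(10, 12, 100)
err_out = np.abs(emulate(t_out) - simulate(t_out))

print(f"mean error in-range:    {err_in.mean():.2e}")
print(f"max error out-of-range: {err_out.max():.2e}")
```

And, as the comment notes, the only way to know which regime you are in is to run the expensive simulation you were trying to avoid.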
Fomite over 5 years ago
Interestingly, my lab has been working on emulators for one of our simulation models, and we're *really* struggling to make meaningful improvements.

It's faster, but we're not there yet on accuracy.
fxtentacle over 5 years ago
"When they were turbocharged with specialized graphical processing chips, they were between about 100,000 and 2 billion times faster than their simulations."

Now the critical question is: how much faster is it without AI, just because of the specialized dedicated processing chips?

Otherwise, they might be comparing a single virtualized CPU core against a high-end GPU for things like matrix multiplication ... and then the result that GPU > slow CPU isn't really that impressive.
willis936 over 5 years ago
I was at a talk last week where the speaker spent some time on using machine learning with a regression matrix trained on the results of a simulation. The simulation and the variables in the regression matrix were chosen so that the AI could recreate an approximation of a known physical law. This is fairly exciting to me, because if it were used to recreate many of the laws in this field, it could then be applied to experimental data to untangle some of the mess and identify the relationships for us. I could see this speeding along the development of science.
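A minimal version of that idea (a hypothetical toy of my own, not the speaker's setup): generate "simulation" data for a pendulum and let a log-log linear regression recover the known law T = 2π·sqrt(L/g), exponent and prefactor included.

```python
import numpy as np

# Stand-in for simulation output: pendulum periods at various lengths
g = 9.81
L = np.linspace(0.1, 2.0, 50)
T = 2 * np.pi * np.sqrt(L / g)

# Linear regression in log-log space: log T = a*log L + b
a, b = np.polyfit(np.log(L), np.log(T), deg=1)

print(f"recovered exponent:  {a:.3f}")                 # ~0.5, i.e. T ∝ sqrt(L)
print(f"recovered prefactor: {np.exp(b):.3f} "
      f"(expected {2 * np.pi / np.sqrt(g):.3f})")
```

On noisy experimental data the fit would only approximate the exponent, but the same mechanics apply: the regression surfaces the functional relationship for a human to interpret.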
aimoderate over 5 years ago
> It randomly inserts layers of computation between the networks' input and output, and tests and trains the resulting wiring with the limited data. If an added layer enhances performance, it's more likely to be included in future variations.

Sounds a lot like genetic algorithms, but with neural networks. I suspect we'll see more of this as people figure out how to run the search over neural network architectures that fit their own domains. Convolutions and transformers are great and all, but we might as well let the computers do the search and optimization too instead of waiting on human insight for stacking functions.
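A rough sketch of that insert-a-layer-and-keep-it-if-it-helps loop (my own toy construction; random fixed hidden weights with a least-squares readout stand in for full training, which the article does not specify):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target function the candidate networks must emulate
X = rng.uniform(-1, 1, size=(400, 2))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])
X_tr, X_va, y_tr, y_va = X[:300], X[300:], y[:300], y[300:]

def forward(x, layers):
    """Push x through fixed random tanh layers."""
    h = x
    for W, b in layers:
        h = np.tanh(h @ W + b)
    return h

def evaluate(widths):
    """Score an architecture: random hidden weights, least-squares readout."""
    layers, d = [], X.shape[1]
    for w in widths:
        layers.append((rng.normal(0, 1, (d, w)), rng.normal(0, 1, w)))
        d = w
    H_tr, H_va = forward(X_tr, layers), forward(X_va, layers)
    readout, *_ = np.linalg.lstsq(H_tr, y_tr, rcond=None)
    return np.mean((H_va @ readout - y_va) ** 2)

# Evolutionary loop: randomly insert a layer; keep the mutation only if
# validation error improves — layers that help survive into later variants.
best, best_err = [16], evaluate([16])
for _ in range(30):
    cand = best.copy()
    cand.insert(rng.integers(0, len(cand) + 1), int(rng.integers(8, 33)))
    err = evaluate(cand)
    if err < best_err:
        best, best_err = cand, err

print("best architecture:", best, "val MSE:", round(best_err, 4))
```

A real system would also mutate widths, remove layers, and actually train the candidates, but the accept-if-better selection pressure is the genetic-algorithm flavor the comment is pointing at.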
joe_the_user over 5 years ago
The underlying paper was previously discussed on HN here:

https://news.ycombinator.com/item?id=22132867

Note: the published paper is titled "Up to 2B times acceleration of scientific simulations with deep neural search", which can raise some hackles, including mine. Doesn't *prove* anything, but still.
RoboTeddy over 5 years ago
Here's a potential way to use adversarial techniques to generate training examples that could improve the accuracy of this approach: https://twitter.com/RoboTeddy/status/1228828411050655744
chewxy over 5 years ago
Who'd have thought compression works so well?

(Yes, neural networks are compression engines.)