
Ideas in statistics that have powered AI

204 points by MAXPOOL · almost 4 years ago

10 comments

317070 · almost 4 years ago
> Generative adversarial networks, or GANs, are a conceptual advance that allow reinforcement learning problems to be solved automatically. They mark a step toward the longstanding goal of artificial general intelligence while also harnessing the power of parallel processing so that a program can train itself by playing millions of games against itself. At a conceptual level, GANs link prediction with generative models.

What? Every sentence here is so wrong I have a hard time seeing what kind of misunderstanding would lead to this.

GANs are a conceptual advance in generative models (i.e. models that can generate more, similar data). Reinforcement learning is a separate field. Parallel processing is ubiquitous and has nothing to do with GANs or reinforcement learning (both are usually pretty parallelized). Self-play sounds like they wanted to talk about the AlphaGo/AlphaZero papers? And GANs are infamously not really predictive/discriminative. If anything, they thoroughly disconnected prediction from generative models.
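The commenter's correction can be made concrete: in a GAN, a generator and a discriminator play a minimax game over the value V(D, G) = E_x[log D(x)] + E_z[log(1 - D(G(z)))], with no reinforcement learning involved. Below is a minimal numpy sketch on toy 1-D data; the `generator` and `discriminator` here are made-up one-parameter stand-ins for illustration, not trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D setup: real data ~ N(4, 1); the generator shifts noise z ~ N(0, 1).
def generator(z, shift):
    return z + shift

def discriminator(x, threshold):
    # A crude "probability that x is real": a sigmoid around a threshold.
    return 1.0 / (1.0 + np.exp(-(x - threshold)))

def gan_value(shift, threshold, n=100_000):
    """Monte Carlo estimate of V(D, G) = E_x[log D(x)] + E_z[log(1 - D(G(z)))]."""
    real = rng.normal(4.0, 1.0, n)
    fake = generator(rng.normal(0.0, 1.0, n), shift)
    return (np.mean(np.log(discriminator(real, threshold)))
            + np.mean(np.log(1.0 - discriminator(fake, threshold))))

# The discriminator wants V large, the generator wants it small.
print(gan_value(shift=0.0, threshold=2.0))  # generator far off: discriminator separates easily
print(gan_value(shift=4.0, threshold=2.0))  # generator matches the data: V drops sharply
```

When the generated samples match the real distribution, no threshold can tell them apart and the value collapses; that adversarial pressure is what makes GANs a generative-modeling idea rather than a predictive or RL one.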
hyttioaoa · almost 4 years ago

"Generalized adversarial networks, or GANs, are a conceptual advance that allow reinforcement learning problems to be solved automatically."

"Generalized" :D Also the description is nonsense. This has nothing to do with reinforcement learning. Makes me wonder about the rest.
bjornsing · almost 4 years ago

I’m sorely missing Maximum Likelihood Estimation (MLE). It’s a statistical technique that goes back to Gauss and Laplace but was popularized by Fisher. In AI/ML it’s often referred to as “minimizing cross-entropy loss”, but this is just a misappropriation / reinvention of the wheel. The math is the same and MLE is a much more sane theoretical framework.
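The equivalence bjornsing points out is easy to check numerically: for a classifier, the average negative log-likelihood of the labels (the MLE objective) and the average cross-entropy against one-hot targets are the same number. A small numpy sketch with made-up probabilities:

```python
import numpy as np

# Labels and model probabilities for a 3-class problem (made-up numbers).
labels = np.array([0, 2, 1, 2])
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.3, 0.6],
    [0.2, 0.5, 0.3],
    [0.3, 0.3, 0.4],
])

# Average negative log-likelihood of the labels under the model (MLE objective).
nll = -np.mean(np.log(probs[np.arange(len(labels)), labels]))

# Average cross-entropy between one-hot targets and the model distribution.
one_hot = np.eye(3)[labels]
cross_entropy = -np.mean(np.sum(one_hot * np.log(probs), axis=1))

print(nll, cross_entropy)  # identical: same math, two names
```

The one-hot target zeroes out every term except the log-probability of the true label, which is exactly the per-example log-likelihood.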
ehw3 · almost 4 years ago

> 2. John Tukey (1977). Exploratory Data Analysis.

> This book has been hugely influential and is a fun read that can be digested in one sitting.

Wow. The PDF is over 700 pages. That seems fairly impressive for single-sitting digestion.
heinrichhartman · almost 4 years ago

Out of the 10 papers I am able to download 3 of them freely.

- For the papers I am quoted 26 EUR - 39 EUR
- For the books I am quoted 129 EUR - 133 EUR

This is audacious. Some of these papers are from the '70s, and I highly doubt that the authors get any royalties from those sales.
vcdimension · almost 4 years ago

I'm surprised they didn't mention support vector machines and the kernel trick, which was discovered by statisticians.
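For context, the kernel trick replaces an inner product in a high-dimensional feature space with a cheap kernel evaluation in the original space. A small numpy sketch for a degree-2 polynomial kernel (toy vectors; the helper names are made up for illustration):

```python
import numpy as np

def poly_kernel(x, y):
    """Degree-2 polynomial kernel: an inner product in a 6-D feature
    space, computed without ever building the feature vectors."""
    return (np.dot(x, y) + 1.0) ** 2

def feature_map(x):
    """Explicit feature space for the same kernel (2-D input -> 6-D)."""
    x1, x2 = x
    s = np.sqrt(2.0)
    return np.array([x1 * x1, x2 * x2, s * x1 * x2, s * x1, s * x2, 1.0])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

# Both routes give the same number; the kernel route never leaves 2-D.
print(poly_kernel(x, y))                              # prints 4.0
print(np.dot(feature_map(x), feature_map(y)))         # prints 4.0
```

This is why a kernel SVM can fit nonlinear boundaries while only ever computing dot products between training points.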
sgt101 · almost 4 years ago

How have they attributed GANs and Deep Learning to Statistics? I thought Goodfellow was doing an AI PhD and that Hinton is a biologically inspired / neuroscience fellow?
bmc7505 · almost 4 years ago

https://statmodeling.stat.columbia.edu/2020/12/09/what-are-the-most-important-statistical-ideas-of-the-past-50-years/
sjg007 · almost 4 years ago

<sarcasm> Psssh.. it's all math. </sarcasm>
master_yoda_1 · almost 4 years ago

Half of these are relevant to small-data problems, which is not exactly what we mean when we say AI.