Noam Chomsky on the Future of Deep Learning

20 points by Gedxx over 4 years ago

1 comment

bra-ket over 4 years ago
Deep learning is mostly irrelevant for AGI, but the best part of the article is bringing up the "recursive process called Merge".

This Merge [0] is called "chunking" in cognitive psychology [1, 2], first mentioned in the classic paper "The Magical Number Seven" by George A. Miller [3].

In the original Chomsky work [0] it is buried so deep in linguistics jargon that it's easy to miss the centrality of this concept, which is the essence of the generalization capability of the biological mind.

It's the JOIN in Leslie Valiant's LINK/JOIN model [4, 5]:

"The first basic function, JOIN, implements memory formation of a new item in terms of two established items: If two items A and B are already represented in the neural system, the task of JOIN is to modify the circuit so that at subsequent times there is the representation of a new item C that will fire if and only if the representations of both A and B are firing."

Papadimitriou & Vempala [6] extend it to a "predictive join" (PJOIN) model.

Edit: As I think about it, deep learning might be useful in implementing this "Merge" by doing nonlinear PCA (Principal Component Analysis) via stacked sparse autoencoders, kind of like in that "cat face detection" paper by Quoc Le [7]. The only thing missing is a hierarchical memory representation for those principal components, where NEW objects emerge by joining the most similar existing objects.

[0] https://en.wikipedia.org/wiki/Merge_(linguistics)

[1] https://en.wikipedia.org/wiki/Chunking_(psychology)

[2] http://www.columbia.edu/~nvg1/Wickelgren/papers/1979cWAW.pdf

[3] https://en.wikipedia.org/wiki/The_Magical_Number_Seven,_Plus_or_Minus_Two

[4] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.208.8491&rep=rep1&type=pdf

[5] https://www.amazon.com/Circuits-Mind-Leslie-G-Valiant/dp/0195126688

[6] https://arxiv.org/pdf/1412.7955.pdf

[7] https://ieeexplore.ieee.org/abstract/document/6639343
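A minimal sketch (in Python) of the JOIN behavior quoted above, assuming nothing about Valiant's actual neuroidal circuit construction; the ItemMemory class and its methods are hypothetical illustration only. It shows the stated contract: after joining established items A and B into a new item C, C fires if and only if both A and B are firing.

    # Toy sketch of the quoted JOIN operation (hypothetical illustration,
    # not Valiant's actual neuroidal circuit model): given two established
    # items A and B, allocate a new item C that fires iff both A and B fire.

    class ItemMemory:
        def __init__(self):
            self.firing = {}   # item name -> is it currently firing?
            self.joins = {}    # joined item -> (parent_a, parent_b)

        def add_item(self, name):
            self.firing[name] = False

        def join(self, a, b, c):
            # JOIN: form a new item c in terms of two established items a, b.
            assert a in self.firing and b in self.firing
            self.joins[c] = (a, b)
            self.firing[c] = False

        def present(self, *names):
            # Fire the named base items, then propagate to joined items:
            # a joined item fires iff both of its parents are firing.
            for n in self.firing:
                self.firing[n] = n in names
            for c, (a, b) in self.joins.items():
                self.firing[c] = self.firing[a] and self.firing[b]
            return {n for n, f in self.firing.items() if f}

    mem = ItemMemory()
    mem.add_item("A")
    mem.add_item("B")
    mem.join("A", "B", "C")
    print(mem.present("A"))       # {'A'}           -- C stays silent
    print(mem.present("A", "B"))  # {'A', 'B', 'C'} -- C fires with A and B

The same pattern could be iterated so that joined items serve as parents for further joins, which is roughly where the hierarchical memory of "merged" components mentioned in the edit would come in.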