
Neuralink Compression Challenge

18 points by crakenzak 12 months ago

12 comments

Terrk_ 11 months ago
Apparently, someone solved it and achieved a 1187:1 compression ratio. These are the results:

All recordings were successfully compressed. Original size (bytes): 146,800,526. Compressed size (bytes): 123,624. Compression ratio: 1187.47.

The eval.sh script was downloaded, and the files were encoded and decoded without loss, as verified with "diff".

What do you think? Is this true?

https://www.linkedin.com/pulse/neuralink-compression-challenge-cspiral-31pae/
Context: https://www.youtube.com/watch?v=X5hsQ6zbKIo
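As a quick sanity check on the arithmetic in the quoted results (figures taken directly from the comment above):

```python
# Figures quoted from the linked write-up.
original, compressed = 146_800_526, 123_624
ratio = original / compressed
print(f"{ratio:.4f}")  # -> 1187.4759; the quoted 1187.47 is this value truncated
```

So the reported ratio is internally consistent with the reported sizes, which says nothing about whether the decompression was actually lossless.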
djdyyz 11 months ago
Analyzing the data, it becomes clear that the A/D converter used by Neuralink is defective, i.e. has very poor accuracy. The A/D introduces a huge amount of distortion, which in practice manifests as noise.

Until this A/D linearity problem is fixed, there is no point pursuing compression schemes. The data is so badly mangled that it is nearly impossible to find patterns.
palaiologos 12 months ago
they're looking for a compressor that can do more than 200MB/s on a 10mW machine (that's including radio, so it has to run on a CPU clocked like the original 8086) and yield a 200x size improvement. speaking from the perspective of a data compression person, this is completely unrealistic. the best statistical models that i have on hand yield a ~7x compression ratio after some tweaking, but they won't run under these constraints.
ClassyJacket 12 months ago
So, they're asking skilled engineers to do work for them for free, and just email it in?

Why didn't every other company think of this?
djdyyz 12 months ago
200x is possible.

The sample data compresses poorly, getting down to 4.5 bits per sample easily with very simple first-order difference encoding and a decent Huffman coder.

However, let's assume there is massive cross-correlation between the 1024 channels. In the extreme case they are all identical, so encoding 1 channel gives us the other 1023 for free. That puts the lower limit at 4.5/1024 ≈ 0.0044 bits per sample, or a compression ratio of about 2275. Voilà!

If data patterns exist and can be found, then more complicated coding algorithms could achieve better compression, or tolerate more variation (i.e. less cross-correlation) between channels.

We may never know unless Neuralink releases a full data set, i.e. 1024 channels at 20 kHz and 10 bits for 1 hour. That's a lot of data, but if they want serious analysis they should release serious data.

Finally, the requirement for lossless compression has no apparent justification. The end result -- correct data to control the cursor and so on -- is what matters. Neuralink should let challengers submit data to a test engine that compares the cursor output for the noiseless data against the output for the submitted data and reports a match score, and maybe a graph. That sort of feedback might allow participants to create a satisfactory lossy compression scheme.
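The delta-plus-Huffman estimate above is easy to reproduce in spirit. The sketch below uses a synthetic random-walk channel as a stand-in for real electrode data (purely an assumption, not the challenge dataset) and measures the empirical entropy of the first-order differences; a Huffman coder over those deltas emits within 1 bit/sample of that entropy.

```python
import random
from collections import Counter
from math import log2

def delta_entropy_bits(samples):
    # Empirical entropy of the first-order differences; a Huffman
    # coder over these delta symbols achieves within 1 bit/sample of this.
    deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
    n = len(deltas)
    return -sum((c / n) * log2(c / n) for c in Counter(deltas).values())

# Hypothetical stand-in for one electrode channel: a slowly varying
# 10-bit random walk (NOT real Neuralink data).
rng = random.Random(0)
value, signal = 512, []
for _ in range(20_000):
    value = min(1023, max(0, value + rng.randint(-3, 3)))
    signal.append(value)

bits = delta_entropy_bits(signal)
print(f"~{bits:.2f} bits/sample, i.e. ~{10 / bits:.1f}x over raw 10-bit samples")
```

On real recordings the deltas are noisier than this toy walk, which is presumably how the 4.5 bits/sample figure arises rather than the smaller value this synthetic signal gives.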
crakenzak 12 months ago
This reminds me a lot of the Hutter Prize[1]. Funnily enough, the Hutter Prize shifted my thinking 180 degrees towards intelligence ~= compression, because to truly compress information well you *must* understand its nuances.

[1] http://prize.hutter1.net/
codingdave 12 months ago
And in exchange for solving their problem for them, you get... ???

I'm all for challenges, but it is fairly standard to have prizes.
raffihotter 12 months ago
200x compression on this dataset is mathematically impossible. The noise from the amplifier and digitizer limits the maximum compression to 5.3x.

Here's why: https://x.com/raffi_hotter/status/1795910298936705098
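The noise-floor argument can be sketched as a back-of-envelope entropy bound: additive Gaussian noise of standard deviation sigma LSBs contributes roughly log2(sigma * sqrt(2*pi*e)) incompressible bits per quantized sample, which caps the lossless ratio. The sigma values below are illustrative assumptions, not measurements from the challenge data; the linked thread derives the actual 5.3x figure.

```python
from math import log2, pi, e

def lossless_ceiling(bits_per_sample, noise_std_lsb):
    # Quantized additive Gaussian noise of std sigma (in LSBs; sigma should
    # be well above ~0.25 for the bin-entropy approximation to hold) carries
    # about log2(sigma * sqrt(2*pi*e)) incompressible bits per sample.
    noise_bits = log2(noise_std_lsb * (2 * pi * e) ** 0.5)
    return bits_per_sample / noise_bits

# Illustrative noise levels for a 10-bit ADC (assumed, not measured):
for sigma in (0.5, 1.0, 2.0):
    print(f"sigma = {sigma} LSB -> lossless ceiling ~{lossless_ceiling(10, sigma):.1f}x")
```

Even one LSB of Gaussian noise already pulls the ceiling down to roughly 5x, which is the shape of the argument in the linked analysis.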
djdyyz 12 months ago
Check out this link for background info: https://mikaelhaji.medium.com/a-technical-deep-dive-on-elon-musks-neuralink-in-40-mins-71e1100f54d4#43cf
fattless 12 months ago
"aside from everything else, it seems like it's really, really late in the game to suddenly realize 'oh we need magical compression technology to make this work don't we'"

https://x.com/JohnSmi48253239/status/1794328213923188949?t=_8K1rncHLesiy46IqaIMbA&s=19
iamcreasy 12 months ago
< 10mW, including radio

Does that mean the radio uses a portion of this 10mW? If so, how much?
jappgar 12 months ago
Why should it be lossless, when presumably there is a lot of noise you don't really need to preserve?