Analog computing may be coming back

99 points by mbellotti over 2 years ago

19 comments

lambdaxymox over 2 years ago
It's interesting that the black art of analog design never really goes away, even in computing. I was reading Geoffrey Hinton's "Mortal Computation", where he briefly speculates towards the end about low-power (in watts used, not capability) neural networks embedded in hardware via something like memristor networks.

It makes me imagine neural networks being distributed as a template (i.e. a blob of XML describing the topology and the weights), with the weights then tuned a little differently for each device or device model to get the best performance per watt out of them. Since every device is physically a little bit different from the next, in precision analog applications with discrete components the components have to be matched; this is often the case when assembling differential pairs in analog audio circuits. Similarly with the control-system parameters in hard drives: since each DC motor is a little bit different from the next, the control loop gets tuned at the factory for each drive. So in this case, the neural network gets matched to the circuit it's embedded in. Mortal computation indeed: each neural network becomes truly unique. I could be full of it, but it's fun to imagine at least.
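
A toy numpy sketch of that matching step (the per-cell error model and the probe-and-reprogram calibration are both invented for illustration):

    import numpy as np

    rng = np.random.default_rng(42)

    # The shared "template": ideal weights every unit ships with.
    W_template = rng.normal(size=(4, 8))

    # Each physical device realizes weights imperfectly, e.g. every
    # memristor cell's conductance is off by a few percent (toy model).
    cell_error = 1.0 + 0.05 * rng.normal(size=W_template.shape)

    def device_matvec(W_programmed, x):
        # What this particular piece of hardware actually computes.
        return (W_programmed * cell_error) @ x

    # Per-device tuning: probe with basis vectors to measure the
    # effective weights, then reprogram to cancel the measured error.
    measured = np.stack([device_matvec(W_template, e) for e in np.eye(8)], axis=1)
    W_tuned = W_template * (W_template / measured)

    x = rng.normal(size=8)
    ideal = W_template @ x
    print(np.abs(device_matvec(W_template, x) - ideal).max())  # noticeable error
    print(np.abs(device_matvec(W_tuned, x) - ideal).max())     # ~1e-16
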
choudharism over 2 years ago
The mark of a maturing domain is the evolution from only general tools to general + specialized. We've gone from only CPUs to CPU + GPU to specialized AI chips (Neural Engine, Tensor chips etc.), and specialized computing is a big tent which can fit many different architectures together.

Analog computing is the closest thing to bioengineering in fundamental computer science that I know of, so I am confident that it will find a niche. I remember reading about Mythic AI here on HN, who were doing some cool work with analog computing chips for ML. My hunch is that matrix multiplication is the most expensive mathematical operation we do as a society (not unit expensive, but in overall absolute cost), and our progress in AI is directly proportional to how easy / cheap it is to run.
singularity2001 over 2 years ago
Analog will likely come back, but for other reasons: neural networks don't require precise calculations, and Hinton's forward-forward networks put into hardware would be several orders of magnitude more efficient, even without photons. "AI inferencing is heavily dependent on multiply/accumulate operations, which are highly efficient in analog."

If you know of any startup working on this, let me know, because I'd love to join the revolution.
orbifold over 2 years ago
One of the things that I fully do not expect to be successful is optical computing. There are just a lot of academic groups doing optics, and they like to invent new reasons why whatever they are up to is relevant. For physics reasons the integration density of optical compute elements is abysmal and will remain so *forever*. Other technologies like spintronics at least have a chance of working sometime in the future. There were already projects on wafer-scale optical computing at MIT Lincoln Labs in the 80s-90s, so this isn't exactly a new idea either. We have a new group at our institute doing "Neuromorphic Quantum Photonics"; they publish in high-impact glossy journals, which doesn't change that it is, in my opinion, mostly hype and bullshit.
xor99 over 2 years ago
Analog computers can perform matmul operations without data movement using physical properties such as conductance changes in a very small volume. If noise and random variation can be modelled successfully then in certain cases they are obviously going to be better (e.g. energy use in edge applications). The discussion is not specific enough to applications to be useful. AI data centres are not going to be using this stuff anytime soon for example. On the other hand, you do not want an NVIDIA GPU inserted into your body.
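
A minimal sketch of why that works, with invented magnitudes: store the matrix as conductances, apply the input as voltages, and the physics does the multiply-accumulate in place:

    import numpy as np

    rng = np.random.default_rng(0)

    # Matrix stored as conductances G; input applied as voltages v.
    # Ohm's law does the multiplies, Kirchhoff's current law does the
    # adds: output currents i = G @ v, with no weights ever moved.
    G = rng.uniform(1e-7, 1e-6, size=(64, 64))   # programmed conductances (S)
    v = rng.uniform(0.0, 0.2, size=64)           # input voltages (V)

    ideal = G @ v

    # What you'd actually read back: per-cell programming variation
    # plus additive read noise (both magnitudes invented).
    G_real = G * (1.0 + 0.03 * rng.normal(size=G.shape))
    i_meas = G_real @ v + 1e-9 * rng.normal(size=64)

    rel_err = np.linalg.norm(i_meas - ideal) / np.linalg.norm(ideal)
    print(f"relative error: {rel_err:.1%}")  # a few percent: maybe fine for edge inference
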
Aldipower over 2 years ago
No thread on this topic is complete without a mention of Bernd Ulmann. Here is a video where he demonstrates the Joukowski profile: https://www.youtube.com/watch?v=nP84Dv01y4A
zozbot234 over 2 years ago
Analog computing sucks. It's inherently sensitive to noise and to the random device variation that's basically ubiquitous in modern chip-making processes. Digital electronics actively reject noise at every step in a computation, albeit at the cost of wasting energy in the process. With analog, a more complex computation becomes exponentially harder.
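
The compounding is easy to simulate (noise level invented):

    import numpy as np

    rng = np.random.default_rng(1)
    stages, sigma = 1000, 0.02

    # Analog: each stage passes the value along and adds a little noise.
    x_analog = 1.0
    for _ in range(stages):
        x_analog += sigma * rng.normal()

    # Digital: same noise, but each stage regenerates to the nearest rail
    # (the energy spent at every step buys exactly this rejection).
    x_digital = 1.0
    for _ in range(stages):
        x_digital += sigma * rng.normal()
        x_digital = 1.0 if x_digital > 0.5 else 0.0

    print(x_analog)   # random walk: off by ~sqrt(1000)*0.02 ~= 0.6
    print(x_digital)  # still exactly 1.0
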
rasz over 2 years ago
From the Travis Blalock (first real optical mouse) oral history:

"each array element had nearest neighbor connectivity so you would calculate nine correlations, an autocorrelation and eight cross-correlations, with each of your eight nearest neighbors, the diagonals and the perpendicular, and then you could interpolate in correlation space where the best fit was."

"And the reason we did difference squared instead of multiplication is because in the analog domain I could implement a difference-squared circuit with six transistors and so I was like “Okay, six transistors. I can’t do multiplication that cheaply so sold, difference squared, that’s how we’re going to do it.”"

"little chip running in the 0.8 micron CMOS could do the equivalent operations per second to 1-1/2 giga operations per second and it was doing this for under 200 milliwatts, nothing you could have approached at that time in the digital domain."

The Avago H2000 chip did all the heavy lifting _in the analog domain_. No DSP; it was too expensive to do digitally (compare the first civilian handheld GPS receivers, which also did heavy autocorrelation: the 1998 Garmin StreetPilot was $400-550 retail).
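
What the sensor computed, redone digitally as a sketch: a nine-way sum-of-squared-differences search over the center and its eight neighbors (sizes and data invented):

    import numpy as np

    def best_shift(prev, curr):
        # Sum-of-squared-differences over the 9 one-pixel shifts
        # (center + 8 neighbors); smallest score = best fit.
        scores = {}
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                shifted = np.roll(np.roll(curr, dy, axis=0), dx, axis=1)
                d = prev[1:-1, 1:-1] - shifted[1:-1, 1:-1]  # ignore wrapped border
                scores[(dy, dx)] = np.sum(d * d)            # the six-transistor op
        return min(scores, key=scores.get)

    rng = np.random.default_rng(0)
    frame = rng.random((16, 16))
    moved = np.roll(frame, 1, axis=1)  # scene slid one pixel right
    print(best_shift(frame, moved))    # -> (0, -1): shifting back left realigns it
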
ghoul2 over 2 years ago
As I understand it, analog computing is entirely impractical simply from an information-theoretic viewpoint.

For a signal to convey 8 bits worth of information, it needs 256 distinct levels. If we want the signal to range from, let's say, 0 to 5V (which is already quite high), each level only gets about 20mV of range. That much can easily come from cross-talk, EMI, and power-supply noise, so all your logic/calculation will be wrong.

Once you start talking about 16 bits, it becomes entirely ridiculous: each level now gets only about 75uV of range. This is getting into RF-interference territory: just receiving a phone call close to such an analog signal would disrupt it.

The way I understand it, there is simply not enough SNR available in our electronics (on-die traces or PCB traces) for analog computing to work. That's why we restrict the number of levels we use in our signals: digital being just two levels, and even with higher level counts we typically use 4 or 8 levels. This is somewhat analog, but not really.

I am not an EE, so I am entirely open to being corrected on this.
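
The arithmetic, for anyone who wants to poke at it:

    # Voltage per level when packing n bits into a 0-5 V swing.
    for bits in (1, 8, 16):
        levels = 2 ** bits
        print(f"{bits:2d} bits -> {levels:6d} levels -> {5.0 / levels * 1e3:8.3f} mV each")
    #  1 bits ->      2 levels -> 2500.000 mV each
    #  8 bits ->    256 levels ->   19.531 mV each
    # 16 bits ->  65536 levels ->    0.076 mV each
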
jefurii over 2 years ago
Analog computing has been used by musicians for a while now.

Synthesizers are basically analog computers. Bob Moog was an engineer whose genius was figuring out how to connect keyboards to lab equipment and how to hide enough of the guts to make the gear approachable to musicians. West Coast synthesists like Buchla took the opposite approach, appreciating the sound of analog computing for what it is.

Synthesizers tried to hide it for a while behind layers of user interface, but especially with the Eurorack boom of the last decade or so you can really see that synthesizers are simply specialized analog computers. Lots of synth modules openly use the same terminology as analog computing: filters, amplifiers, multipliers, low-pass gates, sample-and-hold, sequencers, etc. Musicians like Hainbach use actual test equipment in their music.

Guitar effect rigs are also basically analog computers; they're just not used for numerical computation.

Update: The Signal State is a Zach-like game where you solve puzzles by programming analog computers; it was inspired by Eurorack synthesizers.
asicsp over 2 years ago
See also: https://semiengineering.com/can-analog-make-a-comeback/

Discussion: https://news.ycombinator.com/item?id=32106546 (130 points | 6 months ago | 125 comments)
jerf over 2 years ago
It seems to me that punters overly excited to declare that everything we've ever done up to this point is wrong and there's a new paradigm coming that will obsolete everything, as well as people being critical of the new thing, tend to miss a very important aspect of digital computation that will prevent analog computation from *ever* simply "taking over": digital computation is essentially infinitely composable.

It does not matter how many gates you throw your 1s and 0s through; they will remain ones and zeros. While floating-point numerical computations carry challenges, they are at least *deterministic* challenges. You can build something like a SHA512 hash, which can ingest gigabytes upon gigabytes, with the entire computation critically dependent at *every step* upon *all previous computations in the process*, a cascading factor of literally billions and billions, and deterministically get the exact same SHA512 hash for the exact same input every time.

This property is so reliable that we don't even think about it.

Analog computers cannot do that. You could never build a hash function like that out of analog parts. You cannot take the output of an analog computer and feed it back into the input of another analog computation, and then do it *billions upon billions of times*, and get a reliable result. Such a device would simply be a machine for generating noise.

Analog computing fits into the computing paradigm as another "expansion card". It may take isolated computations and perform them more efficiently, perhaps even important computations. But they will *always* be enmeshed in some digital computer paradigm. Breathless reports about how they're "coming back" and coming soon and taking over are just nonsense. (I speak generally; this walled-off article may or may not have made such claims, I dunno.) So many things about how digital computers work that you just take for granted are simply impossible for analog computers, structurally; something as simple as taking a *compressed* representation of a starting state for your analog computer is something you need a digital computer for, because our best compression algorithms have the same deep data dependencies that I mentioned for the hashing case.

Useful, interesting, innovative, gonna make some people some money and create some jobs? Sure. Something we should all go gaga over? No more than a new database coming out. It's going to be a tool, not a paradigm shift.
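
The determinism and the deep data dependence are both easy to see with Python's standard hashlib:

    import hashlib

    data = b"x" * 10_000_000  # ~10 MB; every block chains off the previous one

    h1 = hashlib.sha512(data).hexdigest()
    h2 = hashlib.sha512(data).hexdigest()
    assert h1 == h2  # bit-exact, every run, every machine

    # Flip a single bit and the output is unrecognizably different:
    flipped = bytearray(data)
    flipped[0] ^= 1
    h3 = hashlib.sha512(bytes(flipped)).hexdigest()
    print(sum(a != b for a, b in zip(h1, h3)), "of 128 hex digits differ")
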
shrubble over 2 years ago
I think the author is wrong.

An analog computer's weak point is the power supply, such that many companies making them ended up having to manufacture their own to very high standards, with components like big capacitors at 0.1% tolerance. The reason is that you are computing with the analog voltage itself, so poor power regulation leads directly to inaccurate results.

With newer analog computer setups more is integrated into the chip itself, making power-supply issues much less of a problem.
amatic over 2 years ago
Another interpretation of the name "analog computer" is that they compute by analogy: by simulating the problem in a different medium. When analog computers were created, there were no digital computers to make the distinction between analog (continuous) and digital (discrete). There were mechanical computers that used gears and shafts, with angular speed as the main variable; electronic computers that used voltages; a famous hydraulic computer that used water levels and rates of flow to represent variables; etc.

You start by examining your problem in mathematical terms and writing down the differential equations that describe it. For example, you have a model of a car suspension with parameters for spring stiffness, damping, etc. You put the model into the computer and play with the parameters to see how things behave. Not that different from modern simulations. The one advantage over modern simulations is that you might get a better "feel" for the system - or so the proponents of analog used to say, before digital computers replaced them.
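
A digital rendition of that suspension example (parameter values illustrative); on an analog machine the two integration steps in the loop would be two op-amp integrators:

    # Quarter-car suspension, m*x'' + c*x' + k*x = 0, after a 10 cm bump.
    m, c, k = 250.0, 1500.0, 20000.0   # kg, N*s/m, N/m (illustrative)
    x, v = 0.1, 0.0                    # initial displacement and velocity
    dt = 0.001

    for step in range(3000):           # simulate 3 seconds
        a = -(c * v + k * x) / m       # acceleration from the ODE
        v += a * dt                    # "integrator" #1
        x += v * dt                    # "integrator" #2
        if step % 500 == 0:
            print(f"t={step * dt:4.1f}s  x={x * 100:6.2f} cm")

Twiddling c or k and re-running is the digital analogue of turning the potentiometers.
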
jojobas over 2 years ago
Probably the best 20 minutes you can spend if you haven't really heard of analog computers: https://www.youtube.com/watch?v=IgF3OX8nT0w
MichaelMoser123 over 2 years ago
Isn't a quantum computer a kind of analog computer? Wikipedia says "An analog computer or analogue computer is a type of computer that uses the continuous variation aspect of physical phenomena": https://en.wikipedia.org/wiki/Analog_computer
ratrocket over 2 years ago
https://archive.ph/DeK8H
gjvc over 2 years ago
paraffin lamps may be coming back, too
giladvdn over 2 years ago
The absurdity of suggesting optical computing as a good pathway to efficiency is that our brains use electrons efficiently and are doing just fine.