The thesis on which this is based:<p><a href="http://phm.cba.mit.edu/theses/03.07.vigoda.pdf" rel="nofollow">http://phm.cba.mit.edu/theses/03.07.vigoda.pdf</a><p>edit: p. 135 is where he starts discussing the implementation in silicon.
I'm curious how they deal with probabilities very close to 1 or 0. Usually when people do Bayesian computations they work in log-odds (logit) space, so that values close to 1 or 0 keep effectively unbounded precision. That seems like a hard thing to do with an analog circuit.
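For anyone unfamiliar, here's a quick sketch (mine, not anything from the article) of why the log-odds representation keeps precision near 0 and 1:

    # Sketch: log-odds maps (0, 1) onto the whole real line, so values
    # crowded up against 0 or 1 spread out into comfortable magnitudes.
    import math

    def logit(p):
        # log-odds of a probability p in (0, 1)
        return math.log(p / (1.0 - p))

    def inv_logit(x):
        # inverse transform back to a probability
        return 1.0 / (1.0 + math.exp(-x))

    # 1 - 1e-12 already rounds to exactly 1.0 in single precision,
    # but its log-odds value (~27.6) is easy to represent.
    p = 1.0 - 1e-12
    print(logit(p))         # ~27.6
    print(inv_logit(27.6))  # back to ~1 - 1e-12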
Are we likely to see more domain-specific chips in the future? Something like what <a href="http://www.deshawresearch.com/" rel="nofollow">http://www.deshawresearch.com/</a> has built: Anton, a custom chip optimised for molecular dynamics simulations.
Isn't this just the revenge of the analog computer?<p>Not saying it's a bad idea... I'm all for revisiting assumptions in computer design.
I thought I'd heard something like this before. From 2004:
<a href="http://www.eetasia.com/ART_8800354714_499488_NT_92255b4a.HTM" rel="nofollow">http://www.eetasia.com/ART_8800354714_499488_NT_92255b4a.HTM</a><p>That's a turbo decoder rather than a generic probability calculator, but it's doing the probability calculations in the analog domain.<p>This sort of thing may make sense for error correction, but I don't think people will run general probability calculations on it. Too difficult to debug :-)<p>Though I do wonder whether it can simulate a neuron more efficiently than digital logic can.
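For a sense of the kind of arithmetic these circuits do, here's a toy sketch (mine, not from the thesis or the article) of the soft-XOR / parity-check update at the heart of turbo and LDPC decoding, in both probability and log-likelihood-ratio form:

    # Soft-XOR (parity-check) message update, the basic building block
    # of turbo/LDPC decoders. Illustrative only.
    import math

    def soft_xor_prob(p1, p2):
        # P(b1 XOR b2 = 1), given P(b1=1)=p1 and P(b2=1)=p2
        return p1 * (1 - p2) + (1 - p1) * p2

    def soft_xor_llr(l1, l2):
        # same update with log-likelihood ratios L = log P(b=0)/P(b=1)
        return 2.0 * math.atanh(math.tanh(l1 / 2.0) * math.tanh(l2 / 2.0))

    p1, p2 = 0.9, 0.8
    l1, l2 = math.log((1 - p1) / p1), math.log((1 - p2) / p2)
    print(soft_xor_prob(p1, p2))                      # 0.26
    print(1.0 / (1.0 + math.exp(soft_xor_llr(l1, l2))))  # also 0.26

Doing this with multipliers and adders in the analog domain instead of digital arithmetic is, as I understand it, the basic trick.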
Sounds a lot like the ByNase protocol that Ward Cunningham (inventor of the wiki) came up with:<p><a href="http://c2.com/cybords/wiki.cgi?BynaseProtocol" rel="nofollow">http://c2.com/cybords/wiki.cgi?BynaseProtocol</a>
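If I remember right, ByNase conveys a value as the statistics of a bit stream: roughly, the fraction of 1s observed on a pin over a sampling window. A toy sketch of that idea (my reading of it, not Ward's actual spec):

    # Value-as-bit-statistics encoding: emit 1s with probability equal to
    # the value, recover the value by averaging what you observe.
    import random

    def encode(value, n_bits=1000):
        return [1 if random.random() < value else 0 for _ in range(n_bits)]

    def decode(bits):
        return sum(bits) / len(bits)

    print(decode(encode(0.37)))  # roughly 0.37, noisier with fewer bits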
Printer friendly, (almost) no ads, no pointless images:<p><a href="http://www.technologyreview.com/printer_friendly_article.aspx?id=26055&channel=computing&section=" rel="nofollow">http://www.technologyreview.com/printer_friendly_article.asp...</a>