I'd hoped to see opinions on unum [1].

Correct me if I'm wrong, but most values in machine learning sit around 1.0. Unums should give more precision for the same number of bits, or the same precision for fewer bits, near 0 and 1, plus some other interesting features.

But they would require new hardware and software.

[1] https://en.wikipedia.org/wiki/Unum_(number_format)
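The precision point is easy to make concrete for ordinary IEEE floats: their absolute spacing (ULP) grows with magnitude, so fine resolution is already concentrated near 1.0, and posits/unums go further by spending extra significand bits in exactly that region. A minimal sketch using only the standard library (this illustrates float spacing, it is not a posit implementation):

    import math

    # Spacing (ULP) of IEEE 754 double-precision floats at a few magnitudes.
    # Values near 1.0 get far finer absolute spacing than large-magnitude
    # values -- the regime where most normalized ML quantities live.
    for x in [0.001, 0.1, 1.0, 10.0, 1000.0]:
        print(f"ulp({x:g}) = {math.ulp(x):.3e}")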
There's also the whole world of fixed-point inference, which isn't discussed here but is quite important. All of the hardware supports fast integer operations, and with fewer platform-specific caveats, so you can get a better guarantee of consistent behavior across deployments.
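For context, here is a minimal sketch of what fixed-point (symmetric int8) inference looks like in NumPy. The per-tensor scale choice is an illustrative assumption, not any particular framework's recipe; the key point is that the matmul runs on integers with an int32 accumulator and only rescales to float at the end:

    import numpy as np

    def quantize(x, num_bits=8):
        """Symmetric per-tensor quantization: map floats to int8 with a scale."""
        qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
        scale = np.max(np.abs(x)) / qmax        # assumes x is not all zeros
        q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
        return q, scale

    # Toy "layer": y = x @ w, computed with integer arithmetic.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 16)).astype(np.float32)
    w = rng.standard_normal((16, 8)).astype(np.float32)

    xq, sx = quantize(x)
    wq, sw = quantize(w)

    # Accumulate in int32 to avoid overflow, then rescale back to float.
    acc = xq.astype(np.int32) @ wq.astype(np.int32)
    y_int = acc * (sx * sw)

    y_ref = x @ w
    print("max abs error:", np.max(np.abs(y_int - y_ref)))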
> Floating point? In MY deep learning?<p>It's more likely than you think.<p>Maybe not the most appropriate place for an "X? in MY y?" meme despite its relatively innocuous presentation<p>It's kind of gross so I'll refrain from linking it