By "fundamental", I mean research meant to overturn or alter a field from the ground up—sort of the opposite of "incremental" in this context, I think.<p>For myself, as a machine learning researcher, I recently learned about Neural Turing Machines and Differentiable Neural Computers, and feel it's a shame they seem to have died off in popularity over the past few years. The idea of using a neural network to approximate the behavior of a CPU in a von Neumann computer is one I find very compelling.<p>I'm always interested in reading papers from interesting fields where the thesis of the paper is, roughly, "we might have been doing a basic aspect of the discipline wrong all along, and here's a well-reasoned argument why." Or, alternatively, "here's a creative, innovative approach to an old problem and why we think it warrants further research."<p>I'm interested in hearing about other fields besides my own!
I know that embeddings are old news for ML practitioners but for my field (technical writing) they open up lots of interesting new workflows and opportunities. So for me, embeddings are the most exciting fundamental research happening in my industry.
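One of those workflows is semantic search over documentation: embed each page once, embed a reader's query, and rank pages by vector similarity instead of keyword overlap. Here's a minimal sketch of that idea. The `embed` function below is a hypothetical stand-in for a real embedding model (in practice you'd call a model or API); it just counts character bigrams so the example is self-contained, but the ranking workflow around it is the same.

```python
import math

def embed(text):
    # Hypothetical stand-in for a real embedding model: a sparse
    # vector of character-bigram counts. Real embeddings would come
    # from a trained model, but the downstream workflow is identical.
    vec = {}
    t = text.lower()
    for i in range(len(t) - 1):
        bigram = t[i:i + 2]
        vec[bigram] = vec.get(bigram, 0) + 1
    return vec

def cosine(a, b):
    # Cosine similarity between two sparse vectors (dicts).
    dot = sum(count * b.get(key, 0) for key, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Workflow: embed each doc page once, then rank pages against a query.
docs = {
    "install": "How to install the CLI and configure your PATH",
    "auth": "Authenticating with API keys and rotating credentials",
    "errors": "Troubleshooting common error codes and retries",
}
doc_vectors = {name: embed(text) for name, text in docs.items()}

query = "setting up api key authentication"
query_vector = embed(query)
ranked = sorted(doc_vectors,
                key=lambda name: cosine(query_vector, doc_vectors[name]),
                reverse=True)
print(ranked[0])
```

With a real embedding model the same loop also powers related-page suggestions and duplicate detection across a doc set, since every comparison reduces to a similarity score between precomputed vectors.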