It's just amazing how deep learning made most of that earlier work completely irrelevant.

The state of natural language processing before 2010 looks ridiculous now: dynamic programming algorithms, beam search, dependency (grammar) parsing algorithms (going from O(n^3) to O(n) with cost-sensitive approaches), a huge focus on lexical analysis and part-of-speech tagging, graphical models (maximum entropy, conditional random fields, etc.).

Today all of these algorithms are completely irrelevant. No one needs part-of-speech tags anymore, or dependency (grammar) trees, or cost-sensitive reinforcement learning reductions.

I remember being so inspired by all of that work, and I learned a lot from it, but it's quite funny how the Lindy effect plays out.
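For anyone who never saw that era: the O(n^3) vs O(n) bit refers, as far as I recall, to chart parsing versus transition-based parsing, where a classifier (which is where the cost-sensitive reductions came in) picks shift/attach actions one token at a time. A rough toy sketch of the arc-standard version below, with the classifier stubbed out; the names are mine, not from any real parser or library:

    # Toy sketch of arc-standard transition-based dependency parsing.
    # Every token is SHIFTed onto the stack once and removed by exactly one
    # arc, so a sentence of n words takes about 2n transitions -- that's the
    # O(n) story, versus O(n^3) chart parsing. In a real system the transition
    # is chosen by a trained classifier (e.g. via a cost-sensitive reduction);
    # here it's a stub.

    def parse(tokens, choose_transition):
        stack, buffer = [], list(range(len(tokens)))   # indices into tokens
        arcs = []                                      # (head, dependent) pairs
        while buffer or len(stack) > 1:
            action = choose_transition(stack, buffer)
            if action == "SHIFT" and buffer:
                stack.append(buffer.pop(0))
            elif action == "LEFT_ARC" and len(stack) >= 2:
                dep = stack.pop(-2)                    # second-from-top gets a head
                arcs.append((stack[-1], dep))
            elif action == "RIGHT_ARC" and len(stack) >= 2:
                dep = stack.pop()                      # top gets a head
                arcs.append((stack[-1], dep))
            elif buffer:                               # keep the loop moving
                stack.append(buffer.pop(0))
            else:
                break
        return arcs                                    # word left on the stack is the implicit root

    # Stand-in "classifier": shift everything, then attach right to left.
    def toy_oracle(stack, buffer):
        return "SHIFT" if buffer else "RIGHT_ARC"

    print(parse("the cat sat".split(), toy_oracle))    # -> [(1, 2), (0, 1)]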