This builds on the ideas behind PathNet, previously discussed at <a href="https://news.ycombinator.com/item?id=13675891" rel="nofollow">https://news.ycombinator.com/item?id=13675891</a><p>Whereas PathNet permanently freezes parameters and pathways used for previously learned tasks, here the authors compute how important each connection is to the most recently learned task, and protect each connection from future modification by an amount proportional to its importance. Important pathways tend to persist, and unimportant pathways tend to be discarded, gradually freeing "underused" connections for learning new tasks.<p>The authors call this process Elastic Weight Consolidation (EWC). Figure 1 in the paper does a great job of explaining how EWC finds solutions that are good for new tasks without incurring significant losses on previous tasks.<p>Very cool!
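The "protect in proportion to importance" idea boils down to a quadratic penalty: the paper anchors each parameter to its post-task-A value, weighted by its (diagonal) Fisher information. A minimal sketch of that penalty, with made-up toy numbers and an illustrative function name (the Fisher estimation itself is omitted):

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    fisher holds a per-parameter importance estimate (diagonal Fisher
    information from the previous task); a large F_i pulls parameter i
    strongly back toward its old value theta*_i, while a small F_i
    leaves it nearly free to change for the new task.
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Toy illustration: two parameters, the first important to the old task.
theta_star = np.array([1.0, -2.0])   # parameters after learning task A
fisher = np.array([10.0, 0.1])       # importance: param 0 matters, param 1 barely
theta = np.array([1.5, 0.0])         # candidate parameters while learning task B

# Moving the important parameter costs far more than the unimportant one:
# 0.5 * (10 * 0.5**2 + 0.1 * 2.0**2) = 1.45
print(ewc_penalty(theta, theta_star, fisher))
```

During training on task B, this term is simply added to task B's loss, so gradient descent trades off new-task performance against drifting the connections that mattered for task A.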