Spielman was my algorithms professor in college and legitimately one of the smartest people, in terms of raw intelligence, I’ve ever met. He was a great teacher, able to explain the most complicated of problems in language that I could intuitively understand and get excited about. Before that class I had a lot of problems with the math side of CS (still definitely do), but he really made the subject approachable for me. I started the class getting C’s, but after visiting his office hours nearly weekly, by the end of the semester I was getting 100s on my problem sets.<p>That class, and the way he taught it, convinced me that advanced math, whilst hard for me, is not beyond my grasp or really that of anyone. It’s just a matter of breaking problems down into manageable chunks and recognizable sub-problems, a skill at which Spielman is especially adept.<p>I knew he was smart then (he had just received a $500k MacArthur “genius” grant), but reading this article really put into perspective how effective he is and how lucky I was to have him as a professor. The same skill for breaking problems down that helped me succeed in his class seems to be what enabled him to solve this one.
Quick note that the article is from 2015, and the MSS result dates back to 2013.<p>There's a blog post by Nikhil here that explains the proof of the discrepancy result that implies Kadison-Singer: <a href="https://windowsontheory.org/2013/07/11/discrepancy-graphs-and-the-kadison-singer-conjecture-2/" rel="nofollow">https://windowsontheory.org/2013/07/11/discrepancy-graphs-an...</a><p>The main result is a theorem that replaces a Chernoff bound (saying here that a random partition has discrepancy logarithmic in the number of vectors, with high probability) with a better bound on the discrepancy, independent of the number of vectors, at the cost of replacing the "with high probability" with "there exists".<p>The proof uses some really beautiful techniques from the geometry of polynomial roots and is a pretty fun read: <a href="https://arxiv.org/abs/1306.3969" rel="nofollow">https://arxiv.org/abs/1306.3969</a>
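For intuition, here's a minimal sketch of the "discrepancy" quantity being discussed: take vectors in isotropic position with small norms, split them at random into two groups, and see how far one group's sum of outer products lands from half the identity. This is my own toy construction (a random orthonormal frame and arbitrary dimensions), not anything from the blog post or the paper, and it only illustrates the random-partition side that the Chernoff-type bound controls.

```python
import numpy as np

# Sketch (assumed setup, not from the paper): vectors v_1..v_n with
# sum_i v_i v_i^T = I and small individual norms, randomly split into
# two halves; the "discrepancy" is how far one half's sum is from I/2
# in spectral norm.

rng = np.random.default_rng(0)
d, n = 20, 2000

# Columns of M are the v_i: take M = Q^T where Q has orthonormal
# columns, so M @ M.T = I_d and each ||v_i||^2 is roughly d/n.
Q = np.linalg.qr(rng.standard_normal((n, d)))[0]  # shape (n, d), Q^T Q = I_d
M = Q.T                                           # shape (d, n)

S = rng.random(n) < 0.5                  # random 2-partition
half_sum = M[:, S] @ M[:, S].T           # sum of v_i v_i^T over one side
disc = np.linalg.norm(half_sum - np.eye(d) / 2, 2)

print(f"max ||v_i||^2      ~ {np.max(np.sum(M**2, axis=0)):.4f}")
print(f"random-partition discrepancy ~ {disc:.4f}")
```

The MSS theorem says that, when the individual norms are small, some partition achieves a discrepancy bound that doesn't grow with n; the existence is what you lose relative to the "with high probability" statement above.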
"He guessed that the problem might take him a few weeks. Instead, it took him five years"<p>Perhaps the key to their success was typically programming work estimations ;)<p>More seriously, I think that if you think something is easy for you, or that you are almost there, you may be more willing to put in the necessary time, than if you knew it would take 5 years from the outset.<p>"Spielman realized that he himself might be in the perfect position to solve it. “It seemed so natural, so central to the kinds of things I think about,” he said. “I thought, ‘I’ve got to be able to prove that.’”"
Great story that reminds me of a thought I've had. There's just so much knowledge out there - and growing - that researchers have to be increasingly specialized. Isn't it inevitable that 4 years of undergrad + 1-2 yrs of grad school classes won't prepare you sufficiently to do novel research? Imagine when people have to be in school for something like 10 years before they know enough to do "real" work! And people will necessarily have to become more and more specialized, making situations like the one in the article even rarer.<p>Just a thought.