This is a good article, with nice insight into what cache coherency implies for the future of concurrency, but the semantics seem misleading. The word 'thread' is so overloaded that I'm not sure it's helpful here.

Yes, operating systems frequently allow processes to share the same data space, and we call these 'threads'. Yes, CPUs allow multiple code paths to be in progress on the same core, and we also call these 'threads'. And as the author knows (based on the replies to comments), neither of these really constrains the form 'threads' take at the level the programmer thinks about.

So I'm not sure what use there is in concluding that threads are "not going anywhere". This is trivially true to the extent that 'threads' are synonymous with 'concurrency', but does it say more than "concurrency is here to stay"?