If we look at Moore's Law as it was originally stated, i.e., w/r/t transistor density on chips, then yes, it is up against a serious obstacle in the near future. That obstacle is quantum physics. Below a certain nanometer count, you're working at truly atomic scale, and Heisenberg effects start to kick in.

Quantum computing may solve this issue, but realistically, probably not at a pace quick enough to keep Moore's Law on track the way it has been historically. More likely, we'll hit a plateau for a while and eventually break through it. When that happens, computers will be very different machines from what they are now. The shift from transistor-based computers to quantum computers will be akin to the shift from vacuum tubes to transistors.

In the meantime, we're probably just going to load up on more and more parallel processors.
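(For the "as originally stated" part, here's a minimal sketch of what the classic density formulation looks like numerically, assuming the usual "doubling roughly every two years" reading; the baseline density and the year offsets are just illustrative placeholders, not real process data.)

    # Illustrative sketch of Moore's Law as a doubling curve.
    # Assumes the common "doubling roughly every two years" formulation;
    # the baseline density and year offsets below are hypothetical placeholders.

    DOUBLING_PERIOD_YEARS = 2.0

    def projected_density(base_density: float, years_elapsed: float) -> float:
        """Project transistor density forward, doubling every DOUBLING_PERIOD_YEARS."""
        return base_density * 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

    if __name__ == "__main__":
        base = 100e6  # hypothetical baseline: 100 million transistors per mm^2
        for offset in (2, 4, 10, 20):
            print(f"+{offset:2d} years: ~{projected_density(base, offset):.2e} transistors/mm^2")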
Declaring that Moore's party is over is a constant theme. But there are massive $$ incentives to maintain it, and it is likely to continue in some form or another, whether in terms of power consumption, multiple cores, specialized instructions, or something else.

I figure that chips will become much more 3D, almost like cubes instead of flat rectangles, with just enough free space to allow for cooling of one sort or another.

Graphene and other substrates that allow for faster chips at cooler temperatures are also a good bet for future chip improvements.