I'm going to say something heretical here: I don't think entrepreneurs should base their companies on the latest advances in computer science. The problem is that many of them are truly bleeding edge, and even the researchers themselves don't know all of their implications. *Nobody* has a clue where they might lead, or whether anyone will ever find them useful.

Instead, look for the stuff that came out of academia 20 years ago but was rejected as infeasible, useless, or just plain idiotic. Then keep an eye on economic trends that change the assumptions that made those discoveries useless. If you keep a large enough set of rejected technologies and a large enough set of economic changes in mind, eventually you'll find a match between them.

Some examples:

The architecture, performance, and programming techniques of early microcomputers mimicked 1950s mainframe technology. Many of the features of PC OSes were considered incredibly backwards at the time: no multitasking, segmented memory, assembly-language coding. Yet this primitive hardware cost perhaps 1/10,000th of a 1950s mainframe and fit on a desk. That opened up a whole new market, one willing to put up with buggy software, single-tasking, and limited functionality.

Java consists mostly of poor implementations of ideas from the 1960s and 1970s. Garbage collection was invented in 1960, object orientation in the late 1960s, monitors in the mid-1970s, and virtual machines in the early 1970s. Yet Java targeted the PC, Internet, and embedded-device markets, which had previously been limping along with C/C++ and assembly. To those markets these innovations were new, and device performance was just barely reaching the point where they became feasible.

Hypertext was invented around 1960; actually, you could argue that Vannevar Bush came up with the concept in 1945. But there was no easy physical way to put together large amounts of information, so Ted Nelson's Xanadu project went nowhere. Fast forward to 1991: the Internet had linked together most research institutions, and PCs were becoming powerful enough to support graphical browsing. When Tim Berners-Lee put the WWW up, there was a ready infrastructure just waiting to be expanded. And the rest is history.

PC video has been around since the early 1990s: I remember recording onto a Mac Centris 660AV in 1993. Flash has been around since the mid-1990s, as has the Internet. Previous attempts to combine them failed miserably. Yet YouTube succeeded in 2005, because several factors in the environment had changed. People had become comfortable sharing things online, and many now had broadband access. Cell-phone video made it easy to record without expensive equipment. And the rise of MySpace and blogs made it simple for people to share the videos they'd created with their friends.