The article has a section on the math used in Google Maps, which points to

http://algo2.iti.kit.edu/schultes/hwy/esaHwyHierarchies.pdf

which says: there are 24 million places in the USA, connected by 29 million roads. You need 4 hours 15 minutes to pre-process this information. From then on, it takes only 7 milliseconds to find the shortest path from one place to another by running the multilevel query algorithm, a souped-up version of Dijkstra's algorithm that runs about 2000 times faster than plain Dijkstra's shortest-path search.

Is that right? 24 million choose 2 is roughly 288 trillion, so do an all-pairs shortest-path computation up front, store the results in a lookup table with 288 trillion entries in HDFS, slap an LRU caching layer atop that, and you wouldn't have to run any graph query algorithm at all, so you should be able to do much better than 7 ms ... just thinking out loud.
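For scale, here's a quick back-of-the-envelope sketch in Python; the 4-bytes-per-entry figure is my own assumption (a single distance value, no stored path), not anything from the paper:

    # Rough size estimate for a precomputed all-pairs lookup table
    # over the US road network described in the paper.
    places = 24_000_000                 # nodes in the US road graph
    pairs = places * (places - 1) // 2  # 24 million choose 2
    print(f"{pairs:,} pairs")           # 287,999,988,000,000 (~288 trillion)

    # Assumption: each entry stores just a 4-byte distance, no path.
    bytes_total = pairs * 4
    print(f"~{bytes_total / 1e15:.1f} PB")  # ~1.2 petabytes

Even storing only a 4-byte distance per pair, with no actual route, lands around a petabyte, which is why it would need something like HDFS in the first place.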