I've gone down a bit of a rabbit hole on pathfinding in the last week or two (most recently; this isn't the first time). When you have some knowledge of the topology of the graph, you can use different techniques to do better than Dijkstra's.<p>Of course, if you have lots of time and space and a completely static graph, you can run all-pairs shortest paths and simply store all the results for O(1) path lookup, but there are intermediate options for various kinds of graphs. This Stack Exchange question is a good overview: <a href="https://cstheory.stackexchange.com/questions/11855/how-do-the-state-of-the-art-pathfinding-algorithms-for-changing-graphs-d-d-l" rel="nofollow">https://cstheory.stackexchange.com/questions/11855/how-do-th...</a>.<p>I've been wondering how well D* lite would perform in practice on a somewhat varying graph. I've read suggestions that if the graph changes even a little on occasion, it mostly degrades to A*, since many changed paths would need to be re-evaluated.<p>In the context of games, I've also been thinking about a technique called true distance heuristics (TDH), where you precompute the distances between some fixed set of nodes and then use those as part of the heuristic for A* (or D* lite in this case). It seems like updating the TDH on a changing graph might introduce as much overhead as not having them in the first place. It could be an interesting trade-off, though: if you have some "lines" (think train lines) that are much faster than roadways, you could handle each of them specially via the TDH, and in exchange you could assume a lower "max speed" in the A* heuristic, letting you explore fewer paths (with a lower "max speed", estimated path costs rise more quickly). If you instead had to assume every car-based path could move as fast as a train, you would have to explore more paths.
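A minimal sketch of that kind of heuristic stacking, interpreting TDH as landmark-style lower bounds layered onto A*. Everything here is a hypothetical interface, not any particular library: `neighbors` and `cost` describe the graph, `node_pos`/`max_speed` give the straight-line part of the heuristic, `landmark_dists` holds the precomputed true-distance tables, and edge costs are assumed to be travel times.

    import heapq
    import itertools
    import math

    def landmark_heuristic(node, goal, landmark_dists):
        # Triangle-inequality lower bound: d(node, goal) >= |d(L, goal) - d(L, node)|
        # for any landmark L whose true distances were precomputed.
        best = 0.0
        for dist_from_L in landmark_dists:      # each is a dict: node -> true distance
            best = max(best, abs(dist_from_L[goal] - dist_from_L[node]))
        return best

    def a_star_tdh(neighbors, cost, node_pos, max_speed, landmark_dists, start, goal):
        # Heuristic = max of (straight-line distance / assumed max_speed) and the
        # landmark (TDH-style) bound; both are admissible lower bounds, so the max is too.
        def h(n):
            straight = math.dist(node_pos[n], node_pos[goal]) / max_speed
            return max(straight, landmark_heuristic(n, goal, landmark_dists))

        counter = itertools.count()             # tie-breaker so the heap never compares nodes
        open_heap = [(h(start), next(counter), start)]
        g = {start: 0.0}
        came_from = {}
        while open_heap:
            _, _, n = heapq.heappop(open_heap)
            if n == goal:
                path = [n]
                while n in came_from:
                    n = came_from[n]
                    path.append(n)
                return list(reversed(path))
            for m in neighbors(n):
                g_m = g[n] + cost(n, m)
                if g_m < g.get(m, float("inf")):
                    g[m] = g_m
                    came_from[m] = n
                    heapq.heappush(open_heap, (g_m + h(m), next(counter), m))
        return None                             # goal unreachable

The tighter the landmark bound relative to the straight-line bound, the fewer nodes A* has to expand.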
> <i>Our universal optimality result reveals a surprisingly clean interplay between this property and Dijkstra’s algorithm: Any heap with the working set property enables the algorithm to efficiently leverage every structural attribute of the graph it operates on, to the fullest extent that any comparison-based algorithm possibly can.</i><p>That last bit makes me wonder: what would a shortest-path algorithm <i>without</i> comparisons look like? Are there "radix sort"-style approaches to shortest-path algorithms that surpass comparison-based ones, or something?
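For what it's worth, the closest thing I know of is Dial's algorithm: Dijkstra with a bucket queue in place of a comparison-based heap, which works when edge weights are small non-negative integers. A simplified sketch (hypothetical `adj` mapping node -> list of (neighbor, weight) pairs):

    import collections

    def dial_shortest_paths(adj, source, max_edge_weight):
        # Dial's algorithm: distances are settled by scanning buckets in index
        # order (radix-style) instead of comparing keys in a heap. Requires
        # non-negative integer edge weights.
        INF = float("inf")
        dist = {u: INF for u in adj}
        dist[source] = 0
        buckets = collections.defaultdict(collections.deque)
        buckets[0].append(source)
        max_dist = max_edge_weight * (len(adj) - 1)   # upper bound on any finite distance
        for d in range(max_dist + 1):
            while buckets[d]:
                u = buckets[d].popleft()
                if d > dist[u]:
                    continue                          # stale entry, already settled closer
                for v, w in adj[u]:
                    if d + w < dist[v]:
                        dist[v] = d + w
                        buckets[d + w].append(v)
        return dist

The classic formulation uses a circular array of max_edge_weight + 1 buckets; this flat version just makes the idea easier to see.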
"Universal Optimality of Dijkstra via Beyond-Worst-Case Heaps" (2024)
<a href="https://arxiv.org/abs/2311.11793" rel="nofollow">https://arxiv.org/abs/2311.11793</a> :<p>> Abstract: <i>This paper proves that Dijkstra's shortest-path algorithm is universally optimal in both its running time and number of comparisons when combined with a sufficiently efficient heap data structure.</i><p>Dijkstra's algorithm: <a href="https://en.wikipedia.org/wiki/Dijkstra%27s_algorithm" rel="nofollow">https://en.wikipedia.org/wiki/Dijkstra%27s_algorithm</a><p>NetworkX docs > Reference > Algorithms > Shortest Paths:
<a href="https://networkx.org/documentation/stable/reference/algorithms/shortest_paths.html" rel="nofollow">https://networkx.org/documentation/stable/reference/algorith...</a><p>networkx.algorithms.shortest_paths.weighted.dijkstra_path: <a href="https://networkx.org/documentation/stable/reference/algorithms/generated/networkx.algorithms.shortest_paths.weighted.dijkstra_path.html" rel="nofollow">https://networkx.org/documentation/stable/reference/algorith...</a> <a href="https://github.com/networkx/networkx/blob/main/networkx/algorithms/shortest_paths/weighted.py">https://github.com/networkx/networkx/blob/main/networkx/algo...</a><p>/? Dijkstra manim: <a href="https://www.google.com/search?q=dijkstra%20manim" rel="nofollow">https://www.google.com/search?q=dijkstra%20manim</a>
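Minimal usage sketch for that function (tiny made-up graph):

    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([("a", "b", 1.0), ("b", "c", 2.0), ("a", "c", 5.0)])

    nx.dijkstra_path(G, "a", "c")          # ['a', 'b', 'c']
    nx.dijkstra_path_length(G, "a", "c")   # 3.0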
In most real situations a graph is likely to be either a model with some expected characteristics or data describing an actual system. Either way, with modern computing it seems like in many cases using machine learning to predict the path, or the next steps along it, might end up performing better. The open questions are how much data and modeling are available, and how the cost of that training and prediction should be accounted for in any final comparison.
This came up for me not long ago. A* is a heuristic-guided extension of Dijkstra's and the canonical "pathfinding algorithm" for game development. A* is good for finding how to get from a specific point A to a specific point B. But I wanted to know how to get from any point A to a list of point Bs, and it turns out the extra work Dijkstra's does, which A* skips, is exactly the work you want for that. It's also cacheable, which is incredible in the modern era of having basically infinite memory for this sort of thing.
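A rough sketch of that pattern, assuming an undirected graph exposed through hypothetical `neighbors` and `cost` callables: seed Dijkstra with every goal at distance 0, then cache the resulting maps.

    import heapq

    def distance_field(neighbors, cost, goals):
        # Multi-source Dijkstra: seed the queue with every goal at distance 0.
        # dist[n] is then the cost from n to its nearest goal, and following
        # next_step[n] repeatedly walks n toward that goal. Both maps can be
        # cached until the graph changes.
        dist = {g: 0.0 for g in goals}
        next_step = {g: None for g in goals}
        heap = [(0.0, id(g), g) for g in goals]   # id() tie-break avoids comparing nodes
        heapq.heapify(heap)
        while heap:
            d, _, n = heapq.heappop(heap)
            if d > dist[n]:
                continue                          # stale entry
            for m in neighbors(n):                # assumes undirected edges (or pre-reversed)
                nd = d + cost(n, m)
                if nd < dist.get(m, float("inf")):
                    dist[m] = nd
                    next_step[m] = n              # from m, stepping to n heads toward a goal
                    heapq.heappush(heap, (nd, id(m), m))
        return dist, next_step

Any agent anywhere on the map can then route to the nearest goal by chaining next_step lookups, with no per-query search at all.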