Hmm. I'm not an expert, but some of this definitely seems inaccurate. Some of the "Bullshit" may turn out to be quite important.

Take the statement:

> Markov Chain is essentially a fancy name for a random walk on a graph

Is that really true? I definitely don't think so. To my understanding, a Markov process is a stochastic process with the additional (aka "Markov") property that it is "memoryless": to predict the next state you only need to know the current state, not any of the history. It becomes a Markov chain when the process is discrete rather than continuous.

Lots of questions about random walks on graphs fit this definition. Say you have a graph, fix a starting node, and ask: "walk 5 nodes at random from this starting node; what is the probability that you end up at a specific end node?" This is Markov: at any point, to predict what happens next you only need to know the node you are on now (see the first sketch below).

But lots of questions about random walks on graphs do not have this property. For example, take the exact same setup as the previous example, a graph with a starting node and a target node, and ask: "walk n nodes at random from the starting node; what is the probability that at some point you visit the target node?" Now I have introduced a dependency on the history: whether I have already visited the target cannot be read off from the node I am currently standing on, so "current node" alone is no longer a memoryless description of the problem. (Strictly speaking, the walk itself is still a Markov chain; it is the hitting question that is path-dependent. You can recover the Markov property by enlarging the state to the pair (current node, have-I-visited-the-target flag), but needing that enlargement is precisely the distinction the "fancy name" framing glosses over.)

An example of Markov vs non-Markov behaviour in real life: if I have a European option on a stock, I only care about the stock price at the expiry date. But if I have a barrier option, or my option has knock-in/knock-out/autocallable features, then there is path dependence: I care about whether at any point along its trajectory the price hit the barrier level, not just the price at the end. So the spot price alone is not enough state to value the barrier option; you also have to carry something like the running maximum of the price along with it (see the second sketch below).
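First sketch: to make the contrast concrete, here is a minimal Python simulation. The toy graph, node labels, and trial counts are all made up for illustration. It estimates both quantities from the same walks: the probability of ending at the target after n steps (answerable from the final state alone) and the probability of ever visiting the target (which needs the whole path, or equivalently extra state).

    import random

    # Toy adjacency list; the graph and labels are arbitrary.
    GRAPH = {
        "A": ["B", "C"],
        "B": ["A", "C", "D"],
        "C": ["A", "B", "D"],
        "D": ["B", "C"],
    }

    def walk(start, n_steps, rng):
        """Simple random walk: the next node depends only on the current node."""
        path = [start]
        node = start
        for _ in range(n_steps):
            node = rng.choice(GRAPH[node])
            path.append(node)
        return path

    def estimate(start, target, n_steps, n_trials=100_000, seed=0):
        rng = random.Random(seed)
        end_hits = 0   # walk ends at target: a function of the final state
        ever_hits = 0  # walk ever visits target: a function of the whole path
        for _ in range(n_trials):
            path = walk(start, n_steps, rng)
            end_hits += path[-1] == target
            ever_hits += target in path
        return end_hits / n_trials, ever_hits / n_trials

    p_end, p_ever = estimate("A", "D", n_steps=5)
    print(f"P(end at D after 5 steps) ~ {p_end:.3f}")
    print(f"P(visit D within 5 steps) ~ {p_ever:.3f}")

Note that the first probability could be computed without simulating any paths at all, just by repeatedly applying the transition matrix, because the current node is the whole state. For the second you would first have to enlarge the state, e.g. make the target node absorbing or track a visited flag.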
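Second sketch, for the options example, with the same caveats: this is a toy Monte Carlo with made-up numbers (a symmetric binomial price move rather than a proper model, no discounting, hypothetical strike and barrier levels), purely to show where the path dependence enters. The European payoff reads only the terminal price; the up-and-out barrier payoff also needs the running maximum, i.e. state beyond the price at expiry.

    import random

    def simulate_path(s0, n_steps, up, down, rng):
        """One toy price path; return the terminal price and its running maximum."""
        s, s_max = s0, s0
        for _ in range(n_steps):
            s *= up if rng.random() < 0.5 else down
            s_max = max(s_max, s)
        return s, s_max

    def price(s0=100.0, strike=100.0, barrier=120.0, n_steps=50,
              up=1.01, down=0.99, n_trials=200_000, seed=0):
        rng = random.Random(seed)
        euro = knock_out = 0.0
        for _ in range(n_trials):
            s_T, s_max = simulate_path(s0, n_steps, up, down, rng)
            payoff = max(s_T - strike, 0.0)  # call payoff at expiry
            euro += payoff                   # European: depends on s_T only
            if s_max < barrier:              # up-and-out: needs the whole path
                knock_out += payoff
        return euro / n_trials, knock_out / n_trials

    e, ko = price()
    print(f"European call ~ {e:.3f}, up-and-out barrier call ~ {ko:.3f}")

Valuing the barrier option with Markov-style machinery is still possible, but only after you carry (spot, running maximum) as the state, which is the same trick as the visited flag in the first sketch.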