TechEcho

© 2025 TechEcho. All rights reserved.

Frequentism and Bayesianism: A Practical Introduction

93 points by atakan_gurkan over 10 years ago

10 comments

thebear over 10 years ago
It is perhaps worth drawing attention to this sentence in the article: "Though Bayes' theorem is where Bayesians get their name, it is not this law itself that is controversial, but the Bayesian interpretation of probability implied by the term P(F_true | D)." A widespread misunderstanding is that there is something fundamentally Bayesian about Bayes' theorem, or even that frequentists don't believe in it. It is rarely pointed out that this is not the case, and we should thank the authors for doing so.
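To underline the point that Bayes' theorem itself is uncontroversial: here is a use of it that any frequentist accepts, since every quantity is a long-run frequency rather than a degree of belief. The screening-test numbers are hypothetical, chosen only for illustration.

```python
# A frequentist-acceptable use of Bayes' theorem: all probabilities below
# are long-run frequencies in a population, not degrees of belief.
# Hypothetical numbers: a test with 99% sensitivity and 95% specificity,
# applied where 1% of the population has the condition.
p_disease = 0.01
p_pos_given_disease = 0.99   # sensitivity
p_pos_given_healthy = 0.05   # 1 - specificity

# P(+) by the law of total probability
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | +)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.167
```

Despite the accurate test, a positive result still means only about a one-in-six chance of disease, because the condition is rare; no "Bayesian interpretation" is needed anywhere in the calculation.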
atakan_gurkan over 10 years ago
The follow-ups are also well worth reading:

http://jakevdp.github.io/blog/2014/06/06/frequentism-and-bayesianism-2-when-results-differ/

http://jakevdp.github.io/blog/2014/06/12/frequentism-and-bayesianism-3-confidence-credibility/

http://jakevdp.github.io/blog/2014/06/14/frequentism-and-bayesianism-4-bayesian-in-python/
lutusp over 10 years ago
Readers should be aware that the linked article was composed entirely in the IPython notebook environment, which means Python code blocks, LaTeX renderings, and graphics can all be freely mixed in a (to me) very nice, readable article format.

http://ipython.org/
graycat over 10 years ago
Here's how I relax, avoid both frequentism and Bayesianism, and just love probability:

I assume that there is a non-empty set, commonly called Omega, which I regard as the set of all experimental 'trials' that I might observe. But, actually, in all the history of everything in the universe, we see only one trial, only one element of this set Omega.

Next, there is a non-empty collection, usually denoted by script F, of subsets of Omega. I assume that script F contains Omega as an element and is closed under relative complements and countable unions. By *relative complement*: suppose A is an element of script F. Then the *relative complement* of set A, maybe written A^c, is essentially Omega - A, that is, the set of all trials in Omega and not in A. Then script F is a sigma-algebra. Each set A in script F is an *event*. If our trial is in set A, then we say that event A has *occurred*.

Next there is a function P: script F --> [0, 1]. P assigns 0 to the empty set (event) and is countably additive. Then function P is a *probability measure*. So for each event A in script F, P(A) is a number in [0, 1] and is the *probability* of event A.

Now we can define what it means for two events to be *independent*, and can generalize to two sigma-algebras being independent.

Next, on the set R of real numbers, I consider the *usual topology*, that is, the collection T of open subsets of R. Then I let B, the *Borel sets*, be the smallest sigma-algebra such that T is a subset of B.

Next I consider a function X: Omega --> R such that for each Borel set A, X^{-1}(A) is an element of script F. Then X is a *random variable*.

Essentially anything that can have a numerical value we can regard as a random variable.

Then we can state and prove the classic limit theorems: central limit theorem, weak and strong laws of large numbers, martingale convergence theorem, law of the iterated logarithm, etc.

Now we are ready to do applied probability and statistics. And we have never mentioned either frequentism or Bayesianism.

For more details, in an elegant presentation, see J. Neveu, *Mathematical Foundations of the Calculus of Probability*.
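The limit theorems this setup licenses can be illustrated numerically. A minimal sketch of the law of large numbers, with a Bernoulli(0.3) random variable assumed purely for concreteness: the sample mean of n independent draws settles toward E[X] = 0.3 as n grows, which is exactly the statement the measure-theoretic machinery makes precise ("with probability 1").

```python
import random

random.seed(0)

def sample_mean(n, p=0.3):
    """Mean of n i.i.d. Bernoulli(p) draws."""
    return sum(random.random() < p for _ in range(n)) / n

# The deviation from 0.3 shrinks as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, sample_mean(n))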
eli_gottlieb over 10 years ago
Frequentists and Bayesians care about two different likelihood functions:

* Frequentists care about p(evidence | parameters), and interpret probability as a measure over subsets of the counterfactual set of repeated trials (usually independent and identically distributed) produced by their model.

* Bayesians care about p(parameters | evidence), and interpret probability as "belief" or "propensity to bet". This is, of course, philosophically ridiculous, since they proceed to ground *rational* belief *in* Bayesian statistics. What they are really doing is exactly what their likelihood function says: taking a measure over subsets of the counterfactual set of possible worlds which could have produced their evidence.

The frequentists have the advantage of their methods being more computationally tractable. The Bayesians have the advantages of intuitiveness and of yielding more accurate inferences from the same limited data sets. Pick the tool you need and remember what you're taking a measure over!
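The two targets can be contrasted on the simplest possible data set. A sketch, with a coin-flip example assumed for illustration: the frequentist maximizes p(evidence | parameter), while the Bayesian summarizes p(parameter | evidence) under a flat Beta(1, 1) prior (for binomial data, the conjugate posterior is Beta(k + 1, n - k + 1)).

```python
# Same data, two questions. Observed: k heads in n coin flips.
k, n = 7, 10

# Frequentist: maximize p(evidence | parameter) over the parameter.
# For binomial data the maximum-likelihood estimate is k/n.
mle = k / n

# Bayesian: p(parameter | evidence) under a flat Beta(1, 1) prior.
# The posterior is Beta(k + 1, n - k + 1), with mean (k + 1) / (n + 2).
posterior_mean = (k + 1) / (n + 2)

print(mle, posterior_mean)  # 0.7 vs 0.666...
```

The small disagreement (0.7 vs roughly 0.667) is the prior pulling the estimate toward 1/2; the two answers coincide as n grows.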
jmount over 10 years ago
Nice article. In this direction, my group has been trying to teach that you generally need to be familiar with both frequentist and Bayesian thought (you can't always choose one or the other: http://www.win-vector.com/blog/2013/05/bayesian-and-frequentist-approaches-ask-the-right-question/ ) and that Bayesianism only appears to be the more complicated of the two ( http://www.win-vector.com/blog/2014/07/frequenstist-inference-only-seems-easy/ ).
afafsd over 10 years ago
I don't understand why Bayesian statistics needs to be an "-ism", and still less why other statistics needs to be an "-ism" too. I don't understand why people feel the need to line up on one side or the other, or get so worked up about it. Other branches of mathematics seem to avoid this kind of thing; they have no problem with the idea that there are different ways of doing the same thing.

It actually discourages me from learning more about Bayesian statistics, because the whole thing sometimes comes off as a cult.
Tarrosion over 10 years ago
What is the theoretical justification for taking a completely flat prior? "If we set the prior P(F_true) ∝ 1 (a flat prior),"

There's no probability distribution which is constant over the whole real line. Is the idea that we can pick a distribution which is constant over an arbitrarily large (but finite) interval around the observed data, and so in practice we may get results arbitrarily close to those given?
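That reading can be checked numerically. A sketch under assumed toy numbers (one observation x = 2.0 with known sigma = 1): take a prior flat on [-L, L], normalize the posterior on a grid, and watch the posterior mean converge to the improper-flat-prior answer (here, x itself) as L grows.

```python
import math

x, sigma = 2.0, 1.0  # hypothetical: one Gaussian observation, known sigma

def posterior_mean(L, steps=200_000):
    """Posterior mean of mu under a prior flat on [-L, L], via midpoint rule."""
    dm = 2 * L / steps
    num = den = 0.0
    for i in range(steps):
        mu = -L + (i + 0.5) * dm
        # Unnormalized posterior = likelihood, since the prior is constant.
        w = math.exp(-0.5 * ((x - mu) / sigma) ** 2)
        num += mu * w
        den += w
    return num / den

# Truncation matters for small L, then stops mattering.
for L in (2.5, 5.0, 50.0):
    print(L, posterior_mean(L))
```

Once [-L, L] comfortably covers the region where the likelihood has mass, enlarging L further changes nothing measurable, which is the usual informal justification for writing the improper flat prior directly.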
droob over 10 years ago
"37 Ways to More Accurately Read the Bones You're Casting to Predict the Harvest"
adrianbg over 10 years ago
Does anyone know how to make the formulas render properly? Even using the IPython notebook viewer hasn't helped.