This is a shoddy analysis.<p>1. There's a statistical gadget specifically for doing this—a "scoring rule" [1] which is a principled way to compare different probabilistic predictions. A bunch of scatterplots of random quantities against each other are... not that.<p>By comparing only binary win/loss predictions instead of probabilities, like in the first chart, you throw away <i>almost all information</i> contained in the probabilistic estimates—if Democrats win a state, there's no bonus for predicting (say) 95% Dem instead of 55% Dem.<p>It's plausible that 538 would actually win under a proper scoring rule, because betting markets were underconfident (relative to 538) in deep Dem/Rep states (predicting e.g. <95% Dem win in VT, vs 538's >99%). [2]<p>2. The calibration analysis assumes that different states' win/loss outcomes are independent, but that's really untrue: 538's predictions were specifically <i>not</i> independent because they assumed polling errors were correlated between states.<p>3. Many of the other scatterplots look outlier-driven and don't include r^2 or p-values. With so few datapoints, it's unclear if they are meaningful at all.<p>[1]: <a href="https://en.wikipedia.org/wiki/Scoring_rule" rel="nofollow">https://en.wikipedia.org/wiki/Scoring_rule</a><p>[2]: Maybe we should cut prediction markets some slack here because liquidity constraints make them inaccurate for small probabilities. If that's the article's position, though, they should address this instead of just... not using a scoring rule.
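<p>To make the point concrete, here's a toy Brier-score comparison (one common proper scoring rule). All the numbers are invented for illustration, not taken from 538 or the markets:<p><pre><code># Toy Brier-score comparison of two probabilistic forecasters.
# Lower is better; confident predictions that turn out right are rewarded.
def brier_score(probs, outcomes):
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

outcomes     = [1, 1, 0]             # 1 = Dem win, 0 = Rep win (hypothetical states)
model_probs  = [0.99, 0.85, 0.35]    # a 538-style model, confident in the safe state
market_probs = [0.90, 0.80, 0.35]    # a market underconfident in the safe state

print(brier_score(model_probs, outcomes))    # roughly 0.048
print(brier_score(market_probs, outcomes))   # roughly 0.058
</code></pre><p>Both sets of forecasts call every state correctly, so a binary win/loss scatterplot can't distinguish them; the scoring rule can.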
Why PredictIt isn't efficient: smart money is restricted in the amount it can place on any bet. This leaves plenty of inefficiencies in PredictIt. One time, I bought all the contracts saying that each candidate for the Democratic nomination would lose, and it paid off instantaneously. Easiest $50 I ever made.
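<p>For anyone wondering how that kind of trade works, here's a rough sketch with made-up prices (PredictIt's fees ignored): in a winner-take-all nominee market exactly one candidate wins, so exactly one NO contract loses and every other NO pays $1.<p><pre><code>no_prices = [0.60, 0.70, 0.75, 0.80, 0.85]   # hypothetical NO prices for five candidates

cost   = sum(no_prices)        # $3.70 to buy NO on everyone
payout = len(no_prices) - 1    # $4.00 no matter who wins (every NO pays except the winner's)

print(payout - cost)           # $0.30 of risk-free profit at these prices
</code></pre><p>The trade only exists when the YES prices sum to comfortably more than $1, which is exactly the kind of inefficiency the betting caps leave lying around.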
Spent a lot of time/money this election cycle betting on the spread between PredictIt and 538. I acted more conservatively by only making a bet when it made sense given all 3 versions of 538's model.<p>ROI around 12% (not including the 5% withdrawal fee). I expect it to go a few points higher given that called elections are still trading at 90c, but PredictIt won't close those markets due to ongoing litigation.
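<p>The kind of check I mean before each bet looks roughly like this (numbers are hypothetical; PredictIt's 10% fee on profits is included, the 5% withdrawal fee is not):<p><pre><code>FEE = 0.10   # PredictIt's cut of profits on a winning contract

def expected_value(model_prob, price):
    # EV per share of buying YES at `price` when the model says `model_prob`
    win_profit = (1 - price) * (1 - FEE)   # profit if the event happens
    return model_prob * win_profit - (1 - model_prob) * price

# e.g. a 538-style model says 80%, the market sells YES at 70 cents
print(expected_value(0.80, 0.70))   # about +7.6 cents per share
</code></pre>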
Wait, what? Are you arguing the prediction markets worked? And what about this then? <a href="https://i.imgur.com/247IJhe.jpg" rel="nofollow">https://i.imgur.com/247IJhe.jpg</a> This was hours after NE-2 was called -- and at that moment Biden won, make no mistake -- and people were <i>still</i> betting against Biden very strongly so I thought I could make a statement here. I wanted to avoid people telling me later "hindsight is always 20/20" so I put my money where my mouth is.
Just a second though, betting markets are intrinsically subject to manipulation! Both in the markets themselves and in the world that the gamblers are purportedly "observing". The bettors have active agency in the world they are betting on. The only thing stopping people who participate in these "markets" from manipulating outcomes in unethical, coercive ways is: nothing at all. The betting markets only serve to raise the stakes and create a medium for perverse incentives to play out. So yeah, in some cases they might be more "accurate", but what about the risks? At what cost?
We like to assume that markets are efficient because if not, there's an opportunity for arbitrage, but for betting markets, this just isn't true. PredictIt, the most often referenced one, has a fee structure that makes it horribly inefficient: they take 10% of all winnings on a given bet and 5% of all withdrawals from the site. The latter actually isn't as big of a deal, but the 10% on winnings means that I can't make any money off of the spread when you have, for example, a market where Biden has an 88% chance of winning and Trump a 15% chance (which are the odds on the site as of right now). In a market without fees, I'd buy a Trump NO for 85 cents and a Biden NO for 12 cents. If Biden wins, I make 15 cents on the first bet and lose 12 cents on the second. If Trump wins, I lose 85 cents on the first bet and make 88 on the second. Either way, I'm up 3 cents. But with a 10% fee on winnings, I make 1.5 cents if Biden wins and I actually lose almost 6 cents if Trump wins. Then I pay another 5% when I try to withdraw my money (which isn't as bad of a fee, because in theory, I would just do a bunch of arbitrage to grow my pile fee-free and then withdraw at the end).<p>What ends up happening is that people on these sites don't get any value from betting on events that are 90%+ likely. Which is why the Trump/Biden market even puts Trump's chances as high as they are. I personally think Trump's chances of winning are less than 1%, but I'd lose money if I bet against him. Meanwhile, some of his biggest fans may not actually think that he has as high as a 15% chance of winning, but if they win, they get a 600% return, so what's the difference.
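<p>Writing that arithmetic out (same prices as above, 10% fee on profits, withdrawal fee ignored):<p><pre><code>FEE = 0.10
trump_no, biden_no = 0.85, 0.12   # NO prices implied by Trump 15% / Biden 88%

def net(winning_no, losing_no):
    # the winning NO pays $1; the fee applies only to its profit
    return (1 - winning_no) * (1 - FEE) - losing_no

print(net(trump_no, biden_no))   # Biden wins: about +1.5 cents
print(net(biden_no, trump_no))   # Trump wins: about -5.8 cents
</code></pre>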
Anybody who follows prediction markets will probably see that this doesn't mean prediction markets are any good, just that polls are bad. The valuations in prediction markets are simply a summary of what's been in the press over the past two hours or so. If the press is terrible, the prediction markets are equally terrible.
If anyone followed the betting odds on election night: at one point the prediction markets swung from heavy Biden favorite to heavy Trump favorite as he won Florida and appeared to be ahead in a lot of the swing states. But for weeks pundits had been saying it might look like Trump was ahead in those states early on, because conservatives were more likely to vote in person on election day and Democrats to vote by mail, and that turned out to be true.<p>If these markets were actually good predictors, I don't think the wild swings we saw overnight would have happened.
By the end of election day, Trump was up on PredictIt ~60/40. This tells me that the bettors have no idea what they're doing. While Trump was leading in swing states, most metropolitan areas had only counted a small fraction of ballots, so the early totals had very little statistical value.<p>Also, there are often tiny arbitrages where buying "no" for Biden would cost a few cents less than buying "yes" for Trump. In theory, you can make a guaranteed 1% return by betting no on both Trump and Biden, and at one point it was as high as 5%. You can't take advantage of this arb due to fees on PredictIt, but you'd think rational investors would at least try to keep it even.
Prediction markets seem to be doing a really terrible job at predicting an election that already happened...<p>Right now predictit.org has Biden winning at 89% and Trump winning at 15% [1].<p>But if you look at it by "electoral margin of victory" and add up all the Republican win margins, you get that Republicans have a 28% chance of winning and Democrats have a 95% chance. [2]<p>This is two weeks after an election that is pretty clearly a Joe Biden/Democratic victory. These markets may get things right, but they also get things really, really wrong.<p>[1] Yes, that adds up to over 100. You could bet against both for a guaranteed profit; however, I believe the markets aren't allowing new traders, and there are also fees and counterparty risk to consider.
[2] <a href="https://www.predictit.org/markets/detail/6653/What-will-be-the-Electoral-College-margin-in-the-2020-presidential-election" rel="nofollow">https://www.predictit.org/markets/detail/6653/What-will-be-t...</a>
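<p>For anyone who wants to reproduce the [2] numbers: the implied win probabilities come from summing the YES prices of each party's margin buckets. The bucket prices below are stand-ins for illustration, not the live market:<p><pre><code>margin_buckets = {   # hypothetical YES prices by electoral-margin bucket
    "GOP 280+": 0.05,   "GOP 210-279": 0.08, "GOP 150-209": 0.07,
    "GOP 60-149": 0.05, "GOP 10-59": 0.03,
    "Dem 10-59": 0.06,  "Dem 60-149": 0.30,  "Dem 150-209": 0.24,
    "Dem 210-279": 0.15, "Dem 280+": 0.20,
}

gop = sum(p for name, p in margin_buckets.items() if name.startswith("GOP"))
dem = sum(p for name, p in margin_buckets.items() if name.startswith("Dem"))
print(round(gop, 2), round(dem, 2))   # 0.28 and 0.95, a combined 123%, which is the problem
</code></pre>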
There are <i>many</i> other things you have to remember when interpreting prediction market data. First, the rules of the contract may be slightly different than your intuitive interpretation based on the market title. Second, you need to consider how long to expiry and overall market volume: lots of times the shares will sit at otherwise strange odds because no one wants to park their money somewhere for so long, or because there isn't liquidity. Third, you have to consider the nature of the market itself: any gambling outfit can tell you how what people <i>want</i> to happen will bias the wagers. And fourth, you are looking at a very restricted view; for instance, right now Donald Trump has a 13% chance of winning the electoral college on PredictIt.<p>A study over a longer period of time would tell us more. I've followed PredictIt for US politics fairly closely for about 5 years, and there are certainly lots of spurious prices. But it's also true that polls can be wild outliers. Since both methodologies are at least in the same ballpark as far as accuracy, I think you need a bigger, more rigorous dataset to make any kind of claim. I believe someone did that with the Iowa Electronic Markets back in the day, but I haven't read it.
While I am an obsessive fan of 538 (I've even ordered a Fivey Fox hoodie), I'm a little amused at Nate talking down prediction markets.<p>It seems to me that if he thinks his predictions are that much better, he should be massively personally invested in those markets, using his better knowledge to clean up.
False. Betting markets favored a Clinton victory in 2016. [1]<p>[1] <a href="https://www.realclearpolitics.com/elections/betting_odds/2016_president/" rel="nofollow">https://www.realclearpolitics.com/elections/betting_odds/201...</a>
This lacks a strong causal analysis. We still don't understand why polls are consistently undershooting Trump support. Given the performance of polls in 2018, it seems highly correlated with Trump's presence on the ballot, and it's unclear if it will recur in a future presidential election. Ascribing better analysis to markets than polls requires us to ask "how?", and there isn't a clear mechanism.
>You also see betting markets are generally more Republican-leaning in ways that were ultimately born out<p>I think the first half of this is the money quote. The markets are definitely more Republican (or maybe just Trump?) leaning. I'm not convinced that the second half being true isn't just a coincidence.<p>IMO the analysis about how PredictIt bettors followed 2016 polling trends seems to be ascribing them an excess of rationality. I think if you mirror that S-curve diagonally you can see it: the betting markets put really high odds on Trump in states like New Hampshire and Minnesota (which went 55% to Biden but still had ~25% odds of going red). If you look at the mirroring red states which went 55% to Trump, there were no bettors expecting a blue wave (Missouri and South Carolina had ~5% odds of going blue).<p>When you look at markets like "Who Will Win California" or "Who Will Win the Popular Vote", I'm convinced that many people making bets are delusional.
Note that PredictIt gives Biden an 89% chance of winning the electoral college <i>now</i> and Trump 14% (why do the numbers never add up to 100%?). The election was two weeks ago. Moreover, I think the author is cherry-picking his data. Nevertheless, I did use PredictIt during the election as a kind of "nowcast".
Prediction markets might well have some big players trying to hedge.<p>For example, imagine you are a carmaker. You donate to politician A so he makes carmaker-friendly policies if he wins. Politician B refuses your donation, so instead you bet that he'll win.<p>You size the donation and the bet so that whatever the outcome, your car company will make the same amount next year: either through friendly laws increasing sales, or by winning the bet.<p>You've now removed political uncertainty from your business, and you can focus on designing the best cars.
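<p>A toy version of the sizing (all numbers invented): buying "B wins" shares at price p pays $1 per share if B wins, so the stake that equalizes the two outcomes is just the policy upside times the share price.<p><pre><code>policy_upside = 10_000_000   # extra profit from friendly policy if A wins (invented)
p = 0.40                     # market price of a "B wins" share, in dollars (invented)

# Outcome A: policy_upside - stake          (the bet is lost)
# Outcome B: stake * (1 - p) / p            (profit on the winning shares)
# Setting the two equal gives stake = policy_upside * p.
stake = policy_upside * p

print(policy_upside - stake)    # net if A wins: 6,000,000
print(stake * (1 - p) / p)      # net if B wins: 6,000,000
</code></pre>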
Betting markets <i>currently</i> give Trump a 10% chance of winning this election. On election night, most betting markets had Trump as a heavy favorite (75-80% in many places), which in retrospect seems severely miscalibrated.<p>This year betting markets were swamped, and continue to be swamped, by money supporting Trump. This year also happened to have a polling error that underestimated Trump's chances. Those two facts aligned to form a situation where the betting markets seemed to predict the election results better than the polls did. However, based on one very strange election I wouldn't extrapolate that betting markets are always better than polls.
Before the election, I applied the "shy Trump voter" theory. I found research suggesting that about 5% of Dem-leaning and 10% of GOP-leaning poll respondents basically lie and say they are undecided. I took the average of polls from 538 and allocated undecided voters in accordance with the numbers above.<p>The result was very well aligned with the actual results and the prediction markets. The adjusted numbers predicted that FL, NC, GA, and AZ would go to Trump and that Biden's margins in WI, MI, and PA would be much thinner than predicted by 538.<p>Outside of GA and AZ that turned out to be on point, and even in GA and AZ Biden's margins were razor thin. Personally I think Biden has to thank Stacey Abrams for Georgia, and I am not sure what happened in AZ; maybe people are more outspoken there than elsewhere.<p>Now, there are also opinions that there is no such thing as a shy Trump voter and my calculation could be just dumb luck.
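<p>In code form, the adjustment I described looks roughly like this (illustrative numbers for a single state, not the actual 538 averages):<p><pre><code>dem_poll, gop_poll, undecided = 50.0, 44.0, 6.0   # hypothetical polling average, in %

dem_hidden = 0.05 * dem_poll    # assumed share of Dem support hiding in "undecided"
gop_hidden = 0.10 * gop_poll    # assumed share of GOP support hiding in "undecided"

# reallocate the undecided column in proportion to the assumed hidden support
dem_adj = dem_poll + undecided * dem_hidden / (dem_hidden + gop_hidden)
gop_adj = gop_poll + undecided * gop_hidden / (dem_hidden + gop_hidden)

print(round(dem_adj, 1), round(gop_adj, 1))   # 52.2 vs 47.8, so the 6-point lead narrows to about 4.3
</code></pre>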
I <i>also</i> feel like "this analysis can be misleading if you ignore election night itself in which the prediction markets were wildly wrong and had Trump's odds of winning as high as 80% while a lot of the models still thought Biden had the edge throughout the entire process."<p>This seems to be a topic the trolls are heavily invested in, no pun intended, judging by the downvote:comment ratio.
Of course they beat polls. Polls are overwhelmingly operated by partisans in big media who are trying to act as an influence on the vote, not trying to figure out which way it's going to go. That's the very obvious reason the polls keep missing in such a humiliatingly outsized way. They don't care that they miss - otherwise they'd do the easy thing and adjust properly for factors like intimidated, scared, or threatened voters - because their purpose is to try to manipulate the vote through propaganda.