I wrote a tweet thread earlier today about one of my current working papers. Here’s the start:
Anyway, the thread is so long it might as well have been a post, so I’ve copied the whole thread below after the jump!
I’m gonna do a tweet thread about the paper to make myself feel better about this! (Also because I still kinda like the paper against all odds)
It’s basically a behavioral model of belief formation and an application to multi-sender persuasion via the example of political spin.
So on one hand it’s going to be about: could there be a common reason why people are weird (sorry, non-Bayesian) when forming beliefs and also why they are so crappy at thinking about probabilities?
And on the other hand: why is political messaging so persistently extreme?
Basically my conjecture is: credulity. If enough people are credulous enough, you’ll get (i) stubborn, biased, non-Bayesian beliefs, (ii) the ‘hockey stick’ overestimation of small probabilities, and (iii) political parties that yell wild stuff with a tiny attention span.
(I think this is also going to end up consistent btw with asymmetric partisan distrust of any media org that attempted to report the maximum likelihood state of the world…)
OK, so here’s a story. You fancy yourself a bit of a painter so you paint pictures. But they don’t sell. People don’t get it, they don’t like it, you don’t win prizes.
So how good of a painter do you think you are?
Think about two stories: the most optimistic one and the most pessimistic one.
Maybe you’re truly an awesome painter and you just haven’t been appreciated yet. Keep at it! You’ll find your audience!
Maybe you’re not actually any good after all. You’ve been found out. Quit now.
So, obviously one of those stories is more plausible, in a maximum likelihood sense, than the other. Both of them could be true, but in this example, the pessimistic story is just a better fit for the data. The optimistic one is less likely.
I model someone who forms their belief by basically taking the average of those two stories but weighted for plausibility. The story that’s less likely to have generated the data gets a penalty. The more skeptical the person is, the bigger the penalty.
(btw this model is 100% inspired by Froeb, @ganglmair, and Tschantz 2016!)
So what? A credulous person ends up believing the less likely story ‘too much’ and a skeptical person believes the more likely story ‘too much’.
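For concreteness, here’s a minimal sketch of that “average the two stories, penalize the less plausible one” idea. The exact functional form in the paper differs; the additive penalty below (and the 0.5 equal-weight baseline, the painter numbers, and the function names) are my own illustration, not the paper’s specification:

```python
# Toy sketch of plausibility-weighted averaging over two stories.
# Assumption: start from equal weights, then shift weight away from the
# less plausible story in proportion to the likelihood gap, scaled by a
# skepticism parameter s. (The paper's actual penalty differs.)

def credulous_belief(theta_opt, theta_pess, lik_opt, lik_pess, s):
    """Belief about the state, averaging two candidate stories.

    theta_*: the state each story implies (e.g. how good a painter you are)
    lik_*:   likelihood of the observed data under each story
    s:       skepticism; 0 = pure 50/50 averaging, larger = harsher
             penalty on the less plausible story
    """
    gap = abs(lik_opt - lik_pess)
    w_less = max(0.0, 0.5 - s * gap)   # penalized weight on the less likely story
    w_more = 1.0 - w_less
    if lik_opt >= lik_pess:
        return w_more * theta_opt + w_less * theta_pess
    return w_less * theta_opt + w_more * theta_pess

def bayes_belief(theta_opt, theta_pess, lik_opt, lik_pess):
    """Posterior mean under a flat prior over the two stories."""
    w_opt = lik_opt / (lik_opt + lik_pess)
    return w_opt * theta_opt + (1 - w_opt) * theta_pess

# Paintings don't sell: the pessimistic story (skill = 0.2) fits the data
# better than the optimistic one (skill = 0.9).
print(bayes_belief(0.9, 0.2, lik_opt=0.1, lik_pess=0.6))        # 0.3
print(credulous_belief(0.9, 0.2, 0.1, 0.6, s=0.2))              # too optimistic
print(credulous_belief(0.9, 0.2, 0.1, 0.6, s=0.9))              # too pessimistic
```

Note that in this toy version, too, the skepticism level that exactly reproduces the Bayesian belief depends on the likelihoods of the particular data at hand, not just on the person.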
But! The amount of skepticism that makes you agree with Bayes isn’t always the same! It depends on the data.
Beliefs are stubborn, buggy, and, because mistakes are bigger in the less likely direction, systematically biased towards wild stories instead of more plausible ones.
There’s a bunch of ways in which that’s consistent with what we know about beliefs!
OK, so, probabilities. It’s extremely well known since at least Kahneman & Tversky that people put ‘too much’ weight on small probability events.
This model generates that behavior too, with the same mechanism!
Basically, a credulous person puts ‘too much’ weight on the low probability event. In terms of the math, the credulous belief formation model looks a heck of a lot like one-parameter ‘decision weight’ models, e.g. Tversky & Kahneman 92, Camerer & Ho 94, or Wu & Gonzalez 96.
Therefore: the same credulous person who is too receptive to unlikely stories will also have a more S-shaped decision weight function.
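For comparison, here’s the one-parameter Tversky & Kahneman (1992) weighting function that thread of models builds on. This is the standard textbook form, not anything from my paper; the mapping from credulity to gamma is the paper’s claim, not shown here:

```python
# Tversky & Kahneman (1992) one-parameter probability weighting function.
# With gamma < 1 it overweights small probabilities and underweights
# large ones -- the inverse-S / 'hockey stick' shape.

def tk92_weight(p, gamma):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A rare event (p = 0.01) gets 'too much' decision weight when gamma < 1,
# and a near-certain one (p = 0.99) gets 'too little':
print(tk92_weight(0.01, 0.6))   # > 0.01
print(tk92_weight(0.99, 0.6))   # < 0.99
print(tk92_weight(0.50, 1.0))   # gamma = 1 recovers w(p) = p
```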
They just believe too much, man. Like if someone looks at an underdog and says, well it’s a 50-50 shot, they either win or lose! Like… no
Finally: politics. What if those optimistic and pessimistic stories weren’t coming from your own tortured mind but were being fed to you by spin doctors?
So I make a model: two parties send you stories that have to be consistent with the (mutually agreed upon) evidence but apart from that, go nuts.
The poor citizen listening hears Both Sides™ and comes to some opinion. Hey, they’re too busy to be Bayesian, folks!
What happens? If the average person is credulous enough (credulous enough to be wonky in belief and probability), both parties tell stories that are maximally extreme: they include the evidence, sure, but they interpret it in absolutely the most favorable possible way for them.
You get the optimistic and pessimistic stories in equilibrium! We get persistent, extreme, and—here’s the kicker—effective political spin that sounds wild to someone who’s just looking at… the evidence.
What else? Should you keep litigating an issue when new evidence comes in? No. Muddy the waters, maybe (if you can) make up some evidence to make the other side look bad, then move on right now to the next issue. The poor media is stuck both-sidesing but you’re long gone!
What else? Partisanship. If you’re credulous enough to believe (for example) that climate change could truly be a conspiracy among the world’s scientists to get research funding, then you’ll think that the media is hopelessly biased even if they just report like Bayesians.
And if one party is consistently anti-‘science’ (I’ll assume science=Bayesian in this analogy, but, woof) then that party’s supporters will grow to loathe those elitist Bayesians who seem, magically, to be closer to the other party on just about everything. cough
As @StephenAtHome famously said, “it is a well known fact that reality has a liberal bias”.
Which, in the sense of this, my extremely obscure behavioral economic model, yes.
Hey so that’s it! I explained belief formation, decision weights, and all of politics. Not bad!
Obviously kidding, but I do kinda like the paper. It’s a one-parameter behavioral model where the same parameter value is consistent with a bunch of different (realistic?) stuff.
Anyway here’s the abstract, and if you want something to read to your enemy to torment them the paper is at https://jimdcampbelldotcom.files.wordpress.com/2020/06/skepticism_and_credulity.pdf