Great article, thanks for sharing. I found your comments on the 'Recursion' phase of your model especially interesting. Nuanced viewpoints or the state of being undecided are phenomena which can only exist in the realm of the individual. Groups, by nature, seem to evoke rapid consensus.
The platforms of the American Democratic & Republican parties aren't exactly rigid political philosophies like Marxism or Anarchism. It has always surprised me that each seems to come to a consensus so rapidly, even across wildly different domains. Almost without fail, the consensus each party reaches is diametrically opposed to that of the other party. Thinking in terms of your model, I wonder if those in American politics might also participate in a sort of negative cycle in which they harden themselves against the viewpoints of their opponents.
I think you are exactly right. People tend to align themselves with political parties (“I’m a Republican” or “I’m a Democrat”) instead of with general political philosophies, specific issues, or even candidates taken one by one. Once they do that, their information feed, whether self-curated or algorithmic, starts to reaffirm their beliefs and reflect group consensus. Once the echo chamber has magnified your confidence to certainty, you are equally certain that contrary viewpoints must be wrong.
This reads brilliantly to me. Upon reflection, however, I have two queries, which may qualify your conclusions:
- The amplification property of the echo chamber is derived from Condorcet’s theorem. The ‘voters’ in Condorcet’s theorem are each supposed to reach their verdict independently, aren’t they? What about the case in which voters are prone to ‘contagion’, e.g. from a charismatic leader or simply a ‘miraculous’ event? These are often described as cases of ‘madness’ rather than ‘wisdom’ of the crowds.
- Relatedly, your illustration of the working of the echo chamber seems to presuppose that individuals choose to belong to the echo chamber that reflects their beliefs. What about the case in which individuals choose their beliefs in order to belong to an echo chamber?
I am not sure how these arguments should qualify your conclusions but my intuition is that they should lead to a more pessimistic view of the phenomenon.
Good points. I didn’t go into this detail, but technically the Independence and Competence clauses of Condorcet’s Theorem aren’t needed to build an echo chamber. Condorcet needs them for group judgment to converge on the truth. But to build group confidence that exceeds the confidence of the average voter, all we really need is Condorcet Lite: the pure mathematics of the coin toss example.
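Here is a minimal sketch of that coin-toss arithmetic, assuming the idealized setup of n independent voters who are each correct with probability p (the function name and the specific group sizes below are illustrative, not taken from the article):

```python
from math import comb

def majority_correct(p: float, n: int) -> float:
    """Chance that a strict majority of n independent voters,
    each correct with probability p, reaches the right verdict."""
    k_min = n // 2 + 1  # smallest number of correct votes that still forms a majority
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

# Even a slim individual edge gets amplified as the group grows:
for n in (1, 11, 101, 1001):
    print(n, round(majority_correct(0.51, n), 3))
# 1 -> 0.51, 11 -> ~0.527, 101 -> ~0.58, 1001 -> ~0.737
```

The group ends up more confident than any individual voter, which appears to be the "Condorcet Lite" point about amplification; whether that amplified confidence tracks the truth is where the Independence and Competence clauses come back in.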
I’ll be honest that it never occurred to me that someone would choose their beliefs specifically to belong to an echo chamber. That seems really weird. To me it is more plausible that someone finds certain beliefs to be attractive ones, and seeks out like-minded people as compatriots. They might find themselves in an epistemic group by accident. Next thing you know, it’s amplification, reflection, and recursion.
Just because it seems really weird is no reason to suppose it doesn't happen. That's how people with a pathological need to "belong" somewhere end up in cults, or bouncing around from socialism to fascism to Satanism or whatever.
It does not seem weird to me either that people would choose their beliefs in order to belong. Arnold Kling has a good piece on the subject (https://open.substack.com/pub/arnoldkling/p/believing-in-order-to-belong?r=885jy&utm_campaign=post&utm_medium=web&showWelcomeOnShare=false). He makes the observation that beliefs generally recognised to be true are not good from a ‘tribal’ perspective, i.e., a perspective that prioritises belonging, because they are not sufficiently distinctive. Generally recognised false beliefs work well as a marker of belonging but are costly in terms of cognitive dissonance and overall fitness. Unfalsifiable beliefs work best.
What complicates things is that people will join groups that are contingently connected to epistemic groups, but they join for community, not belief. For example, I know several cultural Jews who self-identify as Jewish and engage in some of the rituals but are atheists.
So when they are good, they are very very good, and when they are bad they are horrid?
I guess I'm not sure I understand what the good news is.
Sure, sports playoffs that feature multiple games can help take the randomness out of the final champion when the differences between the semifinalists are real but small. (Although sometimes "any given Sunday" may be more exciting for the fans.) But I'm not sure what the merit is in amplifying a very small level of confidence. I think the good thing about having a very small level of confidence is that one remains open to the idea that one's confidence might be misplaced. I'm not sure we always want to turn a small level of confidence into unassailable certainty.
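The playoff point can be made concrete with the same kind of arithmetic; the 55% per-game edge below is purely an illustrative assumption, not a figure from the post or the comment:

```python
from math import comb

def stronger_team_wins(p: float, games: int = 7) -> float:
    """Chance the better team (per-game win probability p, games treated as
    independent) wins a best-of-`games` series; winning a majority of all
    `games` is equivalent to being first to (games // 2 + 1) wins."""
    need = games // 2 + 1
    return sum(comb(games, k) * p**k * (1 - p)**(games - k) for k in range(need, games + 1))

print(round(stronger_team_wins(0.55), 3))       # ~0.61: a best-of-7 amplifies a small edge only modestly
print(round(stronger_team_wins(0.55, 101), 3))  # ~0.84: a hypothetical very long series squeezes out much of the randomness
```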
Juries and sports tournaments and legislators all have to come to a decision, so, okay, there are conditions where this kind of amplification may be beneficial. But I think those are pretty limited circumstances (and we can see pretty clearly that it doesn't work well for legislators precisely because any given issue of policy gets caught up in a more general, echo-chamber-driven, political polarization). If everybody is in the same betting pool, the power of crowds may actually work. But when one extreme or the other of the bell curve sets up their own in-group, things go awry.
Empirically, at least, it doesn't appear to be working quite the way one would hope - we aren't exercising the power of crowds to uncover the truth, we are selecting crowds that will make us believe we already have the truth. If we do happen to have the truth, well, I suppose no harm done, but if we were so sure, we wouldn't need the amplification of echo chambers. And a lot of these echo chambers seem to exist precisely in order to amplify wrong views by self-selecting the handful of outliers who are equally misguided in their beliefs.
Here’s the good news:
1. Almost everybody thinks echo chambers are the result of cognitive biases, epistemic vice, motivated reasoning, and other mistakes. They are wrong. ECs naturally arise through rational processes in group reasoning. This matters because it’s important to have the right explanation for when we get things wrong. Otherwise we won’t know what to fix.
2. ECs can lead to increased confidence in the (actual) truth. Here’s why this matters. Suppose you’re 51% sure that, I dunno, some tax policy is the best. If you are only 51% confident, you could pretty easily be talked out of it, and maybe some slick-talking con man could come along and convince you otherwise. Let’s suppose that the tax policy is genuinely the best. In this case you would be talked out of believing in the truth. But if you were in a truth-amplifying EC, your 51% confidence would be boosted up to 99% confidence. Now you are inoculated against being persuaded that the tax policy is not the best. You’re 99% sure it is! This is good because you are now holding onto the truth more strongly. (A rough sketch of this amplification arithmetic follows below.)
The cautionary lesson is that we need to be careful about which epistemic groups we join, and try to get the best foundational beliefs we can, since our credence in them might swiftly be amplified.
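To put rough numbers on the 51%-to-99% amplification in point 2, here is a sketch under the same idealized assumptions as the earlier coin-toss example: fully independent ‘voters’, each 51% likely to be right. The group sizes are illustrative; the article's actual amplification mechanism (reflection and recursion inside an echo chamber) is of course not literally one giant independent vote.

```python
from math import sqrt, erf

def group_confidence(p: float, n: int) -> float:
    """Normal approximation to the chance that a majority of n independent
    voters, each right with probability p, is right (good for large n)."""
    z = (p - 0.5) * sqrt(n) / sqrt(p * (1 - p))
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF evaluated at z

for n in (101, 1001, 5001, 15001):
    print(n, round(group_confidence(0.51, n), 3))
# 0.58, 0.737, 0.921, 0.993 -- under this toy model a bare 51% edge needs a
# group in the low tens of thousands before it looks like ~99% certainty
```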
(1) seems like news more than it seems specifically like good news.
But it's (2) that is the problem, and doesn't strike me as good news at all. I'm not sure I want my 51% certainty to become 100% certainty - I definitely don't want my 51% certainty to become 100% uncertainty just to get along with the crowd. With certainty so low, it is my intellectual obligation to resist the rush to judgment.
When the echo chambers throw out Independence and Competence, this strikes me as exactly a recipe for disaster - everyone having an exaggerated sense of certainty about their convictions, whether or not they are true, simply because the world is now small enough that they can find a critical mass of similarly inclined voices.
(Or worse, to mangle a metaphor, and as Philalethes alluded to above, you find yourself adopting spandrel views to sustain your convictions. Join as an anti-taxer and find yourself becoming an anti-vaxxer.)
We have replaced reasoning with the authority of the mob. Surely that's not something we should call good news.
Well, it is an objective fact whether you have hold of the truth or not. If you have it, it’s best to believe it strongly so that you aren’t persuaded to give it up on the basis of sophistical arguments. In this case, inhabiting an echo chamber will help. If you don’t have the truth and believe a falsehood instead, then it is best to believe it weakly so that you will easily abandon it when given good reasons. Echo chambers are harmful in this situation. I never said it was unmitigated good news!
"Why is this good? Because you’re 100% sure of something that is actually true."
This is where the article falls apart. By your model of echo chambers, it is only luck that leads you to 100% certainty in "actually true" beliefs. This would suggest that echo chambers are at best net neutral.
I don’t see why it would be luck, unless you are choosing which epistemic group to join by pure chance.
I think the state of initial beliefs falls under the definition of luck, since it depends on whatever initial information you happened to receive about the topic.
For instance, you use anthropogenic climate change as an example. Presumably people whose parents don't believe in it would start with mild initial disbelief, which would turn into full disbelief in an echo chamber. That is not their own fault; it stems from circumstances out of their control, i.e., luck.
Likewise, if you had no information at all about a topic, for instance the Blake Lively vs. Justin Baldoni feud, your final belief would strongly depend on whether you read pro-Justin or pro-Blake content first.
Luck, luck, luck.
People overcome sensory deficits to gain knowledge of the world (e.g. Helen Keller), they break out of cults (e.g. Tara Westover), and they refuse to have an opinion about Lively vs. Baldoni without real evidence (I don’t even know who the latter person is, nor had I heard of this beef). Knowledge isn’t just a matter of luck. Yes, it’s bad to be raised in an epistemic group that drives to certainty in falsehoods. But people overcome bad upbringings all the time.
Of course people break out of patterns of thought. But the whole point of an echo chamber is to make that more difficult.
You have yet to prove that echo chambers do more good than harm.
I live in an area where my worldview is a minority one. I find myself in echo chambers on the regular, including where I work, at a couple of meetups I attend to learn specific skills (language and financial skills), and, recently, at a community book group. The experience I keep having is one where people assume I am aligned with them politically and/or religiously (they were Buddhist or secular atheist, and I am a Christian, which several in the book group referred to as antiquated). I really struggle with this, especially in the workplace. Much of the time I just accept that I have to go along to get along, but sometimes I contemplate straight-up self-censorship when people ask me pointed questions and there is an expected answer. I am not the type to engage in debate, especially with people I don’t know well or trust. Any thoughts on how to deal with this dynamic while trying to navigate “community” in the broader sense?
I think that’s a hard question. Probably everyone self-censors to a certain degree in those kinds of situations. I often think, “Is this a battle I want to fight?” and “Do I want to fight it right now?” and “Is this the hill I want to die on?” Sometimes the answer is yes to all those questions. Often, though, the answer is no. In that case, when I hear people saying (what I think is) dumb stuff, I just let it roll off like water off a duck’s back.
I liked the reply because it reflects my own instinctive way of thinking. However, I am not sure it is the morally right answer. Personally, I feel somewhat embarrassed by my own tendency to passively go along with views I disagree with, because ‘it is not worth fighting’ or, more precisely, because it may spoil my participation in a community that I highly value, even though I enjoy an enviably independent position, economically and otherwise.
I think a morality that requires us to go to battle over every disagreement is both unreasonably demanding and an example of short-term thinking. I’m never going to convince someone of a viewpoint of great importance to me when I have already alienated them by fighting over 100 little issues. Your point about the value of participating in a worthy community is often underappreciated.
echo chambers : )
https://youtu.be/mHFusD9xuH4?si=HBUIhrpRNgG4Jpm1