I think this might be one reason why pair programming can be difficult at times. I’ve personally found success in pair programming, but it usually came after adopting an attitude of setting aside the stuff we already know and trying a freer, more humble approach.<p>This isn’t the best way to put it, but for lack of better words: just starting with the mentality that it’s OK not to know what to do next, and that we’re here to solve it together, really helps fight the kind of bias the article talks about.<p>It’s when you get to the point that you’re comfortable just reasoning aloud that the beauty of pair programming comes to light. Silly mistakes happen constantly, but they’re caught right away, and you move on so quickly in your shared state of excitement that a lot of the inhibition just goes out the window.<p>Mileage varies though, as not everyone gets the same benefits I described.
>><i>Deliberating in person isn't superior to online. According to research by Simon Lam and John Schaubroeck, virtual teams are more likely to overcome the common-knowledge effect compared to face-to-face teams, perhaps due to easier access to notes and materials.</i><p>That was an interesting observation, although the study didn't address another possible confounder: non-verbal communication is often much more suggestive in person, and can either reinforce or suppress influence.
If you're going for "none of us is smarter than all of us", the flaw in such group decisions is that they are made openly in the group. The first opinion influences the next, and so on, and the loudest and most persistent opinions gain more weight in the final decision than they deserve. The biases pile up and the quality of the decision goes down.<p>In order to take advantage of the knowledge of the many, each individual needs to form their opinion independently of the influence of the others. From there you move on to a structured and mediated discussion (i.e., not an ad hoc free-for-all). Of course, participants can change their minds, but they do so based more on careful consideration and far less on the emotions and biases of traditional group decisions.<p>See "The Influential Mind" by Tali Sharot for more details.<p><a href="https://m.youtube.com/watch?v=2HMsEVmnhZE">https://m.youtube.com/watch?v=2HMsEVmnhZE</a>
While interesting, their study has a weird relation to the headline claim. It’s not unrelated, but it’s not a demonstration of the claim (I don’t see that they measured the time teams spent discussing each statement), nor does it really seem like a consequence of the claim.<p>So while I appreciate some of the reminders about decision-making, it’s an oddly structured article.
> Project statements were distributed unevenly<p>This seems to be comparing the effects of distributing mostly positive information about two bad projects with distributing mostly negative information about one good project.<p>The source of the stated bias cannot be determined without isolating the effects of the other variables, and this study seems to lack enough permutations of information sentiment, distribution strategy, and project quality to be meaningful.<p>It could also be the case that negative information spreads more easily, or that positive information is harder to introduce into a group than negative information. Both of these conclusions seem equally derivable from the results of this very limited study.
This study is comparing individuals having complete knowledge vs. a team with individuals having incomplete knowledge. Yes, the communication complexity tax must be paid. What am I missing?
Familiarity is a huge factor in decision-making.<p>Apple, a multi-billion dollar company, made the Apple Watch to track your health. Something was missing for half of the population: cycle tracking. No decision maker was familiar enough with women's health to understand the importance of aligning cycle information with other health information.
It's common to see this effect manifest during the Mt Everest simulation[0], which is sort of a touchstone in business school organizational behavior classes.<p>[0] <a href="https://hbsp.harvard.edu/product/8867-HTM-ENG" rel="nofollow noreferrer">https://hbsp.harvard.edu/product/8867-HTM-ENG</a>
There is a grain of truth in the saying that the IQ of a team equals the IQ of the most intelligent team member divided by the number of people on the team.
It seems like a rather artificial scenario due to the available information being explicitly written down? And yet they reproduced the effect. That’s pretty interesting.<p>Would it still happen if people were told what the experiment was about and they came up with a strategy first? One strategy might be to copy everything they got to their notes and combine them at the beginning of the meeting.<p>I wonder if there’s a lesson in that for real life.
It's ancient news that group decision making only helps if the group members don't influence each other.<p>See "The Wisdom of Crowds"