The big issue here is that the audience struggles with risk assessment and varying levels of confidence.<p>The idea behind limiting the spread of the most controversial ideas, hypotheses, and research is that most people implicitly attach a confidence level to information based on how often and how loudly it is repeated to them, or based on their existing biases, like a distrust of institutions or authority, or a political party affiliation.<p>This is true of nearly everyone; our subconscious does this assessment, and it takes a lot of effort to recalibrate our confidence levels in ways that don't match 1:1 with "how hot a topic it is, and how often we came in contact with it".<p>What happens then is that leads such as ivermectin and hydroxychloroquine get attached, in people's minds, to an inaccurate confidence level. The volume of public discourse around them is outsized relative to the raw factual data, experimentation, and theory that has been accrued and tallied so far.<p>When I then try to talk about it with someone who succumbs to this bias, they'll say: what if it works? Why not explore it further? Why is "imaginary evil group" trying to silence and stop this? And you want to say: yes, of course, it is being explored further, but there are also other avenues being explored that already carry higher confidence of success, such as vaccines and masking. And if, in the future, the exploration of those lower-confidence leads also proves fruitful, then the current consensus will change and the current recommendations will update.<p>But this risk assessment process seems to confuse a lot of people.
If any of those leads does turn out to be good later and the consensus changes, that makes people believe the initially controversial, low-confidence, more outlandish claims were therefore true all along, and so they will again be biased toward higher confidence in anything controversial, even though the reason it was controversial at the time was the lack of raw data, experiment, and theory behind it.<p>That creates a weird mismatch, and it doesn't follow the scientific process. This is a dangerous bias to have, and I believe it is important for all media, when sharing information, to reflect more precise confidence levels.<p>This starts by hearing and talking about things in proportion closer to the current confidence level. So we should hear about one thing every day when it is high confidence, and only hear about something else once a month if it is lower confidence.<p>But because media is driven by money, you get a skewed ratio, where low-confidence leads that are sexy, dreamy, carry a lot of hope, or seem to have a good conspiracy story around them get published and promoted a lot more.<p>The second thing is that media should be explicit about confidence levels. Joe Rogan should ask: how sure of these things are we? What amount of due diligence has happened? What is left unknown? How likely is it that this is a false lead? Etc.<p>So what some people see as censorship is, in my opinion, about having a responsibility toward factual information, which includes current confidence and assessments.<p>Now, I don't know about Joe Rogan, but if he had, for example, 10 podcasts with other scientists about vaccines, and then the one about ivermectin, that would already be a better representation of the factual confidence known at the time.
And then if, in each one of them, he clearly articulated or asked how sure each statement is, which ones are potential leads waiting to be explored, and which are leads that have begun to be explored and reinforced, to what extent, what remains before we can be more certain, and how likely they are to be dead ends? Well, I would say he's being responsible with the power he wields. Anything short of that and he'd be less and less responsible, and more and more a simple for-profit operation willing to spread whatever information gives him the most power and wealth.<p>And this goes for every media personality and business. The ad revenue model has corrupted a lot of media into dropping this responsibility, since driving views, clicks, retweets, and debates drives up ad revenue as well. I'm sure media companies struggle to balance this responsibility against their profit margins.<p>This is even happening to scientists and researchers, where access to Twitter followers, political financing and support, and grant money also benefits from lowering that personal responsibility. Which I think is what happens to some of the scientists Joe interviews.