U.K. privacy campaign group Big Brother Watch warned that the bill would induce social media platforms to "over-remove" content, according to Olson.

Olson gave an example of content potentially targeted by the law: Ofcom would know if women in a certain area of Britain were likely to consume alleged "Covid misinformation" online, and could then warn social media companies of penalties if the trend were not altered. The proposed law would also reportedly require social media companies to perform "regular risk assessments" and take action on alleged harmful or illegal content.
"""That could, for instance, be data showing that women in certain parts of the U.K. are more liable to read Covid misinformation, or that certain teens are “hyper-exposed” to self-harming content. Ofcom would then tell the social media firm to tweak its algorithms to change those statistics, or be punished."""<p>"Lies, damned lies, and statistics" came to mind. I also strongly suspect that the internal research would get outsourced or otherwise far more difficult to discover.