Here is one way to disable "FLoC":

chrome://settings/privacySandbox

Evolution from FLoC

The FLoC experiment ended in July 2021. We've received valuable feedback from the community^1 and integrated it into the Topics API design. Highlights of the changes, and why they were made, are listed below:

FLoC didn't actually use federated learning, so why was it named Federated Learning of Cohorts?

This is true. The intent had been to integrate federated learning into FLoC, but we found that on-device computation offered enough utility and better privacy.

FLoC added too much fingerprinting data to the ecosystem.

The Topics API significantly reduces the amount of cross-site identifiable information. The coarseness of the topics makes each topic a very weak signal; different sites receiving different topics further dilutes its utility for fingerprinting.

Stakeholders wanted the API to provide more user transparency.

The Topics API uses a human-readable taxonomy, which allows users to recognize which topics are being sent (e.g., in UX).

Stakeholders wanted the API to provide more user controls.

With a topic taxonomy, browsers can offer a way (though browser UX may vary) for users to control which topics they want to include. The Topics API will have a user opt-out mechanism.

FLoC cohorts might be sensitive.

FLoC cohorts had unknown meaning. The Topics API, unlike FLoC, exposes a curated list of topics chosen to avoid sensitive categories. It may still be possible that topics, or groups of topics, are statistically correlated with sensitive categories. This is not ideal, but it is a statistical inference and considerably less than what can be learned from cookies (e.g., a cross-site user identifier and the full context of the visited sites, including the full URL and the contents of the pages).

FLoC shouldn't automatically include browsing activity from sites with ads on them (as FLoC did in its initial experiment).

To be eligible for generating users' topics, sites will have to use the API (a minimal sketch of that call appears at the end of this comment).

https://github.com/jkarlin/topics

It is remarkable to me that Google can freely experiment on whomever they wish. If the experiments demonstrate negative effects, e.g., generation of excessive amounts of fingerprinting data, it's unfortunate for those who were swept up in these "experiments".

Why not ask users if they want to volunteer to participate in a trial/experiment?

Imagine if drug companies did not obtain permission to test their compounds on new patients. Instead they just substituted the new drug into what they sold on the market. Same label. Chrome is Chrome, right? Never mind all the undisclosed variations and experiments, for example "field trials" identified only by a number. This is hardly informed disclosure and consent.

1. This is amusing. Which users were solicited for feedback? Perhaps they are referring to some surveillance they conducted, looking for mentions of FLoC.
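
Regarding "sites will have to use the API": the explainer linked above describes a document.browsingTopics() call that both marks the calling site as an observer of the user and returns the user's topics for recent epochs. Below is a minimal TypeScript sketch of what a caller might look like; the BrowsingTopic field names and the version-string format are taken from the explainer and may not match exactly what Chrome ultimately ships.

    // Sketch only: shape follows the jkarlin/topics explainer, not a shipped spec.
    interface BrowsingTopic {
      topic: number;            // integer ID into the human-readable taxonomy
      version: string;          // e.g. "chrome.1:1:1" per the explainer
      configVersion: string;
      modelVersion: string;
      taxonomyVersion: string;
    }

    // browsingTopics() is not in lib.dom.d.ts yet, so declare it here.
    declare global {
      interface Document {
        browsingTopics?: () => Promise<BrowsingTopic[]>;
      }
    }

    async function logObservedTopics(): Promise<void> {
      if (!document.browsingTopics) {
        // Browser doesn't support the API, or the user opted out
        // (e.g., via chrome://settings/privacySandbox).
        console.log("Topics API not available");
        return;
      }
      // Calling the API is what makes this site "use the API" and thus
      // eligible to contribute to (and receive) the user's topics.
      const topics = await document.browsingTopics();
      for (const t of topics) {
        console.log(`topic ${t.topic} (taxonomy v${t.taxonomyVersion})`);
      }
    }

    logObservedTopics();
    export {};

The point of the design, per the explainer, is that only sites that actually call the API contribute browsing activity to topic generation, unlike FLoC's initial experiment, which swept in sites with ads automatically.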