This is Google extending an invite to the EU and other regulators to bring their war hammers and forever ban browser makers from pushing targeted ads and enabling profiling of people. Maybe break Google up and make the Google Chrome team a separate company with restrictions on how much it can work hand-in-hand with Google? Is there anyone in the EU who can file complaints about these abuses of market power?<p>Tell the people you know to switch from Google Chrome to another browser as their primary one. Google will still pester them on Google’s online properties to install Chrome, and may resort to other tricks on Android. But we are at a time when this can gather momentum and result in some good for all in the future (not mainly for Google, as it seems to be now).
I hate tracking as much as the next guy, but this article is so disingenuous that it's painful to read.<p>If you look up the most upvoted HN article about FLoC [1], it specifies two main issues with FLoC in BOLD: fingerprinting and cross-context exposure — both of which, if I understand correctly, the Topics API fixes. But the article implies that these are minor problems that we never really cared that much about.<p>Between this and the constantly aggressive tone, it seems that the goal of this blog post is simply to inspire anger and hate while not furthering the discussion on the topic. That makes this article unhelpful (if not damaging) to the goal of personal data privacy.<p>[1] <a href="https://news.ycombinator.com/item?id=26344013" rel="nofollow">https://news.ycombinator.com/item?id=26344013</a>
[2] <a href="https://www.eff.org/deeplinks/2021/03/googles-floc-terrible-idea" rel="nofollow">https://www.eff.org/deeplinks/2021/03/googles-floc-terrible-...</a>
This comes from the people distributing an ad-fuelled cryptocurrency called "Basic Attention Token". I'll take their word with two grains of salt and a tall glass of water, please.
Here is one way to disable "FLoC":<p>chrome://settings/privacySandbox<p>Evolution from FLoC<p>FLoC ended its experiment in July of 2021. We've received valuable feedback from the community^1 and integrated it into the Topics API design. Highlights of the changes, and why they were made, are listed below:<p>FLoC didn't actually use federated learning, so why was it named Federated Learning of Cohorts?<p>This is true. The intent had been to integrate federated learning into FLoC, but we found that on-device computation offered enough utility and better privacy.<p>FLoC added too much fingerprinting data to the ecosystem<p>The Topics API significantly reduces the amount of cross-site identifiable information. The coarseness of the topics makes each topic a very weak signal; different sites receiving different topics further dilutes its utility for fingerprinting.<p>Stakeholders wanted the API to provide more user transparency<p>The Topics API uses a human-readable taxonomy, which allows users to recognize which topics are being sent (e.g., in UX).<p>Stakeholders wanted the API to provide more user controls<p>With a topic taxonomy, browsers can offer a way (though browser UX may vary) for users to control which topics they want to include.<p>The Topics API will have a user opt-out mechanism.<p>FLoC cohorts might be sensitive<p>FLoC cohorts had unknown meaning. The Topics API, unlike FLoC, exposes a curated list of topics chosen to avoid sensitive topics. It may be possible that topics, or groups of topics, are statistically correlatable with sensitive categories.
This is not ideal, but it's a statistical inference and considerably less than what can be learned from cookies (e.g., a cross-site user identifier and the full context of the visited sites, which includes the full URL and the contents of the pages).<p>FLoC shouldn't automatically include browsing activity from sites with ads on them (as FLoC did in its initial experiment)<p>To be eligible for generating users' topics, sites will have to use the API.<p><a href="https://github.com/jkarlin/topics" rel="nofollow">https://github.com/jkarlin/topics</a><p>It is remarkable to me that Google can freely experiment on whomever they wish. If the experiments demonstrate negative effects, e.g., generation of excessive amounts of fingerprinting data, it's unfortunate for those who were swept up in these "experiments".<p>Why not ask users if they want to volunteer to participate in a trial/experiment?<p>Imagine if drug companies did not obtain permission to test their compounds on new patients, and instead just substituted the new drug into what they sold on the market. Same label. Chrome is Chrome, right? Never mind all the undisclosed variations and experiments; for example, "field trials" identified only by a number. This is hardly informed disclosure and consent.<p>1. This is amusing. Which users were solicited for feedback? Perhaps they are referring to some surveillance they conducted, looking for mentions of FLoC.
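The quoted explainer describes the mechanism in prose: the browser computes a handful of coarse top topics per epoch from sites that opt in, and each calling site receives one topic, with a small chance of a random topic for deniability. Here is a toy sketch of that selection logic. The taxonomy, the 5% random rate, and the hash are illustrative assumptions for the sketch, not Chrome's actual implementation; real sites would call the proposed `document.browsingTopics()` instead.

```javascript
// Toy model of Topics-style per-epoch topic selection, based on the
// explainer quoted above. All names and constants here are illustrative.
const TAXONOMY = ["Arts", "Autos", "Books", "Fitness", "News", "Travel"];

// Pick the user's top N topics for an epoch from observed (site, topic)
// visits. Only sites that use the API would contribute visits.
function topTopicsForEpoch(visits, n = 5) {
  const counts = new Map();
  for (const { topic } of visits) {
    counts.set(topic, (counts.get(topic) || 0) + 1);
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([topic]) => topic)
    .slice(0, n);
}

// Each caller gets one topic per epoch: usually one of the top topics,
// but with small probability a uniformly random taxonomy entry
// (plausible deniability for the user).
function topicForCaller(topTopics, caller, epoch, rand = Math.random) {
  if (rand() < 0.05) {
    return TAXONOMY[Math.floor(rand() * TAXONOMY.length)];
  }
  // Hash caller+epoch so different sites can receive different topics,
  // but a given site sees a stable topic within an epoch.
  let h = 0;
  for (const c of caller + ":" + epoch) h = (h * 31 + c.charCodeAt(0)) >>> 0;
  return topTopics[h % topTopics.length];
}
```

Note how this addresses the two criticisms above: the signal is one coarse topic out of a small taxonomy (weak for fingerprinting), and only sites that call the API contribute to or receive topics (no passive inclusion of every ad-carrying site).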
"Don't be Evil" wasn't a motto --- it was a warning.<p>A peek into their mindset that foretold what they were thinking and where they were headed.<p>They sell you and your privacy to their "associates" --- aka, anyone willing to pay in some way. Their concern for your interests only extends to the level required to invade your privacy.<p>The thing I find most disappointing is the fact that it took so many so long to realize this.
Use a different browser that’s not Chrome.<p>Problem solved.<p>Disclaimer: I don’t use Brave because I don’t want to see <i>more</i> ads. In the 90s, there were little banner ads that ran on your PC in the same app that helped you connect to the internet. That’s essentially what Brave is, except I don’t NEED to see ads now the way I needed help paying for internet in the early days. Just use a browser that blocks cookies.