Welcome to 2009, I guess<p>"Today we're helping people get better search results by extending Personalized Search to signed-out users worldwide (...)"<p><a href="https://googleblog.blogspot.com/2009/12/personalized-search-for-everyone.html" rel="nofollow">https://googleblog.blogspot.com/2009/12/personalized-search-...</a>
I remember Paul Graham gave a keynote at one of the PyCons some years ago. It was about interesting ideas that you might want to work on in the future.<p>One idea was to build a search engine that returns unpersonalized results. He talked about how Google will be moving into a "it's true, if it's true for you" kind of world. His idea was that this would open new opportunities. I think DuckDuckGo is one example, and they've grown and are doing pretty well. I think a lot of it comes as a reaction to Google, Facebook and other such things.<p>"It's true, if it's true for you" is also a great phrase worth remembering. It describes so much about the current world and where things are headed, and why some things seem to have gone off the rails.
The problem is, these "personalized results" have become mostly useless. For more technical (or simply specific) queries, Google seems to vomit useless, only vaguely related links instead of, you know, pages that cover what I searched for.<p>This does more to encourage me to switch to some other search engine than any privacy concern does.
I think I agree with the complaints about this study. Unique doesn't mean personalized. IP geo-location, ISP... Don't forget browser and OS versions... there are lots of things you can sample over without actually representing any info leak from your logged in session.<p>Someone searching "at the same time" could technically be different times (remember time zones!) for the purposes of the algorithm (is the search during "work hours" or not, etc).<p>Users checking results on mobile phones compared to desktops... Without more details of how they controlled for these factors, the conclusion doesn't really follow.<p>Edit: I think to really measure the conclusion, they'd need 87 people in the SAME geo location, perhaps on fresh out of the box devices. That would be the best way to create the "placebo" group for their test, which they don't seem to have done.
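To make the "unique doesn't mean personalized" point concrete, here is a hedged sketch of one way the variation between two users' result pages could be quantified: Jaccard similarity over the sets of returned URLs. The metric choice and the data are my own illustration, not necessarily what the study used.

```javascript
// Jaccard similarity between two result lists, treated as sets of URLs.
// 1.0 = identical result sets, 0.0 = completely disjoint.
// Note: this deliberately ignores ordering; a rank-aware metric
// (e.g. rank-biased overlap) would be stricter about "same results".
function jaccard(resultsA, resultsB) {
  const a = new Set(resultsA);
  const b = new Set(resultsB);
  let intersection = 0;
  for (const url of a) if (b.has(url)) intersection++;
  const union = a.size + b.size - intersection;
  return union === 0 ? 1 : intersection / union;
}

// Hypothetical example: two users with mostly overlapping front pages.
const userA = ['example.com/1', 'example.com/2', 'example.com/3'];
const userB = ['example.com/2', 'example.com/3', 'example.com/4'];
console.log(jaccard(userA, userB)); // 2 shared / 4 total = 0.5
```

A low score here tells you the pages differ, but not *why* they differ; without a control group on identical devices in the same geo location, geo/IP/device effects and personalization are indistinguishable.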
I'm the last one to say you shouldn't scrutinize what Google does, but this is complete non-news.<p>At first, I suspected actively used logged-out cookies, like the ones Facebook infamously uses (try it: they show your face and keep tracking you all over the web). Reading on about differing search results in private mode, I then expected something like Google actually using IP + fingerprint matching, which would be way more devious.<p>In the end, this was purely about Google showing a different page to everyone. Playing devil's advocate, this is about the only way to escape the exploration/exploitation dilemma.<p>Is DuckDuckGo seriously complaining that Google is basically A/B testing everything, all the time? Because if that's the case, their data scientists should take some notes here.
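For what it's worth, the exploration/exploitation dilemma mentioned above is typically handled with bandit-style algorithms. Here is a minimal epsilon-greedy sketch; the variant names and numbers are made up for illustration, and this is not a claim about what Google actually runs.

```javascript
// Epsilon-greedy: with probability `epsilon`, show a random ranking
// variant (explore); otherwise show the variant with the best observed
// click-through rate (exploit). `rand` is injectable so the behavior
// can be made deterministic in tests.
function epsilonGreedy(variants, epsilon, rand = Math.random) {
  if (rand() < epsilon) {
    // Explore: pick a uniformly random variant.
    return variants[Math.floor(rand() * variants.length)];
  }
  // Exploit: pick the variant with the highest observed CTR.
  const ctr = v => v.clicks / Math.max(v.shows, 1);
  return variants.reduce((best, v) => (ctr(v) > ctr(best) ? v : best));
}

// Hypothetical ranking variants with observed engagement.
const variants = [
  { id: 'ranking-a', clicks: 40, shows: 1000 },
  { id: 'ranking-b', clicks: 55, shows: 1000 },
  { id: 'ranking-c', clicks: 10, shows: 500 },
];

// With exploration disabled, the best-performing variant always wins.
console.log(epsilonGreedy(variants, 0).id); // "ranking-b"
```

The point being: any system doing this kind of online learning will show different users different pages, logged in or not, without any per-user profile being involved.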
Am I missing something? Google gives me the creeps as much as it does any sane person, but if I understand correctly, DDG is comparing just the variation of results on each page. This doesn't mean that you're still in the (same) bubble when you log out.<p>How about comparing the logged-in data with logged-out / private-tab data? Did they find these two sets related? If not, G could just be implementing some sort of A/B testing on a grand scale (learning from clicks and making the search algorithm better).
I always assumed they would, just like Youtube 'recommends' videos for you whether you're logged in or not. This is one of the reasons why I don't use Google / Bing / etc directly anymore, instead I use them through a meta-search engine (using a local instance of Searx [1] with some extra code to have it search my local content as well [2]).<p>[1] <a href="https://github.com/asciimoo/searx" rel="nofollow">https://github.com/asciimoo/searx</a><p>[2] <a href="https://github.com/asciimoo/searx/pull/1257" rel="nofollow">https://github.com/asciimoo/searx/pull/1257</a>
The study’s results seem to be that users often get unique results. That’s not the same as “personalized”, and it certainly isn’t evidence of “bias” as the spreadprivacy.org link suggests.<p>A good-faith interpretation would point to Google running learning algorithms on their results. That would also seem to be a far better explanation for Google changing parts of the page layout, such as the position of news and video results.<p>The use of the term “bias” for describing differences in search results also trips my conspiracy-theory detectors.
I think this should be considered the original source instead?<p><a href="https://spreadprivacy.com/google-filter-bubble-study/" rel="nofollow">https://spreadprivacy.com/google-filter-bubble-study/</a>
I'm not sure I'm comfortable with a critique written by an economic competitor (DuckDuckGo) that obscures its identity and doesn't disclose its conflict of interest.
Here's where Google (barely) exposes an option called "Signed-out search activity" retention: <a href="https://www.google.com/history/privacyadvisor/search" rel="nofollow">https://www.google.com/history/privacyadvisor/search</a><p>Make sure to access it while not signed in to Google, but using a browser mode which persists cookies (i.e., not incognito mode). The actual control is <a href="https://www.google.com/history/optout" rel="nofollow">https://www.google.com/history/optout</a><p>Here's the equivalent for YouTube signed-out watch and search history: <a href="https://www.youtube.com/feed/history" rel="nofollow">https://www.youtube.com/feed/history</a> . Click "Clear All Watch History," then click "Pause Watch History," then choose "Search history" and repeat those 2 steps again.<p>Do all of this from each device you use Google or YouTube from.
Here’s Google’s search liaison clarifying things 2 hours ago on Twitter. Localization shouldn’t be confused with personalization. Disclaimer: I work at Google, but not in search<p><a href="https://twitter.com/searchliaison/status/1070027261376491520?s=21" rel="nofollow">https://twitter.com/searchliaison/status/1070027261376491520...</a>
It sounds like they found a lot of variation but not that the differences are biased in any particular direction? Could this be random?<p>The use of "filter bubble" doesn't seem justified if it looks like random variation.
Being hooked on Rust the programming language and Rust the game at the same time has been interesting, Google-wise. I used to take it for granted that I'd get Rust programming results. Now Google seems really confused, and I'm not getting good results for either. "Personalized search matters." And also: fuck privacy invasion.
I've observed this a lot when working on SEO for my webpage. I'll be like, "Cool, I'm the top result for my name!" And, yes, this will be true for people searching from Berkeley, where I live. But if I go to an IP address over in SF, I'm not even on the front page.
This is honestly the worst thing ever; I've started noticing this too while testing SEO for different projects.<p>What if Google becomes like Netflix, only showing you results you expect? Honestly, a progression toward that has already rendered Google Search quite useless for most of my searching. I prefer searching HN on Algolia, Reddit, Medium and other websites (others escape me at the moment) to find the unexpected resources that I expect a Google "search engine" to give me.
I'm pretty certain, based on experience in my household, that Google's display network, Facebook, and Amazon are all doing some sort of targeting outside of cookies/pixels. My assumption is that it's based on your IP address.
Maybe that explains why Google gave me first-page results about "shrooms" a few days ago, while my search term was specifically asking about "Champignons" and "pizza" and whether to pre-boil them or use them raw.
YouTube does that as well, and has for a long time, I guess, since I have been consistently able to reproduce the following:<p>1. open YouTube's home page in your main browser, while logged out and after clearing cookies<p>2. open the same page in a "virgin" browser (e.g. a newly created VM or even just an incognito window)<p>Observe that the page from step 1 shows some amount of "personalisation".<p>When I saw this the first time I was baffled, so I did some research and found out that it's because of local storage. In step 1, I was clearing just the cookies but not local storage.<p>Lesson learned: don't just clear cookies; remember to clear local storage as well.
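To make the lesson concrete: in an actual browser console the fix is just `localStorage.clear()` (plus `sessionStorage.clear()` for good measure). The sketch below only adds a stand-in class so it also runs outside a browser, and the storage key is a made-up example, not a real YouTube key.

```javascript
// Minimal in-memory stand-in for the browser's Storage API, only so
// this sketch runs outside a browser (e.g. under Node). In a real
// browser you would use the global `localStorage` directly.
class MemoryStorage {
  constructor() { this.items = new Map(); }
  setItem(key, value) { this.items.set(String(key), String(value)); }
  getItem(key) {
    return this.items.has(String(key)) ? this.items.get(String(key)) : null;
  }
  clear() { this.items.clear(); }
  get length() { return this.items.size; }
}

const localStorage = new MemoryStorage();

// A site can stash an identifier here; it survives cookie clearing.
localStorage.setItem('visitor-id', 'abc123'); // hypothetical key
console.log(localStorage.length); // 1

// The step I was missing: clearing cookies alone leaves this intact.
localStorage.clear();
console.log(localStorage.length); // 0
```

(And local storage isn't the only such place: sessionStorage, IndexedDB, and cache-based identifiers can survive a cookie wipe too.)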
To me, constantly complaining about sites/companies doing this stuff or wanting laws to make them stop... It just feels silly and pointless.<p>In the long term, companies will gather and use what data they have access to. Companies will tailor their UI, product, etc. in order to keep that data flowing. A ruleset based on permission and consent is not practical, unless the goal is "better paperwork."<p>The solution (imho) has to come from browser software or the W3C. The browser should control permissions, in broadly the same way mobile OSes/app stores control permissions and <i>login state.</i><p>ATM Google de-anonymizes you. This should just be impossible, unless the browser tells it who you are.<p>^I know GDPR is popular with a lot of people here. I think it has some good parts, but I disagree with other parts. We can still be friends and disagree :)
Study finds this in 2018? I found out Google was 'listening' back in 2012, when, while not signed in, I got the same YouTube recommendations on my friend's laptop that I get on mine. And our YouTube habits are different, given that his PlayStation had different recommendations.
Well... they certainly do a shitty job at personalizing my results. These days I can never find what I want on Google. I've resorted to using other search tools (GitHub search, Reddit search, Stack Overflow search, DDG, etc.)
Safari 12 is impressive when it comes to preventing browser fingerprinting and cross-site tracking. I wish it were cross-platform. Not sure, though, how much it can do against Google.
New study? You just have to open two different browsers and use them for a couple of days, and it's quite obvious. The question is: what's the problem with this?
What's been a shame is that there is still no open-source search engine, despite this being a "solved" problem. There isn't even a Docker image you can throw at your cluster that gets you faster and faster search results. That's the real shame.<p>We should have commoditized the core of the search engine by now, with programmatic and API access commonplace, and yet here we are: search engine software is still dominated by proprietary services.
This would all be fine if it were opt-in but all these companies who insist on excessive reliance on algorithms just end up making their service worse.
Strange as it is, I was just researching whether it was possible to search Google truly anonymously - stumbled upon a Firefox add-on called Searchonymous [0]. Planning on trying it out today.<p>If all else fails, can always use another search engine ;)<p>[0] <a href="https://addons.mozilla.org/en-US/firefox/addon/searchonymous/" rel="nofollow">https://addons.mozilla.org/en-US/firefox/addon/searchonymous...</a>
Is anybody the least bit surprised? It’s been obvious forever that google has absolutely zero respect for privacy.<p>Yes, this can be rationalized as ‘improving search results’, and indeed the results may be better.<p>But, it’s also building a personal profile without consent, or indeed with implied lack of consent.<p>If google cared about privacy, they would simply offer people the option not to be tracked, and respect it.<p>They don’t.