Google AdSense 'banned' web pages on my website - The Online Slang Dictionary - that define terms related to software piracy. For example, "warez": <a href="http://onlineslangdictionary.com/meaning-definition-of/warez" rel="nofollow">http://onlineslangdictionary.com/meaning-definition-of/warez</a><p>They've also 'banned' all web pages on the site related to human sexuality, despite the site's goal (mostly accomplished) of providing high-quality, 'non-pornographic' definitions for such terms.<p>There is no recourse - no way to appeal. You just bend over and comply, lest they block your entire site from using AdSense. When that happens, again, there is no way to appeal: your site is done.
I have zero love for what Google is doing and turning into, but this article is deeply flawed. Its focus is implicitly on the accuracy rate of Google's actions, yet it makes no effort to demonstrate this statistically. What would matter is not an anecdotal incident, but an estimate of the error rate of Google's algorithmic censorship/deranking/demonetization compared to how humans might perform. Humans regularly make dumb mistakes, if not out of misunderstanding then out of sloppiness. Are these algorithmic decisions more accurate, on average, than we might expect of a collection of humans along the lines of Amazon's Mechanical Turk? We can only speculate, but I'd be quite surprised if they were not, and if so, that refutes the article's entire premise.<p>This is one of the biggest problems with advertising-driven media. They love anecdotal evidence because <i>we</i> love anecdotal evidence, which means the story gets clicks and they get their ad bucks. But it entirely misses the real issue. For instance, here: should the issue be the accuracy rate of Google's behaviors, or the behaviors themselves and their desired endgame? Their accuracy, as a whole, is almost certainly going to be quite acceptable relative to human accuracy, so you end up targeting them in a spot where they're extremely well 'defended.' By contrast, I think the world Google is trying to shape is certainly not one many would particularly enjoy. And the worst part is that there is great potential for longform consideration and analysis of such a world. But putting out an anecdotal bit is far easier. It's just plain lazy.
Google makes more than enough money to hire people to oversee appeals, but apparently that cuts into their earnings. At least short term...<p>A human being would see what this page was about in 10 seconds. Check a box and click submit. Total time: less than a minute.
Don't blame the machine. Blame those who wish to apply censorship standards meant for film/TV to the entire world. The internet cannot be made PG-13. Don't ask the robots to try.
> A page about a 1986 porn bill got demonetized, showing how algorithms can't be expected to make judgement calls.<p>Another day, another article about how we cannot rely on algos. Yawn.
Does anyone know exactly when Google decided to become the pearl-clutching Tipper Gore of search engines? We had a similar problem a long time ago with Google and it seems like it hurt our page rank at the time. There was nothing legislative that would have had any influence at the time, so it has to be coming internally, and from fairly high in the food chain.
This is because a computer has no sense of social context.<p>It can't aggregate what the general public feels is "OK" on its own, which is how human society defines what material is available.<p>It will rely on a bland list of words fed to it by paranoid controllers who fear the wrath of emotionally captured adults.<p>It won't at all reflect the variety of opinion. It will be TV 2.0.