So, clearly Google has too much power over the internet, it's arbitrary and opaque, etc. I agree. However, I think it is worth pointing out that:<p>1) malware is often very aggressive and fast-spreading, and once it's on a user's computer it's hard to get off, therefore...<p>2) the system to detect it and stop access to the site has to be automated, not a human-in-the-loop system that might take hours or days to shut off access to a site which is infecting many users per minute, and...<p>3) the more clarity there is on how exactly that automated system works, the more certain we can be that malware authors will be able to evade it; it's much like how spam detection or search page rankings are opaque, because the incentives to game the system are very great<p>I'm not saying Google's system is perfect, but I am saying it's a very hard problem to solve in a way that doesn't make stopping the spread of malware even harder than it already is. So while it is hard to feel sorry for a company as wealthy and powerful as Google, I think the issue is not as clear-cut as some comments on this thread seem to suggest.
>Now we run automated tests to monitor server uptime and check server for problems every 30 seconds. Unfortunately automated test scripts were happily getting HTTP/200 replies while people using the Chrome browser were being told this is a scam business trying to steal their bank account information.<p>I was surprised this wasn't part of the lessons learned. The monitoring basically failed, but that wasn't listed as a lesson.<p>I feel like the majority of uptime monitors fall into this same trap. It's one of the reasons why, for my monitoring service, I chose to do full page load monitoring via Chrome instead of just an HTTP request via curl or whatever. The main reason: people care whether the webpage loads or not, and how long it takes to load. Having a website respond in 200ms is great, but if it takes 8000ms for all the JS to load and run, your website is still slow. I get why sites just do curl requests, because it's way cheaper, but then you're monitoring one part of the stack while caring about all of it. If your website starts producing JavaScript errors you want to know, etc.<p>[1] <a href="https://www.ootliers.com" rel="nofollow">https://www.ootliers.com</a> (The landing page and everything are terrible and I'm working on improving that)
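To make that concrete: a full-page-load check with headless Chrome can be as simple as the sketch below. Playwright is my assumption here, and the URL and threshold are placeholders, not the commenter's actual setup; note that even this would likely not catch a Safe Browsing flag, since that interstitial is shown by the browser rather than served by the page.<p><pre><code>
from playwright.sync_api import sync_playwright

URL = "https://example.com"   # placeholder for the page being monitored
MAX_LOAD_MS = 8000            # fail the check if the full load takes longer

def check_page() -> bool:
    js_errors = []
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        # Record uncaught JS exceptions, not just the HTTP status.
        page.on("pageerror", lambda err: js_errors.append(str(err)))
        try:
            response = page.goto(URL, wait_until="load", timeout=MAX_LOAD_MS)
            ok = response is not None and response.status == 200 and not js_errors
        except Exception:     # timeouts, DNS failures, etc.
            ok = False
        browser.close()
    return ok

if __name__ == "__main__":
    print("UP" if check_page() else "DOWN")
</code></pre>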
They can remove your YouTube account, app, entire Google account or even your website at any time, and you can only guess why it happened, because they always make the rules really vague and it's never clear what is or isn't allowed. And even when they do admit the mistake and restore you, they still won't explain anything and nothing is ever fixed. Thank you Google, very cool.
Anyone who thinks this is the functioning of a "normal" internet is mistaken. This is a symptom of a decades-in-the-making problem. It strongly appears those in charge of legislation are not technically minded and have no idea "how" the internet works. Or they do, and they have data-sharing agreements with all the 'big tech' companies and are content to "appear" to legislate but cannot actually change anything substantial for fear of retaliation (losing access to all that juicy data they collect). Imagine the power Google wields in this scenario; to me they are scarier than any drug cartel boss. I genuinely can't see how this isn't akin to a coup d'état of the internet as a means of transmitting information. We cannot shut down these tentacles because of how deeply ingrained they are (remember when FB's SDK was having issues? Hundreds of third-party apps just broke).<p>Google should have been regulated years ago; instead, they have been allowed to snap up every smaller company to solidify their position in the market and ensure <i>they</i> and <i>only they</i> are allowed positions of power, control and authority.<p>If Google (or its baseless algorithms, detached from reality) dislikes you, then you are <i>toast</i>. How long before Google's algorithm results in an actual human death? It doesn't seem far-fetched at all; it seems entirely plausible.<p>Yet <i>you</i> let this happen, or rather, it seems this isn't concerning enough to warrant a massive protest; after all, Big Tech controls protest online and can just shut it down. Amazon seems to have been mightily effective at stopping any "union" movement, so we know the censor machines are fine-tuned and ready to fire at any moment.<p>We need to be talking about this daily, it needs to be front and center for weeks and weeks, and we need to <i>demand</i> accountability. We are ruled and governed not by elected officials but by faceless, nameless and non-human machines. They do not Think. They do not Talk. They do not care.<p>Yet this thread will disappear in a few short hours, and this will be just another episode of the weekly "Google's systems are out of control and one developer got caught out, too bad, I hope they are okay".<p>This is undoubtedly happening to thousands of others who do not make Hacker News or have the resources/energy to fix it.<p>We should demand better.
Can someone explain to me why Google isn't being drowned in a torrent of lawsuits?<p>We are getting stories like this on a weekly basis now.<p>Google is clearly causing measurable harm to your company and you. And apparently to thousands before you.<p>Considering how much money patent trolls manage to extract from Big Tech with considerably weaker cases, how is it that everybody is treating Google like a fragile grandmother with dementia, going out of their way not to hold them responsible in court?<p>This is not a rhetorical question. I really don't get it.<p>America is the land of getting millions in settlement when McDonald's gives you coffee that is hotter than you anticipated. How the hell is Google getting away with their behavior?
The previous thread:<p><i>Help HN: Google just blocked my site as deceptive site</i> - <a href="https://news.ycombinator.com/item?id=26326528" rel="nofollow">https://news.ycombinator.com/item?id=26326528</a> - March 2021 (20 comments)
I'm wondering if this could actually be spun into being a good thing.<p>I just looked over the site a little more. The business idea seems to be a widget you add to your site that can be used to upload arbitrary files to it. The real advantage looks to be that they have a bunch of integrations set up with Facebook, GDrive, Dropbox, Instagram, etc., so it all just works without you having to set up and manage developer accounts with 10 different services. Plus built-in image resizing and such things that all just work. Pretty cool, and I might use it if I built a site that needed to do that.<p>One way you can frame the point of this business is that they worry about the details of integrating with these other services so that you don't have to. As they found out, hosting external content is inherently dangerous, and it pays to have someone responsible for that who knows the risks and has experience in mitigating them. If a site owner wasn't using this service, they would have to take that responsibility on for themselves and re-learn these same lessons. So that's just another advantage of using this service - "we have experience in mitigating the risk of hostile users abusing upload services to serve malware, so you don't have to worry about it".
Quick notes:<p>The site owner has not confirmed they screened all uploaded content for malware - this is a major issue these days, and Google and others will flag you if you host viruses and pump out malware.<p>And no - you cannot sue Google to force them to allow users to be infected.<p>It’s not clear that all customer content is hosted on a separate domain, with each customer on a separate subdomain. Your reputation will be trashed pretty quickly if you blindly host content on your main domain.<p>It’s not clear that all uploaded content is protected from being linked to or downloaded. Google admins and other antivirus vendors can set up screens on downloads.<p>Anyway - I see plenty of shady/scam and incompetent website owners hosting malware - not much sympathy in most cases.
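On the first point, one common approach is to run every uploaded file through a scanner such as ClamAV before serving it. The sketch below is only an illustration of that idea, not the OP's setup: it assumes clamscan is installed, the paths are placeholders, and signature scanning alone won't catch everything.<p><pre><code>
import subprocess
from pathlib import Path

def is_clean(upload_path: Path) -> bool:
    """Scan one uploaded file with ClamAV's clamscan.

    Exit code 0 means no signature matched, 1 means infected,
    anything else means the scanner itself failed (treat as suspect).
    """
    result = subprocess.run(
        ["clamscan", "--no-summary", str(upload_path)],
        capture_output=True,
        text=True,
    )
    return result.returncode == 0

# Placeholder paths: quarantine anything that fails the scan.
incoming = Path("/srv/uploads/incoming/user-file.bin")
if not is_clean(incoming):
    incoming.rename(Path("/srv/uploads/quarantine") / incoming.name)
</code></pre>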
Ironically, I had the opposite issue a couple of weeks ago: I found a phishing website (for Facebook) that was hosted on a Google server and was actively being used. I sent an email to Google's abuse email address - got an automated reply back saying basically "use this other form instead". Did that, never got a reply back. I reported the website to their SecureSearch (or whatever the name is) product, entered the URL and all the related info: nada. The site is still up and running, phishing users, and no alerts are triggered for Chrome users... Sad, really sad.
So this sucks for the developer, but I have another story to share.<p>I was trying to buy a school bus to make a schoolie out of, and the Craigslist ad directed me to a seemingly innocuous eBay Motors link that looked pretty close to the real thing. I was busy and clicked, totally intending to drop $5k. I got distracted and had to come back to it later; when I did, credit card in hand, the page showed the red screen with a huge warning. A closer look revealed the bad URL.<p>Saved by Google? Oh god, I think I need a shower now.
Glad you got a resolution. Google recently banned my ad account for running ads to my landing page templates and I still don't know what was wrong with that. They just gave me a bs corporate answer and that was it.
Let’s not forget that the site probably was actually hosting malicious content. The problem is not Google blocking the site; that was the right decision. The problem is that Google is hard to reach in cases like this.
Google is a monopoly and they destroy the lives of anyone who even dares to challenge them or their owners. It's time to break up these big tech monopolies - obviously, by making something better. This is more of an inevitability than a question.
> But there are plenty of Google engineers and good helpful people on Hacker news.<p>> (from a screenshot) I work at Google [...] so I escalated your issue [...]<p>> I believe the HN thread getting on the homepage tremendously helped me and somebody from Google saw it and expedited the review after all<p>So, once more an issue with FAANG could only be fixed because somebody knew somebody else and went out of his way to get this to the right eyes.<p>This could easily have gone another way and OP would have received no help whatsoever and would have waited for days or weeks to get this issue cleared and lost his business.<p>Maybe it's only me but I find it unbearable that you'll usually not be able to reach any real person at all for issues like these and it's pure luck what happens to you.
From the article:<p>> So after a lot of brainstorming and ideas from HNers I finally figured out the culprit(s).<p>> We have a live demo on our home where people can upload a test file. [...]<p>> We also give all users a 20MB test storage. [...]<p>> I believe that somebody signed up for our service (it’s free to sign up) and then uploaded a malicious file on our test storage and abused this feature.<p>If that is correct, Google was completely in the right to flag the domain as malicious and warn visitors.
Thank you for the write up, I really appreciate how there were actionable suggestions within.<p>NodeBB does host a demo instance to allow people to kick the tires. I don't believe we allow people to upload images, but it is worth double checking just in case.
How do CDN providers (Cloudflare, Cloudfront etc.) avoid the subdomain blacklisting problem? Do they just have some agreement with browser vendors to whitelist all of their subdomains because they're big enough?
The issue with these blacklists is that all the antivirus/security tools will immediately put you on their list, but it can take days or weeks to get them to remove you, and you can still get a customer whose weird security program keeps showing the warning. One of the antivirus companies has a form to submit a dispute, but their form was broken for weeks.
I wonder if the use of a .win domain had any influence. I've seen nothing but spam and malware / phishing from these $2 TLDs.<p><a href="https://symantec-enterprise-blogs.security.com/blogs/feature-stories/top-20-shady-top-level-domains" rel="nofollow">https://symantec-enterprise-blogs.security.com/blogs/feature...</a>
I tried to send the blog post link to another person on Twitter and got a notice that the tweet couldn’t be sent because the site was potentially harmful: <a href="https://twitter.com/gortok/status/1368309384619626506?s=21" rel="nofollow">https://twitter.com/gortok/status/1368309384619626506?s=21</a>
Thanks for the writeup - I've learned some things. I have a site that allows user image uploads as well. I take each "image" and resize and compress it. If it's not an image after that, it gets rejected. Hopefully this is rejecting any malware.<p>I have gotten warnings from Google multiple times about hosting NSFW images (that is not the purpose of the site) that have ads on the page. This isn't Google disliking NSFW content - it's Google not liking NSFW content and ads together. Due to the multiple warnings, and worried about bans, I now manually review each image. This is actually easier than it sounds. I wrote myself a batch script and review in chunks before I allow Google to view any images.
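The re-encode-to-reject approach described above can look roughly like the sketch below. It assumes Pillow, and the size limit is a placeholder; decoding untrusted images still needs care (e.g. decompression bombs), so treat this as an illustration rather than a complete defense.<p><pre><code>
from io import BytesIO
from PIL import Image, UnidentifiedImageError

MAX_SIDE = 2048  # placeholder size limit

def sanitize_image(raw: bytes):
    """Decode, resize and re-encode an upload; return None if it isn't an image."""
    try:
        img = Image.open(BytesIO(raw))
        img = img.convert("RGB")              # drops alpha, exotic modes, most metadata
        img.thumbnail((MAX_SIDE, MAX_SIDE))   # shrinks in place, keeps aspect ratio
    except (UnidentifiedImageError, OSError):
        return None                           # not a decodable image: reject the upload
    out = BytesIO()
    img.save(out, format="JPEG", quality=85)  # fresh encode; original bytes are discarded
    return out.getvalue()
</code></pre>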
Website blacklists exist because of malware and phishing. Malware exists because our browsers and OS's are insecure. Phishing exists because our auth systems are insecure. Solving software security and auth will have wide positive effects on society.
Clicking the OP link I get a warning page from my ESET AV:<p>"Potential phishing attempt. This web page tries to trick visitors to submit sensitive personal information such as login data or credit card numbers."<p>Is this somehow related to the Google situation?
I made a WordPress site last year to start blogging, and this happened to it. The only reason I found out in this case was from visiting it in Edge, which showed a warning pop-up, so maybe it was a Microsoft flag instead of Google in this case. I never figured out the cause or a way to remedy it and just took the site offline because it was invisible to all search engines. Pretty disappointing.
So... The proposed mitigation is to use multiple top-level domains. At the same time, third party cookies probably won't be around much longer and already don't work for some browsers, so if you want to share state between pages, you need them to be on the same domain (but can be subdomains). There is no winning scenario here.
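For reference, the subdomain case works because a cookie can be scoped to the registrable domain, so every subdomain under it receives it - something a second TLD can't share. A minimal standard-library sketch; example.com and the attribute values are placeholders.<p><pre><code>
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"
# Scoped to the registrable domain, so app.example.com and
# uploads.example.com both receive it; a separate TLD would not.
cookie["session"]["domain"] = ".example.com"
cookie["session"]["secure"] = True
cookie["session"]["samesite"] = "Lax"
print(cookie.output())  # prints the Set-Cookie header line
</code></pre>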
If you want to avoid this issue with a Drupal site, the file_public_base_url setting is helpful and you might -- or might not, given the latest comment there -- need the patch from this issue: <a href="https://www.drupal.org/node/2754273" rel="nofollow">https://www.drupal.org/node/2754273</a>
Everyone in this thread is clearly stating that this is not a properly functioning system, and there is story after story of kafkaesque disasters for which Google is not held responsible at all.<p>The question I have is: what can anyone do to really change things? If we all agree this is a major issue, why can't we find a reasonable solution to it?
Another idea could be to have a separate dedicated companion domain (not a sub-domain) for communication, which can be mentioned on the main domain. At least if the main domain is affected, you still have a working channel that is a single source of truth for updates/communications.
Slightly offtopic: both "Drag and drop" and "Embed on page" examples at <a href="https://www.uploader.win/docs/" rel="nofollow">https://www.uploader.win/docs/</a> do not work.
To be quite honest, this seems like a case of Libel and possibly Tortious Interference on the part of Google/Alphabet.<p>Especially if you can show damages/customers cancelling service, I think this would be a hill to die on. Google et al have too much power, even over people and orgs that aren't even customers. It's high time we rein their powers in, find them strongly culpable for what they do (and what they change and then refuse to do), and consider breaking these monster companies up when they show they are against the public interest.<p>Were you, uploaderwin, given any notice (say, to abuse@uploader.win, admin@uploader.win or other appropriate addresses) prior to being effectively banned WRT Google? I'd go out on a limb and say you didn't. No, you have to be aware of the right page at Google, register yourself as an admin of the site, and hope they share what they consider abuse.<p>And frankly, you were lucky you got the social media escalation. You should never have had this happen... But here we are.
> <i>Don’t use base-64 images (or inline images)</i><p>For SVG, just paste the markup into your HTML. Browser support is excellent and it will weigh less than a base64-encoded string.<p>You will be able to style it using CSS as if it were regular HTML, use JS, etc.
Sounds to me like Google protected the Internet from your site after you got hacked, which alerted you to a severe security hole in your system, so what are you complaining about?
The comment from the Google engineer who helped the OP is not there.<p><a href="https://news.ycombinator.com/threads?id=daave" rel="nofollow">https://news.ycombinator.com/threads?id=daave</a><p>What the?
Keeping user content on a separate domain is something I'll remember from this. Suddenly it makes sense why social media sites have so many different domain names.
This is another argument for why we shouldn’t be using Google Safe Browsing. It’s frankly unacceptable that for every 5 (or fewer!) bad sites it blocks, we get something like this.
From what I see, Google should now be considered an active threat. You have to design your system knowing they will eventually act against you, either your domains or your accounts. And your chances of getting it fixed are slim, unless you’re able to generate some public outrage.<p>Really a disgusting company.