That article doesn't summarize the ruling very well. Here's a short tl;dr of the actual ruling[0]:<p>Part A: Privacy settings<p>- Facebook tried to claim that it is only subject to Irish law. Court disagrees since Facebook operates in Germany, so local law applies. [side note: this kind of confusion is exactly why the GDPR is needed]<p>- Law states that the imprint (Impressum) must be "easily" accessible. Court found this not to be the case (it took three clicks and was hidden behind a link called "explanation of your rights and duties").<p>- Law states that explicit, informed consent is necessary for the kind of data processing Facebook does. Facebook pointed users to the privacy settings page where all settings were enabled by default. Court found that this constitutes neither explicit nor informed consent - the settings would have to be opt-in, or the user needs to be explicitly informed about the full extent of how his data is used ("without any doubt").<p>Court explicitly states that presenting an opt-out <i>after</i> registration and login is not sufficient, especially if it is presented as an optional "privacy tour" that most users are going to ignore.<p>- Plaintiff stated that Facebook incorrectly claimed it was "free forever", when users were in fact incurring hidden costs by volunteering their personal data ["paying with their data"]. Court strongly disagrees - no money is changing hands, after all. They do recognize that there is a counter-performance, but it's immaterial and as such does not constitute a "hidden cost". Court basically states that the meaning of "free" is not up for debate.<p>Part B: Terms of Use<p>- Terms of use state that the user "acknowledges" having "read" the privacy policy during registration. This is invalid in two different ways - a mere "acknowledgment" is insufficient, since it puts the burden of proof on the user, and since parts of the privacy policy are invalid, the user can't legally agree to it in its entirety anyway.<p>Court explains that "read and understood" clauses like this one are invalid. Clearly, the user didn't actually read and understand the whole thing - but the language in the terms forces him to admit he did, which would disadvantage him by implying informed consent about everything in it when he didn't explicitly consent to anything.<p>- There's a clause in the ToU stating that the user "agrees to use his real name". This does not constitute informed consent since the user isn't properly informed - Facebook does not state <i>why</i> his real name is required and how it will be used.<p>The court states that it is questionable whether a real-name policy is legal at all, underlining the need for proper consent due to the significant consequences of volunteering one's real name.<p>- Same for "agreeing that personal data is transferred to the US" - no explanation of why data is transferred, what it will be used for, or even what data is transferred. In addition, there's no indication of which data protection standards are applied.<p>- Similar case for "agreeing that the profile picture is used [...] commercially": no informed consent since the user is not informed about the consequences.<p>... and a few more clauses where the court finds that no informed consent is given by the user due to very broad clauses with little explanation.<p>- It's OK to have the user agree that he's 13 years or older.
Facebook cannot possibly check whether it's true, and the age doesn't matter anyway since the contract would be valid even if it weren't the case.<p>- Plaintiff complained about a few informational clauses in the privacy policy. Court rejected this since they weren't part of the terms of use due to their purely informational character (the user isn't agreeing to anything).<p>This was a very interesting read. It is very clear that the courts take the requirement of "informed consent" very seriously, as they should. It is not enough to present the user with a 100+ page privacy policy and have them agree to it; the policy actually needs to be presented in such a way that the user realizes what they're agreeing to.<p>[0]: <a href="https://www.vzbv.de/sites/default/files/downloads/2018/02/12/facebook_lg_berlin.pdf" rel="nofollow">https://www.vzbv.de/sites/default/files/downloads/2018/02/12...</a> (interesting part is page 22 onwards)
Wait until the GDPR is in force in May and German and other EU courts will rule FB to death.<p>IDK how FB will ever be compliant with the GDPR and survive the huge upcoming fines in the long term, or, in the worst case, withdrawal from these markets.
German news reports have a very different angle on this.<p>German law forbids a real-name policy: services have to allow pseudonymous usage, and advertise that fact, as long as it's technically possible and feasible.<p>What German law says is clear; what wasn't clear is whether Facebook is bound by it. The court ruled that it is.
FB have taken out huge newspaper and billboard ads in Belgium, pretending to care about your privacy. They're trying to divert attention from their real privacy issues by saying "you can choose who can see your stuff".
I'd argue there is no way to properly communicate to the average Facebook user how their data is being collected and used in a way that is transparent but not confusing.<p>For example, explain to someone who is technologically illiterate how the act of "tagging" your friend in a photo offloads image-labeling work to train a deep neural network to recognize your friend's face.<p>If you radically simplify the issue in line with the GDPR by saying something like:<p>"Whenever you tag a friend in a photo, you help teach our computers to recognize what your friend's face looks like"<p>it makes it seem way more terminator/ominous than it is to the average person.<p>OK, now do the same thing with all of the NLP, voice, etc. data points.<p>I just don't see how Facebook is going to deploy a worldwide education effort on big data effectively.
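To make that concrete, here is a rough, hypothetical sketch (the class names and fields are invented for illustration; this is not Facebook's actual pipeline) of how each user-supplied tag doubles as a supervised training example:

```python
# Hypothetical illustration: a user "tag" reinterpreted as a labeled training example.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Tag:
    photo_id: str
    tagged_user_id: str                   # the friend the tagger identified
    face_box: Tuple[int, int, int, int]   # x, y, width, height the tagger drew

@dataclass
class TrainingExample:
    photo_id: str
    face_box: Tuple[int, int, int, int]
    label: str                            # ground-truth identity, supplied for free by the user

def tags_to_training_set(tags: List[Tag]) -> List[TrainingExample]:
    """Every tag is, in effect, a human-annotated (face crop, identity) pair."""
    return [TrainingExample(t.photo_id, t.face_box, t.tagged_user_id) for t in tags]

if __name__ == "__main__":
    tags = [Tag("photo_123", "user_alice", (40, 60, 128, 128))]
    for example in tags_to_training_set(tags):
        print(example)  # exactly the kind of example a face-recognition model trains on
```

The hard part isn't the mechanism; it's explaining that mechanism to someone who has never heard of labeled data.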
The ruling and the article only mention Facebook, but I don't see how everything in the ruling wouldn't apply to every single app/website that does targeted advertising.
1. Try not to get hacked.<p>2. Don't sell your soul to marketing parasites.<p>Seems like common sense, really, but it has (US) companies scrambling. Good. We are GDPR!
A thought I was having recently: any communication medium (messengers, social networks, email services, contact apps, etc.) that does not use end-to-end encryption and has access to the data may be in violation of privacy/data laws, or of moral obligations that will soon become law.<p>Take email, for example: people can, and do, send each other everything, including documents with sensitive information, PII, account/payment numbers, etc. - which the providers are likely not storing in PCI-compliant or otherwise responsible ways.<p>Social networks run platforms that let <i>others</i> provide information about <i>you</i> without your agreement: whether you're on Facebook or not, you're on Facebook.<p>Same with contact apps where you fill in all your friends' contact info and then simply pass it all to a company without the consent of your contacts: mass legal doxxing.<p>Any communication medium where the platform has access to the contents of the communication might be susceptible to serious future legal/moral ramifications. There is a non-zero possibility that today's business models might be fully illegal at some point. Perhaps they'll be replaced by decentralization/encryption/privacy/crypto/etc.
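On the end-to-end point, here is a minimal sketch using PyNaCl (one possible library, chosen purely for illustration; real messengers use their own protocols) of why an end-to-end design means the relaying platform never sees plaintext:

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# The platform only ever stores/relays `ciphertext`, so it has no access to the data.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"account number: 12345")  # all the server ever sees

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'account number: 12345'
```

Under that model the provider can't read, mine, or leak the contents even if it wanted to (metadata is another story).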
If the appeal doesn't go Facebook's way, what is the resolution to this? It sounds like they'll just have to update their terms of service to say that you agree to allow Facebook to use your data in XYZ ways. Of course, that'll be buried in the fine print and no-one will even notice.
I think if I were creating a new social media website today I'd probably not set up any presence in the EU. The sheer quantity of fines being handed out for vaguely specified "crimes" makes it a deeply unattractive business environment, and it seems to be getting worse. I remember that when Facebook was new, one of its big competitive advantages was its easy and comprehensive privacy controls. I haven't seen other social networks go significantly further in the years since. Now Germany - having failed to clone Facebook domestically (StudiVZ) - sits around extracting money on the grounds that users somehow did not consent to their data being used when they directly uploaded it to the site.<p>I don't see the Valley's hold on social networking loosening any time soon. For all its faults, the USA doesn't constantly fine its firms for not doing "enough", whatever that means.
I think it's interesting that more antitrust lawsuits seem to be brought and won against big SV companies in Europe recently. Is it just happening now because bureaucracy takes its time, or is it because there is more political will to act against American companies now that EU-American relations have worsened since Trump came into office?