Setting aside the fact that I still cannot figure out what "evil thing" Facebook actually did, beyond vague descriptions of "definitely evil stuff, trust us":

Allegedly, Facebook lied because it did not make its internal research about its algorithms public. But I could make the same argument about any proprietary system. YouTube's recommendation algorithm is not public. Google's PageRank is not public. What effects do those have?

And what findings would the public care about? Any given investor could find just about anything unethical. Sure, I think most people would agree that if I see racist content on Facebook and someone decides to join the KKK as a result, that probably harmed society. But this reasoning can be extended indefinitely. Perhaps Facebook's recipe recommendations encourage people to eat more meat, so vegans would find them unethical. Perhaps its recommendations encourage people to drive more, so climate activists are upset. You open Pandora's box once you start considering these things. Would Facebook even want to do research if it had the potential to reveal something negative?

This is another case of "everything is securities fraud." Why should the SEC get involved? It seems like a terrible expansion of the SEC's powers to start judging IPOs based on the ethics of the organization. You start with a simple premise (investors have a right to know) but end up at a very broad and, IMO, overbearing conclusion.

And regarding the IPO and its share classes: if you're purchasing Facebook stock, know that you are not purchasing control. If you do not like that, do not invest.