I really think programmers need a basic lesson in probability, because most of these solutions are complete nonsense with respect to information theory. Skynet still knows enough about you and everyone else even if you alone leave Facebook, and we need to figure out how to curb the powers that come with amassed data more generally.<p>1. Models don't need everyone's data; they just need <i>enough</i> data. Facebook, Google, et al. would have more than enough data for a lot of applications even if they only had 5-10% of the population. So at best, this ship-jumping will limit Facebook's ability to "micro-target" certain populations. In this case, P(whatever | privacy-concerned individual) will be a bit noisier, but Facebook will still have a really damn good idea about P(whatever | still a Facebook user).<p>2. Facebook can still use/sell the models it develops against its user base to target you even if you're not on the platform, unless you really think P(depressed), P(bad employee), P(insurance risk), or P(easily influenced by a specific type of marketing) have anything to do with the fact that you're not on Facebook. The minute someone asks a few questions about you in any setting, they'll be able to infer a ton more from the models alone. Lack of information about you will only add noise, and to make things worse, Facebook has enough data on privacy-conscious individuals anyway that it can plausibly fill the privacy-conscious holes in its data with a reasonable model.<p>3. P(privacy concerned) may be correlated with P(not manipulable), so you jumping ship isn't going to change the systemic issue everyone is concerned with, namely Facebook's and third-party customers' ability to morph society in whatever ways they see fit.<p>4. You can replace Facebook with Google/Amazon/Spotify/Chase/Bank of America/Hospital System/Government, and all of the above holds within the domain of data each one controls.
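To make point 1 concrete, here's a toy simulation (with made-up numbers: a hypothetical population of 1,000,000 users, 30% of whom have some trait "whatever") showing that a 5% sample pins down a population-level rate almost exactly. This is a sketch of the underlying statistics, not a claim about Facebook's actual models:

```python
import math
import random

random.seed(0)

# Hypothetical population: 1,000,000 users, 30% with trait "whatever".
N = 1_000_000
population = [1 if random.random() < 0.30 else 0 for _ in range(N)]
true_rate = sum(population) / N

# A 5% sample (50,000 users) is still a huge n.
sample = random.sample(population, N // 20)
est = sum(sample) / len(sample)

# Standard error of a proportion estimate: sqrt(p * (1 - p) / n).
se = math.sqrt(est * (1 - est) / len(sample))

print(f"true rate   = {true_rate:.4f}")
print(f"5% sample   = {est:.4f} +/- {2 * se:.4f}")
```

With n = 50,000, the two-sigma uncertainty on the estimate is under half a percentage point, so losing 90-95% of users barely hurts population-level inference; only fine-grained micro-targeting of small subgroups degrades.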