<i>Uber’s logic is part of a wider pattern of using technology to sow doubt in how we name what is easily observable. Facebook, for example, monitors users’ posts to detect indications of suicidal behavior and then stages interventions, such as calling the police, as Natasha Singer reported. However, by refusing to call this “healthcare” or, potentially, “practicing medicine,” Facebook is able to play by a different set of rules, like running experimental health research algorithms on unwitting users.</i>

<p>This.</p>

<p>Classification should be based not on the means but on the ends.</p>