Congratulations, to a point. In the jurisdiction where I worked on these problems inside government, the mandatory requirement was that the identity of every individual who accessed a personal health record was logged with the record they accessed (sketched at the end of this comment). Those systems caught some pretty egregious abuses and attempted abuses over the years.

The controls on this system, as described by OpenSAFELY, are far superior to the Canadian ones I am aware of. Some researchers there leveraged the pandemic crisis to push that scheme through, as a way to squeeze the data toothpaste out of the tube before technologies like FHE and LLMs matured enough to create a more complete screen over individual records. As I saw it, the Canadian scheme removed hard controls that prevented abuse and replaced them with discretionary controls, without clear liability for their failure.

I am very cautious about the benefits of this kind of data aggregation, because the idea that researchers are uniformly altruistic or respectful of individual privacy is a myth. There is also the risk that it moves governance of personal health information repositories out of government and into academia, where there are no Access to Information/FOIA requirements, no background checks on the people accessing the health data and systems, no legally binding limits on use, and no clear definition of the responsibilities of custodians, agents, providers, and other formal roles.

That said, OpenSAFELY's public logs could be an unbelievably good control against this, and they show a level of stewardship and respect for public trust that mirrors the principles and ideals held by privacy professionals who have spent their careers on these problems.

I'm used to building for hostile environments and for huge user bases that are at least 3% criminal, and the only thing those people understand is consequences. So, cautiously, congratulations, but if this data gets used for digital identity, social credit, domestic passports, or restrictions on movement, association, and other basic freedoms, that is on you.
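To make the access-logging control mentioned at the top concrete, here is a minimal sketch. This is my own illustration, not OpenSAFELY's or any particular jurisdiction's implementation, and the table, column, and function names are hypothetical: an append-only log that records who accessed which record, written before anything is returned.

    # Minimal sketch of per-record access logging (hypothetical names throughout).
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect("audit.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS record_access_log (
            id          INTEGER PRIMARY KEY AUTOINCREMENT,
            accessed_at TEXT NOT NULL,   -- UTC timestamp of the access
            user_id     TEXT NOT NULL,   -- identity of the person accessing
            record_id   TEXT NOT NULL,   -- the personal health record touched
            purpose     TEXT NOT NULL    -- stated reason, reviewable later
        )
    """)

    def fetch_record(user_id: str, record_id: str, purpose: str) -> dict:
        """Log the access before returning anything; the log write is not optional."""
        conn.execute(
            "INSERT INTO record_access_log (accessed_at, user_id, record_id, purpose) "
            "VALUES (?, ?, ?, ?)",
            (datetime.now(timezone.utc).isoformat(), user_id, record_id, purpose),
        )
        conn.commit()
        # ... actual record retrieval would happen here ...
        return {"record_id": record_id}

    # Every call leaves a row auditors can later match against the record accessed.
    fetch_record("clinician-042", "patient-1234", "routine care review")

The schema matters less than the guarantees around it: the write happens on every access, the log cannot be edited after the fact, and someone independent can read it, which is what publishing the logs, as OpenSAFELY does, adds on top.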