I disagree with this; it's making a big, complex mess trying to solve <i>for the wrong requirements</i>.<p>The sites that care the most about "a real human" would never use this system anyway, because they need to know your real identity, like a bank does.<p>For the rest, they don't actually need to know it's "a real human", but rather that an account <i>represents a significant investment</i>. That way they can have confidence that (A) the operator will care about following the rules in order to keep it and (B) it is not feasible to run too many sockpuppets.<p>_____________<p>So... Forget digging up your birth certificate or getting a passport, forget government officials interviewing you across a desk, forget the requirement for fancy digital wallets, and forget the overall privacy nightmare of biometric data.<p>Instead, simply imagine your local library along with... <i>a vending machine!</i><p>How does it work?<p>1. Pop $X into the machine. [0]<p>2. Tap the screen to select which charity gets the profit.<p>3. Take the one-time code(s) it prints out.<p>4. Register on the website, using a code to prove you have "skin in the game". The website validates and burns the code (rough sketch below). [1]<p>Easy!<p>[0] Yes, I know this imposes an <i>overt</i> price on participation, but I'd argue it's <i>less</i> onerous than what is described in TFA, which imposes a covert burden of taxes, application fees, travel expenses, and interview time.<p>[1] While someone could theoretically make a site that <i>says</i> it needs codes but actually collects them for later use, that would be easy to detect simply by entering a fake code: a site that isn't really validating will accept it, and everybody would have an incentive to try that test.
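<p>To make step 4 concrete, here's a rough sketch of what "validate and burn" might look like on the website's side, assuming (purely hypothetically) that the machine operator publishes SHA-256 hashes of the codes it has printed, so sites can check codes without being able to mint their own; the helper names are mine, not anything real:

    import hashlib
    import secrets

    def _h(code: str) -> str:
        return hashlib.sha256(code.strip().encode()).hexdigest()

    # Hypothetical setup: the operator publishes hashes of printed codes; the
    # site only ever handles the plaintext code the user types at signup.
    demo_code = secrets.token_hex(8)   # stands in for one printed code
    issued_hashes = {_h(demo_code)}    # published by the machine operator
    spent_hashes = set()               # codes this site has already burned

    def spend_code(code: str) -> bool:
        """Accept a real, unused code exactly once; reject fakes and replays."""
        digest = _h(code)
        if digest not in issued_hashes:
            return False               # fake code -- the probe from [1]
        if digest in spent_hashes:
            return False               # already burned for another account
        spent_hashes.add(digest)       # burn it
        return True

    print(spend_code("not-a-real-code"))  # False: fakes are rejected
    print(spend_code(demo_code))          # True:  first use succeeds
    print(spend_code(demo_code))          # False: replays are rejected

<p>Whether the check runs against a published hash list, a signed token from the operator, or some online API is a detail; the only point is that each $X printout can be spent once.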