This doesn't say anything. They invested in a tokenization company. That's not a new or interesting technology. What am I missing?<p>There are interesting data security companies emerging right now. For instance, Matthew Green is doing Zeutro, an ABE company. Think of ABE as Shamir's Secret Sharing on steroids: you can encrypt data and delegate it out to different people based on boolean expressions. That at least addresses a fundamental problem in data center encryption (the fact that server-side data encryption is "all or none" with respect to applications).<p>This, though? I assume the announcement means VGS is doing great in the market. Congratulations, I guess?
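To make the delegation idea concrete: an ABE ciphertext carries a boolean policy over attributes, and only keys whose attributes satisfy that policy can decrypt. A toy sketch of the policy side alone (the real scheme is pairing-based cryptography, and the attribute names here are invented):<p><pre><code># Conceptual sketch: this models only the access-policy semantics of
# attribute-based encryption, not the cryptography that enforces them.
def satisfies(policy, attributes):
    """Evaluate a nested boolean policy against a user's attribute set."""
    op, *clauses = policy
    if op == "ATTR":
        return clauses[0] in attributes
    if op == "AND":
        return all(satisfies(c, attributes) for c in clauses)
    if op == "OR":
        return any(satisfies(c, attributes) for c in clauses)
    raise ValueError("unknown operator: " + op)

# In real ABE this policy is bound to the ciphertext itself; only keys whose
# attributes satisfy it can decrypt the data.
policy = ("OR",
          ("AND", ("ATTR", "role:billing"), ("ATTR", "region:us")),
          ("ATTR", "role:auditor"))

print(satisfies(policy, {"role:billing", "region:us"}))  # True
print(satisfies(policy, {"role:engineer"}))              # False
</code></pre>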
EnvKey[1] takes a somewhat similar approach to securing credentials/config in that we effectively replace your config with a short token that can be set as an environment variable. This then 'expands' into your full configuration when it's needed.<p><i>But</i> the crucial difference is that instead of storing sensitive data in plaintext ourselves and then sending out access tokens, we manage an OpenPGP PKI/web-of-trust for you behind the scenes so that we're only storing encrypted data, and only the token (which we never see in its entirety) can decrypt it.<p>End-to-end encryption is <i>much</i> harder to implement for these kinds of use cases than simple tokenization, but there's also the huge benefit of not needing to trust your storage layer.<p>With credit cards, for example, an approach like this could hypothetically remove PCI-compliance as an issue entirely because no one is actually storing the cc # in the clear. To me this is a lot more interesting than simply shifting the burden of trust. That said, anything is better than our current status quo of spraying secrets all over the place.<p>1 - <a href="https://www.envkey.com" rel="nofollow">https://www.envkey.com</a>
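For concreteness, the general shape of that pattern is sketched below. This is <i>not</i> EnvKey's actual protocol (which manages an OpenPGP web of trust); it is a minimal stand-in using Fernet from the Python cryptography package, with an invented CONFIG_TOKEN variable and placeholder values:<p><pre><code># Sketch of the end-to-end-encrypted config pattern: the storage layer only
# ever sees ciphertext; the key lives in an environment variable client-side.
import json
import os
from cryptography.fernet import Fernet

# Provisioning time: encrypt the config locally, upload only the ciphertext.
key = Fernet.generate_key()                    # stays with the client
config = {"DATABASE_URL": "postgres://db.example/app", "API_KEY": "placeholder"}
ciphertext = Fernet(key).encrypt(json.dumps(config).encode())
# `ciphertext` is all the storage service holds; it cannot read the config.

# Run time: the short token in the environment "expands" into the full config.
os.environ["CONFIG_TOKEN"] = key.decode()      # normally set in your deploy env
plaintext = Fernet(os.environ["CONFIG_TOKEN"]).decrypt(ciphertext)
for name, value in json.loads(plaintext).items():
    os.environ[name] = value                   # decrypted only on the client
</code></pre>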
Anonymous account, because of reasons.<p>I interviewed and was offered a job at this company. I turned it down because they had some of the most morally bankrupt leadership I have ever seen in a startup. Frankly, it made me less likely to interview with YC companies at all.<p>Just a quick list of giant red flags:<p>1. They are violating visa laws by having their employees in Ukraine lie on their applications and say they are coming into the US for tourism instead of business.<p>2. The Ukrainian developers they bring out here are kept on their Ukrainian salary, with a small stipend for housing. So they get to live in the Bay Area on an Eastern European salary.<p>3. Their CEO actually bragged to me about how little they were paying the only female developer they had in the office. He thought it was hilarious.<p>4. When they made an offer, they refused to tell me how many shares had been issued for the company or what percentage the offer represented, making their offer impossible to decipher. It was also about 15% lower than the numbers they had discussed with me beforehand.<p>If I were an investor in this company, I would demand the removal of the CEO and put their CTO in charge.
I can see why this is an attractive idea to fund, but in my opinion it's the wrong way to resolve the problems highlighted in the article.<p>This is not a technical problem, it's a usability problem. We have had the cryptography necessary to technically fix this for a long time. Replace the single human-memorable token (SSN) with a unique public/private key pair. Then you provide safe authentication by signing verification messages with your private key, without placing that private key into the hands of a centralized vendor (like Very Good Security).<p>The obstacles to this solution are 1) buy-in, to either get the government to do this or to bypass it with this solution in private industry, and 2) usability, to abstract as much of the technical signing process away from the user as possible. But this <i>is</i> a better solution. From what I can understand of Very Good Security's website, it's just more of the same. It wants to become the secure gatekeeper of sensitive data instead of developing a novel means of obviating that problem entirely.<p>The <i>real</i> company to fund is one which takes inspiration from an existing cryptographic protocol - like Apple Pay's or Android Pay's - and expands it to handle identity verification and one-time payment authorization without requiring an SSN or canonical credit card.
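A minimal sketch of that keypair idea, using Ed25519 from the Python cryptography package (the challenge/response framing is just illustrative, not an existing standard):<p><pre><code># Sketch: identity backed by a keypair instead of a shared secret like an SSN.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrollment: the user generates a keypair; only the public key is registered.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()          # what the verifier stores

# Verification: the verifier sends a fresh random challenge...
challenge = os.urandom(32)

# ...the user signs it (the private key never leaves their device)...
signature = private_key.sign(challenge)

# ...and the verifier checks the signature against the registered public key.
try:
    public_key.verify(signature, challenge)
    print("identity verified")
except InvalidSignature:
    print("verification failed")
</code></pre>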
H(ssn) just kicks the problem downstream.<p><pre><code> - If H is a simple cryptographic hash function, it's not resistant
to brute-force attacks to recover the SSN
 - It's not revocable
</code></pre>
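To make the brute-force point concrete: there are at most 10^9 possible SSNs, so an unsalted hash can be inverted by exhaustive search in minutes on a laptop. A minimal sketch, assuming H is plain SHA-256 of the nine-digit string:<p><pre><code># Why H(ssn) is weak: the search space is tiny.
import hashlib

def H(ssn: str) -> str:
    return hashlib.sha256(ssn.encode()).hexdigest()

leaked = H("078051120")  # a hash found in a hypothetical breach dump

# Exhaustive search over the (at most) 10^9 possible SSNs; the full space takes
# on the order of minutes in Python, narrowed to a demo range here.
for candidate in range(78_000_000, 79_000_000):
    ssn = "%09d" % candidate
    if H(ssn) == leaked:
        print("recovered SSN:", ssn)
        break
</code></pre>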
What we need is something more akin to a Credit Card number. Something like an abstraction layer. It might even be implementable as a UUID.<p>If you need to revoke it, you can do so since it's not cryptographically tied to anything.<p>Failing that, a base32-encoded random string (without = padding) with an optional checksum would do the trick.
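For illustration, minting that kind of opaque identifier takes only a few lines; the checksum scheme below is arbitrary, and the mapping table is a stand-in for whatever the issuer would actually store:<p><pre><code># An opaque, revocable identifier: random rather than derived from the SSN,
# so revocation is just deleting the mapping row.
import base64
import hashlib
import secrets
import uuid

def new_token() -> str:
    body = base64.b32encode(secrets.token_bytes(20)).decode().rstrip("=")  # no '=' padding
    check = hashlib.sha256(body.encode()).hexdigest()[:4]                  # optional checksum
    return body + "-" + check

# Alternatively, a plain UUID works fine as the abstraction layer.
alias = str(uuid.uuid4())

# Issuer-side mapping; to revoke, delete the row and mint a new identifier.
vault = {new_token(): "078-05-1120", alias: "078-05-1120"}  # two aliases, same record
print(list(vault))
</code></pre>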
I find it interesting that Stefan Brands [1] solved the zero-knowledge authentication problem a couple of decades ago and his tools are still not widely applied. Given my bias against imaginary property, I believe that's because his patents on them are still valid -- and apparently owned by Microsoft at the moment [2].<p>[1] <a href="https://en.wikipedia.org/wiki/Stefan_Brands" rel="nofollow">https://en.wikipedia.org/wiki/Stefan_Brands</a>
[2] <a href="http://financialcryptography.com/mt/archives/001011.html" rel="nofollow">http://financialcryptography.com/mt/archives/001011.html</a>
This type of thing is better off delivered as an SDK rather than a 3rd-party API. Sending sensitive data to VGS for encryption would be a non-starter for many companies; the probability of data getting stolen is the same for VGS as for anyone else.
Why don't we already have apps on our smartphones for this?<p><pre><code> - $PROVIDER wants the following data: $LIST_OF_OPTIONAL_AND_REQUIRED_ITEMS
- You select which you can provide
 - If the data to be provided includes "billing identifier" or "credit file
   identifier" (and especially if the identifier is, say, SSN), then first
   your app obtains a new identifier from the reporting agency or your
   insurance carrier, and *that* number is given to $PROVIDER
</code></pre>
This gives more control back to the customer/patient and eliminates (yet another) treasure trove of data for attackers to go after.
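A rough sketch of what the app-side flow could look like; everything here is hypothetical (the request_alias call, the issuer, and the field names are invented for illustration):<p><pre><code># Hypothetical sketch of the consent flow described above.
import uuid
from dataclasses import dataclass

SENSITIVE = {"billing identifier", "credit file identifier", "SSN"}

@dataclass
class DataRequest:
    provider: str
    required: set
    optional: set

def request_alias(issuer: str, provider: str) -> str:
    """Stand-in for asking the reporting agency or insurance carrier for a
    single-purpose identifier scoped to this provider."""
    return issuer + ":" + provider + ":" + str(uuid.uuid4())

def respond(req: DataRequest, user_approved: set, profile: dict) -> dict:
    response = {}
    for item in user_approved & (req.required | req.optional):
        if item in SENSITIVE:
            # Never hand over the real identifier; mint a scoped alias instead.
            response[item] = request_alias("credit-bureau", req.provider)
        else:
            response[item] = profile[item]
    return response

req = DataRequest("acme-clinic", required={"name", "billing identifier"}, optional={"email"})
print(respond(req, user_approved={"name", "billing identifier"}, profile={"name": "Jane Doe"}))
</code></pre>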
It's mind-boggling to me that this didn't already exist. I wonder if that's because there's a lot of low hanging fruit in security/privacy, low hanging fruit in healthcare, or a combination of the two.
The biggest problem with this solution is having to trust VGS's ability to secure our sensitive data, which is the hardest thing to get right in the first place. All it takes is a disgruntled employee (given the company's practices cited in the posts below) siphoning out the data. They are also far too small to pay for the damage, which renders any liability claim useless. However, given the trivial nature of this problem and my interest in it, I decided to open-source a solution that avoids the liability concern.
If an organization is deciding between interacting with VGS hashes/tokens (and having to proxy requests through VGS) and deploying a secret store like HashiCorp Vault, what are the pros and cons?<p>> When it’s time to bill your insurance company, their “reimbursement” code goes through VGS which “reveals” the token and sends the real version to the insurance company.<p>Forgive me if I am wrong, but that means all 3rd-party integrations that require the sensitive values must go through VGS, correct?
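Conceptually, the "reveal" step described above works like a tokenization proxy; the sketch below is a generic illustration of that pattern, not VGS's actual API (the token format and vault are invented):<p><pre><code># Generic tokenization/"reveal" proxy sketch. Your systems hold only tokens;
# outbound requests to third parties pass through the proxy, which swaps the
# real values back in.
import re
import secrets

vault = {}  # token -> real value, held only by the proxy

def tokenize(value: str) -> str:
    token = "tok_" + secrets.token_hex(8)
    vault[token] = value
    return token

def reveal_outbound(payload: str) -> str:
    """Replace any tokens in an outbound request body with their real values."""
    return re.sub(r"tok_[0-9a-f]{16}", lambda m: vault[m.group(0)], payload)

member_id = tokenize("A123456789")          # what your own database stores
claim = '{"member_id": "%s", "amount": 120.00}' % member_id
print(reveal_outbound(claim))               # what the insurance company receives
</code></pre>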
I'm not quite sure what this is exactly, but it sounds like they are providing a security "service", so all the "real identifiers" will be stored on their servers?<p>Why should an entire country trust them? I'm not saying they wouldn't be an improvement over Equifax, but it still sounds far from ideal. I think a hardware token would be preferable.