Not excusing this in any way, but this app is apparently a fairly junior effort by university students. While it should make every effort to follow good security (and communication) practices, I'd not be too hard on them considering how some big VC-funded "adult" companies behave when presented with similar challenges.<p><a href="https://georgetownvoice.com/2025/04/06/georgetown-students-create-cerca-a-new-dating-app-that-swipes-right-on-safety/" rel="nofollow">https://georgetownvoice.com/2025/04/06/georgetown-students-c...</a>
I worry about my own liability sometimes as an engineer at a small company. So many businesses operate outside the regulated industries where PCI or HIPAA apply. For smaller organizations, security is just an engineering concern, not an organizational mandate. The product team is focused on features, the PM is focused on the timeline, QA is focused on finding bugs, and so on, but rarely is there a voice of reason speaking up about security. Engineers are expected to deliver tasks on the board and little else. If the engineers can make the product secure without hurting the timeline, great. If not, the engineers end up catching heat from the PM or whomever.<p>They'll say things like...<p>"Well, how long will that take?"<p>or, "What's really the risk of that happening?"<p>or, "We can secure it later, let's just get the MVP out to the customer now"<p>So, as an employee, I do what my employer asks of me. But if somebody sues my employer over some hack or data breach, am I going to be personally liable because I'm the only one who "should have known better"?
Oops! Nice find!<p>To limit his legal exposure as a researcher, I think it would have been enough to create a second account (or ask a friend to create a profile and get their consent to access it).<p>You don't have to actually scrape the data to prove that there's an enumeration issue. Say your id is 12345, and your friend signs up and gets id 12357 - that should be enough to prove that you can find the id and access the profile of any user.<p>As others have said, accessing that much PII of other users is not necessary for verifying and disclosing the vulnerability.
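To make that concrete, here's a minimal sketch of the consent-based check described above (all IDs are hypothetical): two accounts created minutes apart that land on nearby numeric IDs are strong evidence of sequential, guessable identifiers, and a single fetch of the friend's consented profile confirms the missing authorization check.

```python
# Sketch: demonstrating an ID-enumeration issue with minimal data access.
# The IDs below are hypothetical; in practice you would only ever fetch
# a profile you own or have explicit consent to access.

def ids_look_sequential(my_id: int, friend_id: int, window: int = 100) -> bool:
    """Two accounts created close together in time that receive nearby
    numeric IDs strongly suggest sequential, guessable identifiers."""
    return 0 < abs(friend_id - my_id) <= window

# e.g. your account is 12345 and a friend who signs up right after
# gets 12357 -- that alone demonstrates the enumeration risk:
# ids_look_sequential(12345, 12357)
```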
This is a pretty confusing writeup.<p>><i>First things first, let’s log in. They only use OTP-based sign in (just text a code to your phone number), so I went to check the response from triggering the one-time password. BOOM – the OTP is directly in the response, meaning anyone’s account can be accessed with just their phone number.</i><p>They don't explain it, but I'm assuming that the API is something like api.cercadating.com/otp/<phone-number>, so you can guess phone numbers and get OTP codes even if you don't control the phone numbers.<p>><i>The script basically just counted how many valid users it saw; if after 1,000 consecutive IDs it found none, then it stopped. So there could be more out there (Cerca themselves claimed 10k users in the first week), but I was able to find 6,117 users, 207 who had put their ID information in, and 19 who claimed to be Yale students.</i><p>I don't know if the author realizes how risky this is, but this is basically what weev did to breach AT&T, and he went to prison for it.[0] Granted, that was a much bigger company and a larger breach, but I still wouldn't boast publicly about exploiting a security hole and accessing the data of thousands of users without authorization.<p>I'm not judging the morality, as I think there should be room for security researchers to raise alarms, but I don't know if the author realizes that the law is very much biased against security researchers.<p>[0] <a href="https://en.wikipedia.org/wiki/Goatse_Security#AT&T/iPad_email_address_leak" rel="nofollow">https://en.wikipedia.org/wiki/Goatse_Security#AT&T/iPad_emai...</a>
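If the "send code" response really does contain the OTP, a single response body is enough to show the flaw. A hedged sketch (the field names are guesses, not Cerca's actual schema): any response field holding the code itself means knowing a phone number is enough to log in as that user.

```python
import json
import re

# Hypothetical response body from a "send OTP" endpoint, based on the
# writeup's description; the field names are guesses.
def response_leaks_otp(body: str) -> bool:
    """A 'send code' endpoint should only acknowledge the send.
    If any field in the response looks like the numeric code itself,
    an attacker who knows a phone number can log in as that user."""
    data = json.loads(body)
    return any(re.fullmatch(r"\d{4,8}", str(v)) for v in data.values())
```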
I had a similar experience with another dating app, although they never got back to me. When I tried to get the founder's attention by changing his bio to "contact me" text, they just restored a backup lol<p>Years later I saw their Instagram ad and tried to see if the issue still existed, and yes, it did. Basically, anyone with knowledge of their API endpoints (easy to find by proxying the app's traffic) has full admin capabilities and access to all messages, matches, etc.<p>I wonder if I should go back and try again... :-?
People need to be forced to think twice before collecting information as sensitive as a passport, or even just an address. This sort of thing cannot be allowed to be brushed off as just a bunch of kids making an app.
<a href="https://yaledailynews.com/blog/2025/04/24/yale-student-exposes-data-leak-in-college-dating-app/" rel="nofollow">https://yaledailynews.com/blog/2025/04/24/yale-student-expos...</a><p>^another article on this
I would like to see laws that make storing PII as dangerous as storing nuclear waste. Leaks should result in near-certain bankruptcy for the company and legal jeopardy for the people responsible.<p>That’s the best way I can think of to align incentives correctly. Right now there’s very little downside to storing as much user information as possible. Data breach? Just tweet an apology and keep going.
I'm not sure how I hadn't heard about Charles Proxy for iPhone before! I've done some light pentesting before and had to resort to manually grepping for strings throughout the app binary. Glad to have found out about this, especially for when apps are iOS-only.
FYI: This is more common than you think.<p>I briefly worked with a company where I had to painfully explain to the lead engineer that you can't trust anything that comes from the browser, because an attacker can curl whatever they want.<p>Our relationship deteriorated from there. Needless to say, I don't list the experience on my resume.
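The point above can be sketched in a few lines: any check done in the browser is advisory only and must be repeated server-side, since an attacker can skip the browser entirely and POST whatever they like (the field names here are illustrative).

```python
# Server-side revalidation sketch: never trust client-submitted data,
# even if the browser UI already "validated" it. Field names are
# illustrative, not from any particular app.

def validate_signup(form: dict) -> list[str]:
    errors = []
    age = form.get("age")
    # The browser may enforce this in JS, but curl won't.
    if not isinstance(age, int) or not (18 <= age <= 120):
        errors.append("age must be an integer between 18 and 120")
    # Privilege-granting fields must never be accepted from the client.
    if form.get("role") == "admin":
        errors.append("role not assignable by client")
    return errors
```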
Hot take: just like real engineers, there should be a Software Engineer licensing exam that's legally required before you can handle PII ... because this is the alternative.<p>Before I was allowed to hand out juice cups at my kids' preschool, I had to do a 2 hour food safety course and was subject to periodic inspections. That is infinity% more oversight than I received when storing highly sensitive information for ~10^5 users.
If they're sending the OTP back in the response, it's presumably because the OTP is being checked client-side, so you might have been able to just call the authentication endpoint directly.
> Since then, I have reached out multiple times (on March 5 and March 13) seeking updates on remediation and user notification plans. Unfortunately, as of today’s publication date (April 21, 2025), I have been met with radio silence. To my knowledge, Cerca has not publicly acknowledged this incident or informed users about this vulnerability, despite their earlier assurances to me. They also never followed up with me following our call and ignored all my follow up emails.<p>there can always be another side to this story but also wtf. this kind of shit makes me want to charles-proxy every new app i run because who knows what security any random startup has
There's no penalty for failing at privacy and security so companies would rather play the odds that they will be fine than invest in proper practices. Alex says Cerca is being misleading when it comes to encryption but it seems to me they are outright lying and will likely face no consequences for it. In a more just world, this would trigger so many regulatory and compliance audits.
There's probably some benefit to having people who will tell you about security issues rather than exploit them. You can't really blame businesses / app devs for wanting to be left alone though.
In a data conscious world, the complete and utter disregard for PII and lack of competency in design and implementation would result in catastrophic business failure.<p>They may have "patched" the ability to exploit it in this way, but the plaintext data is still there in that same fragile architecture and still being handled by the same org that made all of these same fundamental mistakes in the first place. Yikes.
FYI the Hinge app works the same way<p>I requested my data and all the image URLs are publicly accessible - and the URLs provided are both your own images and the images of anyone who’d ever viewed your profile
this is useful! I am considering building a dating app with its own twist and seeing the api endpoints this team went with is useful<p>under the hood they're all the same, just with different theming and market segmentation
I'm not sure I understand properly. Did he try to hack a random service he encountered? Is that even legal? Where I live (Sweden), it's definitely not.
They might not have a playbook for handling such reports. That doesn't mean they shouldn't respond. They're also probably sh*t scared of legal ramifications, but not responding only makes them look worse. Nonetheless, it's amazing how many of these products and services don't put security and user privacy first.<p>Open for discussion: what would make them pay attention?
I'm flagging this submission. Look at the author[0], at the "Georgetown students..." (won't backlink again) post linked below stating that Cerca was 2 months old in April, and OP's post from April stating that they hacked this thing two months earlier.<p>It's some self-promo or whatever scheme/scam bullshit.<p>[0] <a href="https://news.ycombinator.com/from?site=alexschapiro.com">https://news.ycombinator.com/from?site=alexschapiro.com</a>
Imagine every time you entered a specific physical location you would increase your exposure to a detrimental disease. After only entering a couple of times you've contracted this disease, and each subsequent visit to this place makes the illness worse.<p>A few people try to warn you, but you choose not to listen and, in fact, you recruit the government to make it easier to enter such places, with safeguards that don't actually protect you from the disease and encourage you to enter more frequently.<p>You're then surprised that you're ill to the brink of death and blame the location as the sole cause of your ills. Yes, the location is to blame, but so are you for continuing to enter even after getting sick.<p>Why do you do this? Because you want something. Convenience, pleasure, a distraction, etc. But you refuse to acknowledge that it's killing you.<p>This is how we should view optional services that require us to give up our PII in exchange for hours of attention-grabbing content. They're designed to sell your eyeballs and data to advertisers. You know this already, but you can't say no. You're sick and refuse to acknowledge it.