This is fun. Does this potentially mean there are analytics firms out there with tons of "screenshots" containing easily unmasked credit card info, probably sitting somewhere in an S3 bucket? That's a new attack vector I'd never thought about.
What’s the data advantage of <i>taking and sending a screenshot of the app</i> instead of just sending user events (e.g. field filled, field selected, form submitted)?<p>A screenshot literally unstructures the data.
<a href="https://www.smartlook.com" rel="nofollow">https://www.smartlook.com</a><p><a href="https://www.appsee.com/" rel="nofollow">https://www.appsee.com/</a><p><a href="https://uxcam.com/" rel="nofollow">https://uxcam.com/</a><p><a href="https://userx.pro/" rel="nofollow">https://userx.pro/</a><p>That's just a small sample of services that allow you to record the user's screen or take screenshots). App session replay software has existed for years, and of course, they capture all the things that are going on the app including checkouts and profile data (unless you flag those screens on the SDK implementation).<p>Like someone already pointed out, that video or image will likely be stored somewhere (an S3 bucket or some static storage). I think anyone who is implementing these type of SDKs on their app needs to do their due diligence, and not push sensitive data to these third parties.
This write-up doesn't actually state where these unobfuscated images came from, so it's not clear to me where (or whether) there are actually unobfuscated images in Air Canada's system. Tools like Glassbox usually mark PII fields with CSS classes to blur/redact fields when the screenshots are taken. It looks like the author may have found password and credit card fields without these CSS classes and manually recreated what the unobfuscated fields would look like with dummy data, but it's also possible to configure these tools to not log entire pages or directories -- this is how payment pages are usually configured, with screenshotting completely disabled.<p>If the (anonymous) author simply mocked up what these screenshots _might_ look like if they were saved, that's pretty misleading.
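<p>For what it's worth, the page-level exclusion usually boils down to a small config in the recording snippet. A rough sketch of the shape of it (the option names here are invented, not Glassbox's actual settings):
<pre><code>// Hypothetical recorder configuration -- option names invented for illustration.
// Payment pages are typically excluded wholesale rather than field by field.
interface RecorderConfig {
  excludeUrlPatterns: RegExp[]; // pages where capture is disabled entirely
  redactSelectors: string[];    // elements to blur/mask on captured pages
}

const config: RecorderConfig = {
  excludeUrlPatterns: [/\/payment\//, /\/checkout\//],
  redactSelectors: ["input[type=password]", ".pii-redact", "[data-redact]"],
};

function shouldCapture(url: string): boolean {
  return !config.excludeUrlPatterns.some((pattern) => pattern.test(url));
}
</code></pre>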
I was once forced to integrate one such product into our app. We did mask what we thought was the sensitive information. Within days of release, the app was removed from the Play Store for a privacy violation. We had to remove the SDK to get back in business. So Google does use tools to detect this kind of thing, and this was early 2017.
I was in charge of building this kind of product for another analytics company. This technology is called session replay, and it is used for many purposes: UX improvement, support, bug detection, and so on.<p>Most vendors record keyboard input and can therefore capture passwords as well as credit card information; there was a scandal about this a few years ago [1]. To avoid the issue, most vendors provide a way to exclude that information, but it requires manually tagging the elements on the site that contain sensitive content (see the sketch at the end of this comment).<p>But session replay vendors have many clients, and they don't force or verify that all the critical information is masked. This is not GDPR compliant: where the GDPR applies, you need the user's consent to record their PII, and you are not even allowed to record information like passwords, sexual orientation, or credit card numbers even with consent.<p>Two things:
- Nowadays, most payment pages on the web are not hosted on the client's own website, so these analytics tools are not included there (though there are still many websites that don't use a third party for payments)
- This data is (most of the time) not recorded in a structured way; input data ends up as elements inside an HTML snapshot, so it is not super easy to extract the information at scale<p>[1] <a href="https://freedom-to-tinker.com/2018/02/26/no-boundaries-for-credentials-password-leaks-to-mixpanel-and-session-replay-companies/" rel="nofollow">https://freedom-to-tinker.com/2018/02/26/no-boundaries-for-c...</a>
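<p>To make the tagging point concrete, here is a minimal sketch of what the client-side masking step has to do before a DOM snapshot is shipped off. The selector convention and function names are my own, not any particular vendor's:
<pre><code>// Minimal sketch of the record-and-mask step (not any vendor's real code).
// A session recorder copies live input values into the serialized DOM --
// which is exactly where passwords and card numbers end up if nothing is tagged.
const PRIVATE_SELECTOR = "[data-private], input[type=password]"; // tagging convention is site-specific

export function snapshotWithMasking(root: HTMLElement): string {
  const clone = root.cloneNode(true) as HTMLElement;
  const liveInputs = Array.from(root.querySelectorAll("input"));
  const clonedInputs = Array.from(clone.querySelectorAll("input"));

  liveInputs.forEach((live, i) => {
    // Write the live value into the snapshot, masking tagged elements.
    const value = live.matches(PRIVATE_SELECTOR) ? "***" : live.value;
    clonedInputs[i].setAttribute("value", value);
  });

  return clone.outerHTML; // this string is what gets shipped to the vendor
}
</code></pre>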