>The less JavaScript (especially third-party) you have on your landing pages, the better. It’s a better customer experience, and it improves your page’s conversion and quality score.<p>Really wish I read this more. And not just for the landing page.
> inform Google/Facebook via their server-side APIs when feasible, instead of trying to load a pixel<p>I was wondering when this was going to be the norm.<p>Can't block third parties for privacy if the first party talks to them behind the scenes.<p>(If I understand what this means)
We ran into the exact same issue at my previous employer. It's scary how similar it was to your situation, given that we also moved our site over to Next.js and were a competitor to Opendoor. It also wasn't the first time I've run into bad metrics in a legacy app making it appear to convert better than a new version.
Sshhhh. So many people will lose so many bullshit jobs if this gets discussed widely. There are entire tertiary industries built on top of front-end tracking and most people with a modicum of analytical ability know it's turtles all the way down.
The headline, the problem and the solutions are all different things.<p>- Front-end tracking did not lie; the people reading it were not aware of its limitations. If Opendoor was spending on ads, the marketing team would be the first to see a disparity between the clicks reported by the ad network and the pageviews on their end.<p>- This is also why you need to check your server hit logs against your analytics.<p>- Where the author is right is that front-end numbers are undercounted, largely due to blockers.<p>- His understanding of bounces is also incorrect. A bounce is when there is no second "interaction event"; many marketing folks fake-fix the bounce rate by sending a scroll event as an interaction event.<p>- I always tell people that analytics numbers are "signals", not "metrics": they are not accurate enough to be called metrics.
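To make the bounce point concrete, here is a toy calculation (the event names and session shape are made up for illustration): a session counts as bounced when it has fewer than two interaction events, so simply reclassifying "scroll" as an interaction event lowers the reported bounce rate without changing user behavior at all.

```javascript
// Toy bounce-rate model: a session "bounces" if it contains fewer than
// two events that are classified as interaction events.
function bounceRate(sessions, interactionEvents) {
  const bounced = sessions.filter(
    (events) => events.filter((e) => interactionEvents.has(e)).length < 2
  ).length;
  return bounced / sessions.length;
}

const sessions = [
  ["page_view"],                       // classic bounce
  ["page_view", "scroll"],             // bounce only if scroll doesn't count
  ["page_view", "click", "page_view"], // engaged visit
];

// Honest definition: only page views and clicks are interactions.
bounceRate(sessions, new Set(["page_view", "click"]));           // 2/3
// "Fake fix": declare scroll an interaction event; the rate drops.
bounceRate(sessions, new Set(["page_view", "click", "scroll"])); // 1/3
```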
I don't think this is an issue with the on-site analytics, but with the quality of the traffic and with website performance. If you make sure that:<p>1) Your site loads in under 3-4 seconds for any user.<p>2) The user is interested enough to wait those 3-4 seconds for the page to load.<p>then most issues will be solved.<p>The problem with ads in many cases is that the traffic they send is of very low quality, or just bots. In the end, you already know from your ads provider how many users they claim to have sent, and you should always use that number when calculating ad conversion rate.<p>Also note that Cloudflare will count as bounced those users who never actually even tried to load your page (bots, crawlers, scrapers: all HTTP requests).
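The denominator point above is worth spelling out. A minimal sketch (the numbers are made up) of the two conversion rates you get depending on which denominator you trust:

```javascript
// Compute conversion rate two ways: against the ad provider's reported
// click count (what you were billed for) vs. on-site tracked sessions
// (which blockers and failed page loads shrink, inflating the rate).
function conversionRates({ providerClicks, trackedSessions, conversions }) {
  return {
    perProviderClick: conversions / providerClicks,
    perTrackedSession: conversions / trackedSessions,
  };
}

const r = conversionRates({
  providerClicks: 1000, // what the ad network says it sent
  trackedSessions: 700, // what front-end analytics managed to record
  conversions: 35,
});
// r.perProviderClick  -> 0.035 (honest, against billed clicks)
// r.perTrackedSession -> 0.05  (flattering, against surviving sessions)
```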
Wouldn't the B variant show higher session count? If your A/B testing tool doesn't detect imbalances in cohort size I would imagine you have a bigger problem, since it's easy to accidentally measure the A and B groups differently.
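The imbalance check being described is usually called a sample-ratio-mismatch (SRM) test. A minimal sketch, assuming an intended 50/50 split, is a chi-squared goodness-of-fit test on the two cohort sizes:

```javascript
// Sample-ratio-mismatch check: chi-squared goodness-of-fit of the observed
// cohort sizes against the intended 50/50 assignment.
function srmChiSquared(countA, countB) {
  const expected = (countA + countB) / 2;
  return (
    Math.pow(countA - expected, 2) / expected +
    Math.pow(countB - expected, 2) / expected
  );
}

// With 1 degree of freedom, chi-squared > 3.84 corresponds to p < 0.05:
// the split is unlikely to be chance, so something is dropping one cohort
// (e.g. blocked tracking in A) and the experiment results can't be trusted.
function hasSampleRatioMismatch(countA, countB) {
  return srmChiSquared(countA, countB) > 3.84;
}

hasSampleRatioMismatch(10000, 10120); // false: ~0.6% drift is plausible noise
hasSampleRatioMismatch(10000, 11000); // true: investigate before reading results
```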
Although it's an interesting story, it seems the main issue was that the author never sanity-checked their analytics in the first place and blindly relied on data that had never been validated.
I've seen this as a quite common phenomenon. No piece of software does magic, and they all require certain conditions to work as expected. In client-side JavaScript, these conditions are simply harder to grasp, but they can be studied.
Can someone explain to me why third-party JS has such a big impact on load time? I thought browsers deprioritize third-party JS and load it after all first-party assets have been downloaded and the site has been rendered. If that's the case, why does third-party JS still have such a big impact on page performance?