This is a terrible article, written by someone who is either dishonest or doesn't know what they're talking about (and has never been to London), covering a paper that appears to be reasonably well done but has some serious limitations.<p>The study is here: <a href="https://ijbnpa.biomedcentral.com/articles/10.1186/s12966-024-01621-7" rel="nofollow">https://ijbnpa.biomedcentral.com/articles/10.1186/s12966-024...</a><p>The data was collected during the 2018/2019 academic year and then again during the 2019/2020 academic year (but before the Covid school closures).<p>First, some context:
-The original ULEZ, which is what the referenced study looked at, covers central London and should not be confused with the much larger, recently expanded ULEZ, which covers the whole city. Nor should it be confused with the much smaller congestion charging zone, or with the larger and older Low Emission Zone, which covers freight vehicles.
-The ULEZ rules are designed around penalising the driving of only the oldest and most polluting vehicles. In 2019, around 80% of cars were already compliant; the expanded ULEZ has overall vehicle compliance of 95%+.<p>As a result of the second point, you would not expect it to have a substantial effect on the number of vehicle journeys, since 80% of passenger cars in the zone were already compliant anyway; if anything, any effect at all is surprising. The paper notes a drop of 9% in total vehicle counts.<p>"Four in 10 London children stopped driving and started walking to school a year after the city's clean air zone went into effect."<p>This little quote heads the article. It seems like quite a result, right?<p>It isn't.<p>Let's look at the baselines, something anyone who lives in London would immediately be suspicious about, because like me their first question would be: "who was driving their kids to school in central London in 2019? Are there enough of them for four in ten to switch?". It turns out not many people were, and no.<p>Let's look at table 2 from the paper (there were about 1000 kids in each of the London and Luton samples):
At baseline, 856 kids in London travelled using active modes and 105 using inactive modes
In Luton that was 599 and 364 respectively.<p>So first, we can say that "four in ten children" has to be interpreted pretty carefully here, since about 85% of the London kids were already walking to school (note that if they just took the bus the whole way, this also counted as walking).<p>At most, we must be talking about changes among the minority of kids who weren't using active travel before; in other words, maybe it's "Four in 10 London children (of the minority who were being driven) started walking to school".<p>But if we look at the changes, that doesn't quite stack up either.<p>In London:
47 kids switched from active to inactive (all measured based on travel "today", and in many cases modes will vary across days)
44 switched from inactive to active
61 stayed inactive
809 stayed active<p>In Luton:<p>124 switched from active to inactive
74 switched from inactive to active
290 stayed inactive
475 stayed active<p>Ignoring the Luton control for the moment, it doesn't look like there was any modal shift at all in London!<p>Luton has proportionally shifted <i>away</i> from active transport, and only relative to that control has London seen any modal shift.<p>This is already a much less positive message: "Kids in general less likely to walk to school, except in London where (potentially due to a low emissions zone) their behaviour didn't change." Where's my four in 10 gone?<p>The "four in 10" comes from the 44 kids who were inactive in the first sample but active in the second (out of 105 total inactive in the first sample). Of course that is a much larger percentage of that group switching in that direction than the 47 who switched the other way out of the much larger number of first-sample actives. If your transition probability from A to B is much higher than from B to A, but B is a much larger group, you can end up in the situation here, where you have impressive-sounding percentage changes which nonetheless mean nothing and don't change the population's behaviour at all; the back-of-envelope sketch at the end of this comment makes that concrete.<p>It's a very fine thing, no doubt, to run multilevel binomial logistic regression models on data and come up with statistically significant odds ratios, but I don't think these results remotely justify the news article's headline and subhead.
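<p>To make that concrete, here is a minimal back-of-envelope sketch in plain Python. The counts are the table 2 figures quoted above; the variable names, the "active share" framing, and the rounding are mine, not the paper's analysis:<p><pre><code>
# Transition counts quoted from table 2 of the paper (travel mode "today",
# baseline vs follow-up). "Active" here includes bus travel, per the paper's coding.
london = {"active_to_inactive": 47, "inactive_to_active": 44,
          "inactive_to_inactive": 61, "active_to_active": 809}
luton = {"active_to_inactive": 124, "inactive_to_active": 74,
         "inactive_to_inactive": 290, "active_to_active": 475}

def summarise(name, t):
    baseline_active = t["active_to_active"] + t["active_to_inactive"]
    baseline_inactive = t["inactive_to_inactive"] + t["inactive_to_active"]
    followup_active = t["active_to_active"] + t["inactive_to_active"]
    total = baseline_active + baseline_inactive

    # The headline-style figure: share of the (small) inactive group that switched.
    switched_to_active = t["inactive_to_active"] / baseline_inactive
    # The comparison the headline omits: share of the (large) active group that switched the other way.
    switched_to_inactive = t["active_to_inactive"] / baseline_active

    print(f"{name}: {total} children")
    print(f"  baseline active share:  {baseline_active / total:.1%}")
    print(f"  follow-up active share: {followup_active / total:.1%}")
    print(f"  inactive -> active: {switched_to_active:.0%} of the inactive group "
          f"({t['inactive_to_active']} of {baseline_inactive})")
    print(f"  active -> inactive: {switched_to_inactive:.0%} of the active group "
          f"({t['active_to_inactive']} of {baseline_active})")
    print(f"  net change in active travellers: {followup_active - baseline_active:+d}")

summarise("London", london)
summarise("Luton", luton)
</code></pre><p>Running it gives roughly 42% of London's small inactive group switching to active (the "four in 10") against about 5% of the much larger active group switching the other way, netting out to a change of -3 active travellers in London, versus about -50 in Luton.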