I really appreciate that kind of focus and dedication. I recently dove back into web development after a couple of years away and was pleasantly surprised by tools like React. Combined with server-side rendering, it let me serve a solid, usable HTML page with no JS required. When I started pulling in components from popular Material Design libraries, though, I realized that nobody had thought about progressive enhancement even for a second. Rather than serving a plain `<select />` and then progressively enhancing it, all I got was an unusable mix of HTML tags and styling. It's a shame that even down at the library level people don't care about progressive enhancement. Are there any (React) projects that try to do better, or is the mantra more like: React without JS is nonsense?
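To be concrete, the pattern I'm asking for looks roughly like this (a sketch; the function names are mine, not any real library's API): the server always renders a native `<select>`, and a separate enhancement step only runs if JS actually executes, so the control stays usable either way.

```javascript
// Server side: always render a native, fully usable control.
// No JS is needed to read it or submit it in a form.
function renderCountrySelect(countries, selected) {
  const options = countries.map(function (c) {
    return '<option value="' + c + '"' +
      (c === selected ? ' selected' : '') + '>' + c + '</option>';
  }).join('');
  return '<select name="country">' + options + '</select>';
}

// Client side: upgrade to a fancy widget only if JS actually runs.
// If this never executes (blocked, broken, or disabled JS), the
// native select above still works.
function enhanceSelect(selectEl) {
  if (typeof document === 'undefined' || !selectEl) return false;
  // ...replace selectEl with a styled widget that writes its value
  // back to the (now hidden) native control...
  return true;
}
```

The Material libraries I tried skip the first half entirely and render the styled widget directly, which is exactly why nothing works without JS.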
This, this, <i>so much</i> this.<p>We owe it to our users to provide <i>all</i> of them with a good experience. We also owe it to ourselves to write good, clean, maintainable applications. The wonder of it is that very, <i>very</i> often if we write RESTful apps then progressive enhancement just falls out naturally: a GET request with some JSON Accept header will return data; with text/html it'll return data, and possibly a form to update it; a POST with some JSON Content-Type will DTRT, and so will one with application/x-www-form-urlencoded or multipart/form-data.<p>Why <i>wouldn't</i> one do that?<p>Of course, the user experience of such an app isn't as great as a singing, dancing single-page app (SPA). But <i>a</i> user experience is better than <i>no</i> user experience, which is what offering <i>only</i> an SPA provides.<p>It's just sloppy to require JavaScript and CSS to use a web site. It's particularly sloppy when that web site is composed of documents.
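A minimal sketch of the content negotiation I mean (the function names are mine, not any framework's): one endpoint, two representations, and neither path needs client-side JS.

```javascript
// Decide the representation from the request's Accept header.
function negotiate(acceptHeader) {
  if ((acceptHeader || '').indexOf('application/json') !== -1) return 'json';
  return 'html'; // default: a document with a working <form>
}

// Render the same resource either as data or as data plus a form.
function renderWidget(widget, format) {
  if (format === 'json') return JSON.stringify(widget);
  return '<h1>' + widget.name + '</h1>' +
    '<form method="POST" action="/widgets/' + widget.id + '">' +
    '<input name="name" value="' + widget.name + '">' +
    '<button>Update</button>' +
    '</form>';
}
```

The POST handler does the same in reverse: parse JSON or application/x-www-form-urlencoded depending on Content-Type, then return data or redirect. The SPA becomes an optional layer on top, not a prerequisite.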
I like the reasons they give why users might not have JS, even though they haven't blocked it themselves. Here are three:<p>* <i>user’s hotel is intercepting the connection and injecting broken JavaScript</i><p>* <i>user’s telecoms provider is intercepting the connection and injecting broken JavaScript</i><p>* <i>user’s country is intercepting the connection and injecting broken JavaScript</i><p>Broken or not: all three of those sound like good reasons to intentionally block JS.
I'm glad GDS is working to improve gov.uk sites but some really simple things still seem broken. Example: it'd be great if GDS would stop with the ridiculous forced password recipes across gov.uk sites.<p>Today I had to try three times to get 1Password to satisfy the insane requirements of the site (8-12 alphanumerics, no special characters).
The JavaScript annoyance is here to stay at this point, but I really don't see why so many people build single-page applications and <i>require</i> JavaScript to perform simple tasks.
<i>user’s company has a proxy that blocks some or all JavaScript</i><p>This one bites me every day. Websites that load their social-button / analytics scripts before the main feature JS become unusable when that third-party JS errors out or fails to load.
Is progressive enhancement still possible nowadays? Considering how many millennials choose JavaScript frameworks for most of their projects, I suppose not. It is very rare for me to find a website that works without JavaScript enabled, either because the website itself is written in JavaScript <i>(which, by the way, doesn't sound like a good idea; try that with Lynx)</i> or because they use analytics services as dependencies for the rest of the code.
This is quite progressive for the government.<p>Think of all the savings: no need to update old hardware sitting in schools, libraries, councils, etc.
If anything this is a great example of what being agile is about. Start with the bare essentials and slowly add features, without breaking the characteristics that make the website nice to use such as speed and reliability.
Here's the original post that explains how they got the numbers: <a href="https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missing-out-on-javascript-enhancement" rel="nofollow">https://gds.blog.gov.uk/2013/10/21/how-many-people-are-missi...</a><p>Looks like this was removed after a few years: <a href="https://github.com/alphagov/frontend/pull/944" rel="nofollow">https://github.com/alphagov/frontend/pull/944</a><p>I'd like to see the data over that period and see what the JS stats are like today.
One small quibble: if you are going to take the time to implement progressive enhancement, why serve up incredibly large images?[1] Even sitting on an MBP with a high-speed connection, the initial download took some time from across the pond.<p>[1] <a href="https://gdstechnology.blog.gov.uk/wp-content/uploads/sites/31/2016/09/28825249152_cd35912c3c_o.jpg" rel="nofollow">https://gdstechnology.blog.gov.uk/wp-content/uploads/sites/3...</a>
I tweeted recently to say how I'd wanted to give some good feedback to @GDSteam, and the beta feedback link redirected me to a localhost URL.<p>I hoped this would be a quick and pleasant interaction, but - and I really should have guessed it - the Twitter 'operator' was clearly too far removed from the developers to get it fixed quickly.
This may not go down too well, but it is an opinion I have heard and personally agree with (having worked in a UK gov institution, one far from GDS):
GDS is wasting taxpayer money, paying far too much attention to minute details like apostrophes, CSS and the like.
Given that most of *.gov.uk are standardised now, it works out to literally millions of GBP per line of CSS.
In an ideal world we'd have a fully accessible web, progressive enhancement, responsive layouts, etc., but any real business has to say stop: this no longer improves our services enough to be worth it. GDS, meanwhile, are free to redesign their CSS over and over, just because no one else in the government understands technology.