I naively assumed from the headline that the author would complain about users blocking cookies. I was very pleasantly surprised to see a post written by someone who appreciates that some users will want to do this and is actively working to deliver them a useful experience anyway!
I often think that instead of completely blocking cookies, it would be better to accept them and then throw them away. Same with localStorage. Just store it temporarily.
What if browsers made it so when you turned off cookies, instead of not allowing anything to be written, they instead gave each page you visited its own fresh cookie jar that was cleared when you navigated away?
<i>> All I am using is some innocent localStorage and IndexedDB to persist user settings like the values of the sliders or the chosen color scheme.</i><p>When you turn off cookies you're telling the browser not to let sites persist information. Otherwise, whatever goals you had in disabling cookies would just be worked around through these other technologies.
I always use a wrapper around local/session storage[1] to avoid this problem. Then you have your app <i>sync</i> settings with storage, and never read from it except during startup.<p>It becomes impossible to implement basic UI features like remembering open panes, etc., when storage is disabled, though. With the current policies around cookies - no cross-domain reads, Safari's ITP - there is no real need to turn them off for privacy reasons, for the average user at least.<p>[1] <a href="https://www.npmjs.com/package/localstory" rel="nofollow">https://www.npmjs.com/package/localstory</a>
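For a rough idea of what that looks like in practice (just a sketch, not the localstory API; the names and defaults here are made up): read settings once at startup into a plain object, treat that object as the source of truth, and make every write to storage best-effort.

    // Sketch of a "sync, don't read" settings wrapper; all names are illustrative.
    const DEFAULTS = { theme: 'light', sliderValue: 50 };

    function loadSettings() {
      try {
        return { ...DEFAULTS, ...JSON.parse(localStorage.getItem('settings') || '{}') };
      } catch (e) {
        return { ...DEFAULTS }; // storage blocked or corrupt: use in-memory defaults
      }
    }

    const settings = loadSettings(); // read once, at startup

    function updateSetting(key, value) {
      settings[key] = value; // the in-memory copy is the source of truth
      try {
        localStorage.setItem('settings', JSON.stringify(settings)); // best-effort sync
      } catch (e) {
        // persistence unavailable; the app keeps working for this session
      }
    }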
"(On a tangent, MDN is completely broken with cookies blocked, too. I was about to report this problem (because I care and love MDN), when I discovered a PR is already under way that fixes the Issue. Thanks, @bershanskiy!)"<p>This would imply that "MDN" is under a state of rapid flux, potentially "breaking" and then being "fixed" (or not) over short periods of time. However it appears from the edit history that most of it is actually static and has not changed since 2019 or 2020.^1<p>Perhaps the "completely broken" catchphrase invoked by the author refers to an issue with "cosmetics" (window dressing) not content. I use a text-only browser and have not found MDN to be either partially or completely "broken". I send an HTTP request for a file and I receive the contents of the file. For me, it works. No cookies or Javascript required.<p>1.
<a href="https://raw.githubusercontent.com/mdn/content/main/files/en-us/_wikihistory.json" rel="nofollow">https://raw.githubusercontent.com/mdn/content/main/files/en-...</a><p>If I want to check browser compatibility, which can change from time to time, I can use Github or the MDN website.<p>For example,<p><a href="https://raw.githubusercontent.com/mdn/browser-compat-data/main/http/headers/Clear-Site-Data.json" rel="nofollow">https://raw.githubusercontent.com/mdn/browser-compat-data/ma...</a><p><a href="https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Clear-Site-Data/bcd.json" rel="nofollow">https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cl...</a>
I created an extension that limits the maximum lifetime of cookies; I was surprised to see some have lifetimes of years.
<a href="https://addons.mozilla.org/en-GB/firefox/addon/fresh-cookies/" rel="nofollow">https://addons.mozilla.org/en-GB/firefox/addon/fresh-cookies...</a>
Why does the browser pretend to have localStorage but then throw an exception when it's used?<p>Surely it would be better to simply pretend not to support localStorage, and then all sites built with feature detection would work correctly without needing to special-case this?
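To illustrate the mismatch (a generic sketch, nothing from the article): the usual feature-detection idiom assumes a missing feature shows up as undefined, but with cookies blocked it's the property access itself that can throw, so even the check has to be wrapped.

    // The classic idiom, which would work fine if localStorage were simply absent:
    if (window.localStorage) {
      localStorage.setItem('theme', 'dark');
    }

    // With cookies/storage blocked (e.g. in Firefox), merely reading
    // window.localStorage can throw a SecurityError, so detection itself
    // needs a try/catch:
    let storageAvailable = false;
    try {
      storageAvailable = !!window.localStorage;
    } catch (e) {
      storageAvailable = false;
    }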
The Atlantic is really annoying in this respect. When you open an article in Firefox Focus, it fully renders for a moment, but then some JavaScript apparently loads at the end of the cycle and clears the page.
Yeah, adding try/catch around those has been a good practice for a while. I think there was a time when, if the site was running in a private window in Safari, localStorage would also throw exceptions.
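Something like this (a generic sketch, not anyone's actual code) is the usual shape of that try/catch, covering both the blocked-storage SecurityError and the old Safari private-window QuotaExceededError, so the rest of the app never touches localStorage directly:

    function safeGet(key, fallback = null) {
      try {
        const value = window.localStorage.getItem(key);
        return value === null ? fallback : value;
      } catch (e) {
        return fallback; // storage blocked or unavailable
      }
    }

    function safeSet(key, value) {
      try {
        window.localStorage.setItem(key, value);
        return true;
      } catch (e) {
        return false; // caller can degrade gracefully
      }
    }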
I handle cookies on my proxy, where I change them into per-session cookies, and interestingly the sites with cookies that are "necessary for sites to work" are working flawlessly (/s)
God bless this man for making this -- but you know what's CRAZY to me? That no one has done this before.<p>We all visit websites constantly and governments (particularly in the EU) talk endlessly and vaguely about cookies, and yet almost NO ONE really gets it. I work on this specific problem and it is SUCH a mess.
An interesting facet of this is the implicit trust the author places in downstream tooling and libraries. He is not alone.<p>We talk about how we need to make sure dependencies are secure, but I venture to say it is often just brushed over. Yes, <i>supply chain security</i> (now to rinse my mouth out).
Weird to throw an exception when localStorage is not available. It would be much more logical to have it be undefined or null. Code working with localStorage is more likely to check whether it is available (“not falsy”) than to try to use it and fall back if it throws.
One thing that annoys me about Firefox's Total Cookie Protection is that I offer some third-party embeds. What I did on those is set a cookie and probe for its existence to check whether the user has third-party cookies disabled. If they do, the embed displays things for the case where it's not known whether the user is logged in to the service, rather than as if they're definitely not logged in.<p>This worked fine, but now Firefox containerizes third-party resources rather than actually blocking the cookies, so there's no longer a way (short of user-agent sniffing) to detect that the real site cookies simply aren't being delivered in a third-party context, as opposed to not existing at all.
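The probe being described is roughly this (a simplified sketch with made-up names, running in the embed's iframe on its own origin):

    function thirdPartyCookiesUsable() {
      // Write a short-lived probe cookie, then see if it reads back.
      document.cookie = 'tp_probe=1; SameSite=None; Secure; Max-Age=60; Path=/';
      return document.cookie.includes('tp_probe=1');
    }

    if (thirdPartyCookiesUsable()) {
      // Real cookie access: absence of a session cookie means "not logged in".
    } else {
      // Blocked: show the "login state unknown" variant instead.
    }

With old-style third-party cookie blocking the write silently fails and the probe returns false. Under Total Cookie Protection the probe cookie is written and read back inside a partitioned jar, so the check passes even though the first-party session cookie is never delivered, which is exactly the situation described above.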
instead of blocking cookies, it would be nice if there were something like "Certified Humane" for websites... and you could stick within an internet of websites created by people who are not dicks.
I think the problem here is "potentially blocked".<p>How do you know what's potentially blocked? Maybe it's listed clearly somewhere in the browser docs, or maybe it's not. Did they change it between versions? Did you even know about this issue in the first place?<p>I know people like to think of checked exceptions as a failed experiment from the dark past of object-oriented programming, but this situation is a great example of how statically typed (or at least statically checkable) side effects are a huge improvement in code safety.
This feels kinda unscalable though...<p>Wouldn't it make more sense to change the browser to make cookies and localStorage non-persistent and isolated, but otherwise available programmatically and to XHRs?<p>I.e., so that they can exist in isolation as long as the tab is open. This would be compatible with anything that doesn't require cross-frame or cross-tab persistence (which is usually all users care about).
Why is the website failing with unhandled errors, but working when they are try/catch'd? Either way the errors are being thrown, and the functionality isn't available. Is the browser not able to handle the situation itself more gracefully?
Fun fact: the code example with the glow effect was created with [Carbon](<a href="https://carbon.now.sh/" rel="nofollow">https://carbon.now.sh/</a>)
I wonder if instead of blocking cookies we could make a browser extension to share the tracking cookies (and only those) with random people on the web, to confuse trackers?
Not directly related to this article but doesn't aggressively blocking tracking in this way create a tracking monopoly for browsers, extensions and apps?
I recently read a thread about privacy here. One point was that the one best thing you can do is disable JavaScript. So I decided to try it. I installed Brave on my phone and disabled pretty much everything, including all cookies.<p>My thinking was, all I do is browse HN, hn.algolia, and lobsters. Those should work, right? Well lobsters works perfectly, including collapsing comments.<p>HN loses the ability to collapse comments. But algolia is the worst. Not only does it require JS, being an SPA, but it refuses to work until you enable cookies! My theory is that it reads the settings (popular, 24-hour) from a cookie, and plain dies if they're not there.<p>On another note, and to a pleasant surprise, a lot of the web works perfectly fine, and feels a lot snappier, including even google search. And many of the annoying cookie and paywall popups never appear, since they appear to be implemented in JS.<p>So yes, if you haven't tried it, I recommend you do. You can always whitelist sites you trust or really need to use.
I always accept all cookies because they have never had any negative impact on my surfing experience ever. Cookie banners and Privacy banners are much more of a problem than cookies ever were.
I just want to take this opportunity to thank "adtech" and everyone working in it for making local storage way more complex than it otherwise needed to be because you couldn't/can't stop yourselves from abusing users.