Wow, this is a nice feature! I'd much rather implement it by having a .txt or .raw file sit in the same folder as the HTML page, though, rather than inserting a segment into the middle of the URL. It feels more convenient that way.<p>For example, rather than <a href="https://mahdi.blog/raw/raw-permalinks-for-accessibility/" rel="nofollow">https://mahdi.blog/raw/raw-permalinks-for-accessibility/</a>, it would be <a href="https://mahdi.blog/raw-permalinks-for-accessibility.raw" rel="nofollow">https://mahdi.blog/raw-permalinks-for-accessibility.raw</a><p>It's a minor nitpick, really, but I quite like this idea! I think I'll try to implement it on my website too.<p>As for the other people here wondering why User-Agent sniffing wasn't used for this:<p>- Static website hosting goes out the window, which is quite a shame because static hosting makes everything so much easier<p>- User agents are pretty terrible for determining capabilities and intent (what if someone is using curl to fetch the actual webpage?)<p>- A whitelist will never cover all types of HTTP clients, as we've seen from various online services locking Firefox or Linux users out of features for no reason other than that their user agents weren't on the list the developers used to check for certain capabilities.
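<p>For a static site, the extra raw file can simply be emitted as part of the build. Here's a minimal sketch in Python; the directory names (`content/` for Markdown sources, `public/` for build output) and the `emit_raw_copies` helper are hypothetical, just to illustrate the idea of writing a `.raw` sibling next to each rendered page:

```python
from pathlib import Path

SRC = Path("content")  # hypothetical directory of Markdown sources
OUT = Path("public")   # hypothetical build output directory

def emit_raw_copies(src: Path = SRC, out: Path = OUT) -> list[Path]:
    """Copy each Markdown source into the output tree as a .raw file.

    content/foo.md -> public/foo.raw, so that /foo.raw serves the
    plain-text source right next to the rendered /foo page.
    """
    written = []
    for md in sorted(src.rglob("*.md")):
        target = out / md.relative_to(src).with_suffix(".raw")
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text(md.read_text(encoding="utf-8"), encoding="utf-8")
        written.append(target)
    return written
```

One caveat: some static hosts don't know the `.raw` extension and may serve it with a generic Content-Type (or as a download), so `.txt` may behave better out of the box unless you can configure MIME types.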