We build websites for small businesses, and many biz owners:<p>- Do not know what or where the "address bar" is.<p>- To visit their website, will "google" (the verb) their website address (or biz name) - which leads owners of newly launched websites to think their site isn't accessible/online.<p>Chrome and other browsers combining the address bar with the search box has only made this situation worse.
It sounds to me like this is more of a problem with the <i>domain</i> system, and the way it embeds into a URL, than a problem with URLs as a whole. Browsers have already largely hacked off the protocol at the front, and even Google couldn't get rid of the path & query string (though we can continue to work toward hiding more of it from the average user if that is desirable), so it seems likely this is all about the domain, and the difficulty of mapping what domains really mean onto what end users think they mean.<p>To boil it down to one example, how do you solve the problem of someone setting up secure.citigroup.accountmanagement.com so that an end user understands that it's a phishing site? I mean, for goodness' sake, authority in the DNS chain is read <i>backwards</i>. How many users are going to read that and see ".com" as the root and technically most important element, rather than "secure"?<p>I'm not commenting on any possible solution since this article doesn't even sketch one. I'm just trying to apply the principle of charity to the article and come up with the most plausible interpretation, and the one most interesting to discuss.<p>My own commentary is that I'm not sure there is an answer any better than beating on domain names until they work as well as possible. The only other alternative I see is a centralized authority of some sort, and while that might work better than what we have now for maybe five years, the negative consequences after that, as the central authority learns to spread its wings, extract money from people, and abuse its position to push some agenda, outweigh the safety gains.
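To make the "read backwards" problem concrete, here's a rough Python sketch (using the third-party tldextract library; the phishing hostname is the made-up one from above) of which part of that hostname actually identifies the registrant:<p><pre><code>import tldextract  # pip install tldextract; uses the Public Suffix List

# Everything to the left of the registrable domain is freely chosen by
# whoever owns that domain, so "secure.citigroup" is just decoration.
parts = tldextract.extract("https://secure.citigroup.accountmanagement.com/login")

print(parts.subdomain)          # secure.citigroup
print(parts.registered_domain)  # accountmanagement.com
</code></pre><p>Some browsers already lean on exactly that split and de-emphasize everything except the registrable domain in the address bar, which is about as far as "beating on domain names" gets you without touching DNS itself.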
Whatever Google comes up with, I'm pretty confident I will disagree with it heavily. URLs are fantastic: they solve the problem well and have basically no issues. There is no need to fix something that is not broken.<p>Thanks, but could you please stop trying to control the web, Google?
Maybe instead we can have little badges called "Googs" that'll direct you to the AMP version of a website. Obviously, you'll need to pay the Google for this privilege. They'll make the design open source, but since their browser and Firefox will only support the official "Googs" implementation, OSS versions will be dead from the start.<p>Letting a commercial company define a breaking change to a web standard won't have any repercussions whatsoever.
Similar to the Google push for AMP pages:<p><a href="https://news.ycombinator.com/item?id=17920720" rel="nofollow">https://news.ycombinator.com/item?id=17920720</a>
<a href="https://www.polemicdigital.com/google-amp-go-to-hell/" rel="nofollow">https://www.polemicdigital.com/google-amp-go-to-hell/</a><p>There's a great comment about the link between these two over there:<p><a href="https://news.ycombinator.com/item?id=17923156" rel="nofollow">https://news.ycombinator.com/item?id=17923156</a>
Do we really need to change the URL system? What if we made it super-easy to register a TLD?<p>What if there were an alternative system to verify that a website is owned by someone? For example, what if the icon had to be registered with some kind of trademark-like entity, and browsers made the icon more prominent?
It makes a lot of sense to me to redesign how we display URLs. Right now, we show them as a single opaque identifier, and the only guidance we can give users is to look for things inside that string of characters (look for https, or look for bankname.com). We don't need to show the whole string of characters.<p>It seems very sensible to me to take the URL for what it is and communicate that back to the user:<p>- The protocol doesn't matter much to most people, except for its implications, so maybe show "encrypted" or "not encrypted" (and fall back to the protocol name for FTP and the like; most people won't ever touch that without knowing what it is).<p>- The domain is interesting: there's the TLD, which doesn't _really_ matter, and the second-level name, which matters a lot. Subdomains matter less. How about we show it as a much more prominent "[ domain.com ]" with a less obvious subdomain.<p>- Paths are pretty generic all around: incrementally deeper descriptors separated by forward slashes. We could definitely show that as a breadcrumbs-style thing (which might also make people care more about readable paths, which is nice).<p>- The query string should probably be shown as "tags" with a key and value, since that's what they are.<p>- The hash is a bit icky, but it might be enough to highlight that you're linking to a place in the page.<p>It could be displayed as something like this: <a href="https://i.imgur.com/RfJoP23.png" rel="nofollow">https://i.imgur.com/RfJoP23.png</a>
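For what it's worth, every one of those pieces is already cheap to separate; here's a small Python sketch of the split that mockup is based on (the URL and the display labels are just my own examples):<p><pre><code>from urllib.parse import urlparse, parse_qsl

url = "https://sub.domain.com/blog/2018/url-display?ref=hn&lang=en#comments"
u = urlparse(url)

print("encrypted" if u.scheme == "https" else u.scheme)  # protocol -> simple badge
print(u.hostname)                     # sub.domain.com (highlight domain.com, dim the rest)
print(u.path.strip("/").split("/"))   # ['blog', '2018', 'url-display'] -> breadcrumbs
print(dict(parse_qsl(u.query)))       # {'ref': 'hn', 'lang': 'en'} -> key/value "tags"
print(u.fragment)                     # 'comments' -> place in the page
</code></pre><p>The parsing isn't the hard part; deciding which of those pieces an average user should be asked to care about is.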
Welp, time to be afraid then. Google does not wait to find a replacement when they don't like something. They destroy first, and then never get around to coming up with anything to replace it. So get ready to embrace a future with no address bar and with Google search being the only way to find and navigate to pages. Just look at what they did with vertical side tabs in Chrome for a perfect example. Google said they "didn't like how it looked" and "wanted to find a better solution" so, despite the option being a hidden setting you had to dig to even find and enable, they ripped it out of the browser. And to this day, years later, there is no solution at all to having lots of tabs open. The browser just rapidly fills the bar with tabs, shrinking them to utter uselessness.<p>But at least Google doesn't have to tolerate something they find aesthetically displeasing.
I'm a bit late to the party here, and the following doesn't address a lot of the concerns in the article (nor does it actually get rid of URLs), but for anyone interested in alternatives to horribly long URLs or incredibly opaque URL shorteners, I made a fun little tool [0].<p>It's called ShortestSearch and is essentially a reverse Google: you give Google search terms and it gives you a list of websites, whereas you give ShortestSearch a website and it gives you a list of search terms. All with the aim of perfecting the art of "It should be the first result if you Google 'x y z'".<p>I also found the part of the article where it describes its own URL strangely enjoyable.<p>[0] <a href="https://oisinmoran.com/ShortestSearch/" rel="nofollow">https://oisinmoran.com/ShortestSearch/</a>
<i>>"Some URLs are good for sharing, others aren't."</i><p>This is one of the things that definitely can be improved on. Having something either in the URL or in page metadata to say "this is linkable". This doesn't warrant a replacement, however.
Navigation is the problem.<p>Maybe go back to lists and hierarchies of lists like the old days, with `UUID`s behind them. Lists are at least navigable.<p>So every URL would be part of at least one list, and `URLs` would point to lists. Or something like that.<p>Example: a site from my Sites-I-Use list could be shown in green in the address bar, while a site I've never used before could be shown in red. That's not even taking organization into account, or the myriad of other useful use cases.
It's a bit disingenuous that Ars Technica ledes with a screenshot of a homograph attack that Chrome mitigated in March 2017 and that doesn't really relate to the article. <a href="https://www.xudongz.com/blog/2017/idn-phishing/" rel="nofollow">https://www.xudongz.com/blog/2017/idn-phishing/</a>
The “World Wide Web” was based on only three concepts: a document markup language, a document “uniform resource locator” (URL), and a tool that could use the URL to retrieve the document and render the markup.<p>Breaking/obfuscating/demoting the URL is such a fundamental change that I am shocked it would even be suggested.
Given Google hasn't laid out any particular solution and they're still being criticized, it seems to me that HN is so cynical about large tech companies exercising their monopoly power (because of past practices) that it's afraid of any effort to rethink the fundamentals of the web.<p>Which on one level makes sense. It's good to protect things that have made the web great. However, it's also good to maintain a spirit of innovation and a willingness to question the things people assume have to work a certain way just because they've always worked that way.<p>When there's a specific proposal, and if that proposal is a bad one, attack it on the merits. But there are real problems of security and user friendliness when it comes to URLs, and I don't see anybody else working on them. Does anybody really think that it's impossible to improve, and that the way we do things now is exactly the way we should be doing it 100 years from now?
I'm not sure URLs have any major downsides. Their full complexity is beyond most people, but most people don't need to understand anything more than how to type www.something.com; it is the web application owner who has to understand them, and that is where most security problems live. Security is only a problem if the server or web application has not been set up correctly.<p>It's no different from visiting a shop or a house, where you should be able to expect certain regulations to be met (insurance, building codes, etc.), and I can see more countries starting to enforce country-wide codes for websites - annual checks, registrations, whatever - for people to be allowed to trade in that country.
Half-baked idea: replace it with the x509 subject?<p>The route could be displayed less prominently to the right.<p>Preserve normal omnibar behavior for input.
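For reference, here's roughly what that subject contains today; a sketch using Python's standard ssl module (the hostname is just an example):<p><pre><code>import socket
import ssl

hostname = "example.com"  # any TLS-enabled site
ctx = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
        subject = tls.getpeercert()["subject"]

# subject is a tuple of relative distinguished names, e.g.
# ((('commonName', 'example.com'),), ...)
for rdn in subject:
    for key, value in rdn:
        print(key, "=", value)
</code></pre><p>The catch is that for most of the web (domain-validated certs) the subject is just a commonName equal to the domain, so you'd largely be displaying the same string you were trying to replace; only EV certs reliably carry an organization name.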
so, let's see..<p>we'll need a protocol. and a port. and a host. and then a resource on that host. and a way to combine them.<p>how about:<p>foo.html\80:net.host\\:http<p>totally cool!