The Web needs rescuing because it was never a particularly well-designed system, and it has been patched over and over again for the last quarter century. By now it has degraded into a delivery layer for JavaScript apps, a poor RPC protocol, and an overly complex UI rendering toolkit.<p>It should have had protocol-level technologies for preserving historic data, for search, for offline use, and for authoring and publishing. If you look closely, cloud services built tools that make all of those things easy, and that's how they got in control.<p><i>"The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs."</i>
-- Alan Kay.<p>A lot of web devs were enraged by his comment without considering its context. He was talking about the lack of protocol-level solutions for the <i>resilience</i> of the Web.
<p><pre><code> Here’s the problem with IP addresses: there aren’t enough of them....
As a consequence, the Internet has allowed intermediate
computers to rule. These are like parasites that have grown
too large to remove without killing the host. The technical
flaw that favored intermediate computers prefigured a world
where middlemen business models thrive.
</code></pre>
The handwave is the word "prefigure". How did IPv4 and NAT play any role in the dominance of Facebook, Airbnb et cetera? This is an analogy masquerading as an argument.<p><pre><code> It is not fundamentally necessary to have any intermediate
company profiting whenever a person reads news from their
friends, rents an apartment from a stranger, or orders a
ride from a driver.
</code></pre>
The author provides no evidence that the services of Airbnb, Uber, etc. add no value. These companies carefully designed interfaces that help us find what we need. If they added no value, we would still be using newsgroups.
In my view of history, P2P takes off and does well when it's faster for end-users than centralized solutions; P2P solutions fail when centralized solutions are faster.<p>This holds regardless of the political advantages (or disadvantages) of P2P.<p>BitTorrent is often faster for large files than centralized solutions, so people use it. (It's still inconvenient for a new user to kick off their first torrent, but WebTorrent will help.)<p>Freenet is essentially never faster than alternatives, and so it has stumbled.<p>If some future P2P network can outperform HTTP, then it will succeed; otherwise, it will fail.<p>Regardless, unless we see a breakthrough in battery technology, it will never be the case that lots of people will carry around mobile phones in their pockets whose radios are always on and participating in a mesh network. The battery would die in an hour or two.<p>P2P networks work best when coupled with reliable power networks. But if you have reliable power, you can probably set up a reliable wired network, too.
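For what it's worth, the battery claim survives a back-of-envelope check. A minimal sketch; every number below is an assumption, not a measurement:<p><pre><code>
# Back-of-envelope: how long a phone lasts as an always-on mesh relay.
# All figures are rough assumptions, not measurements.

battery_wh = 11.0     # ~3000 mAh at 3.7 V, a typical phone battery
radio_draw_w = 2.0    # Wi-Fi radio kept awake, scanning and beaconing
relay_draw_w = 2.5    # CPU + TX bursts forwarding other people's traffic

hours = battery_wh / (radio_draw_w + relay_draw_w)
print(f"~{hours:.1f} hours of mesh relaying")  # ~2.4 hours
</code></pre>
Even with more generous assumptions you only buy a handful of hours, which is nowhere near "always on".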
There's a lot of white on this map of world population density: <a href="http://www.luminocity3d.org/WorldPopDen/" rel="nofollow">http://www.luminocity3d.org/WorldPopDen/</a> — the hard part is gonna be getting data between Europe and New Zealand without Big Wire.<p>I love how Staltz's solution to this goes hand-in-hand with re-personalising our interactions. In short:<p>1. <i>The Next Billion</i> haven't yet become used to the idea that useful tech services must be global, and provided for you by a single corporation.<p>2. They want to communicate with other people they know, physically nearby-ish.<p>3. Sneakernet is never slower than talking in person (see the bandwidth sketch after this list).<p>4. Isolated mesh networks and comms with days of latency are viable for these new net users.<p>5. Get enough people using a mesh and the networks will start to connect.<p>6. Where the internet is already pervasive, privacy and autonomy advocates are resisting corporate control, and choosing decentralised alternatives.<p>7. For now, we can exploit the existing internet to handle long-distance.<p>8. But eventually, enough people will have the bottom-up mindset, and the weak links in the mesh will become worrying.<p>9. So a solution will emerge to fill in the gaps in the mesh, using ships / beacons / balloons / satellites / modulated whalesong.
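Point 3 is easy to sanity-check. Here's a sneakernet bandwidth sketch; the card size and trip time are made-up but plausible assumptions:<p><pre><code>
# Sneakernet throughput: carry a 256 GB microSD card across town.
# Numbers are illustrative assumptions.

payload_bits = 256e9 * 8   # one 256 GB card
trip_seconds = 2 * 3600    # a two-hour walk or bus ride

throughput_bps = payload_bits / trip_seconds
print(f"{throughput_bps / 1e6:.0f} Mbit/s")  # ~284 Mbit/s
</code></pre>
Huge throughput, terrible latency: exactly the trade-off that points 3 and 4 accept.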
I'm all for p2p (really), but how about taking back control (sorry) of the Web by actually developing and enforcing declarative/markup technologies and standards instead of praising JavaScript because "it's not half bad" and adding procedural features to the Web (APIs, WASM)? With the current state of affairs wrt privacy and self-proclaimed standardization bodies, I'm not sure the Web is worth preserving.
The first choke point is the ISP. You are completely dependent on the ISP or mobile provider to get on the network. Once on the network, there are multiple choke points around IP addresses, DNS, CA authorities, registries, and more.<p>As long as you are dependent on anyone to get on the network, it by definition can't be decentralized. Consumer wireless tech is heavily regulated, and governments are extremely paranoid about communication channels they don't control and can't monitor.<p>This is unlikely to change because there are no incentives to develop technology that truly empowers individuals; there is no profit in it, it's a social good. If developed, it will be demonized and made illegal, limited to a dissenting minority; the general population is unlikely to jump through hoops to get on a network.
Content-addressed overlay networks work pretty well over the Internet, and they could work equally well on mesh networks (as the article posits), if not for two rather awkward problems: storage at rest, and liveness of a storage node.<p>These two factors even interact to make the situation worse. Because nodes can drop off at any point, and you never know whether the condition is temporary or permanent, a distributed datastore has to store everything redundantly. This takes much more space than storing everything at its origin, as is done in location-addressed networks like the Web (see the rough numbers below).<p>These are not insurmountable problems, of course; it's just that right now, conditions like storage on nodes, link asymmetry, and traffic distribution asymmetry still favor centralization.<p>And fundamentally, the operators of individual nodes (i.e. ordinary people) are often not very selfless, and not enough of them contribute to the health of the mesh even when they choose it deliberately: seeding ratios on BitTorrent networks that are not user-adjustable (e.g. the old World of Warcraft downloader, Facebook patchsets) are much, much higher than seeding ratios where the peer can bow out at any time, despite all of the latter users choosing BitTorrent voluntarily.
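To put rough numbers on that redundancy cost, a minimal sketch; the node availability and the reachability target are purely illustrative assumptions:<p><pre><code>
# How many replicas does a distributed store need when nodes flake out?
# Assume each node is independently online with probability p.

p_online = 0.3   # hobbyist nodes: offline most of the day
target = 0.999   # desired chance the data is reachable right now

replicas = 1
while 1 - (1 - p_online) ** replicas < target:
    replicas += 1

print(replicas)  # 20 replicas -> 20x the storage of hosting at the origin
</code></pre>
With flaky volunteer nodes, the overlay pays an order of magnitude more storage than origin hosting, which is exactly the asymmetry that favors centralization today.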
After reading the FCC Chairman's idea of "the Internet" in his Notice of Proposed Rulemaking (below), I think maybe a better plan would be to rescue the internet from the web. Every example he references of internet usage is <i>web usage</i> or email. He does mention "DNS and caching", but in the context of the potential effect of <i>removing these</i> from the services available to users. (para. 37)<p>The general tone of the NPRM seems to be that an ISP can and will block or throttle any non-web or non-email traffic. That could include any peer-to-peer innovations that seek to restore the original functionality of the internet, such as those mentioned by the author.<p>In contrast, the dissent by Commissioner Clyburn specifically mentions Skype as an example of internet usage. She believes the traditional notion of "permission-less innovation" is under threat from the Chairman's proposed approach.<p><a href="https://apps.fcc.gov/edocs_public/attachmatch/FCC-17-60A1.pdf" rel="nofollow">https://apps.fcc.gov/edocs_public/attachmatch/FCC-17-60A1.pd...</a><p>The author highlights the importance of distinguishing "the web" from the internet. Perhaps nothing is more important. The internet has more value than the web. The web is severely limited in functionality. The internet, still underexploited in its potential, does not suffer from the same limitations.
Funny.<p>Before DRM, spam, I-have-to-monetize-my-daily-five-minutes-of-blogging-into-full-time-pay schemes, unskippable 30-second YouTube ads, the profiling of every possible user behavior, and other shady shit like that:<p>We wondered how we could <i>improve</i> the web.<p>Now, we wonder how to <i>save</i> it.
My wife gets increasingly pissed off when she searches for various materials she wants to purchase. A specific site keeps coming up for her that she absolutely hates, and it starts with an E. There's nothing special about its links, and likely nothing important about the specific products; it's just that the domain name starting with E has billions of cross-links all over the web. Even the random long-tail search terms (which, mind you, are freaking obscure) all go to this same site. It's driving her nuts.<p>At this point SEO has been totally gamed, and search is close to useless. We NEED an alternative.
What makes people use centralized services is their utility. People will pick up new tools and repurpose them if they find them useful. In theory, the web is supposed to be about communication and connection.<p>I think the emphasis on building a local network is a good idea. P2P mesh is cool, but until it provides opportunities that don't exist otherwise, it is unlikely to surpass the incumbents.<p>Building this for places without internet access is a positive angle, but also a tricky one. Charity is seldom as successful or scalable as user-driven initiatives. A lot of mobile phones now exist in places without "internet" per se, but from my understanding, converting, say, a three-year-old smartphone into a mesh-first device seems challenging from a Wi-Fi driver, power consumption, and app perspective.<p>Balancing design that is easy for non-technical people against the notion of eating your own dogfood, one can theorize about building alternatives to the hierarchical Internet. This is the challenge, though: figuring out how to build utility that is superior to the walled gardens and is in the hands of the users to control.
I like the phrase "local-first" software.<p>It's been happening for a while:<p><a href="https://qbix.com/blog/index.php/2017/12/power-to-the-people/" rel="nofollow">https://qbix.com/blog/index.php/2017/12/power-to-the-people/</a>
For us "old" people who remember the internet before the web -- one of the things that's really different about the modern internet is the very limited set of protocols and applications that the average user interacts with. It really used to be that every different service type mapped to a different protocol and HTTP (and HTTPS) has just sort of subsumed everything. Back in the old days to minimally use the internet you'd have to know at least telnet, ftp, nntp, gopher (maybe), smtp, pop and maybe a handful of others.<p>(okay, maybe modern users use more protocols than I'm admitting to, but it's very obscured these days but so many different applications just ride on HTTP(S) anyways).<p>There's really nothing preventing some motivated group to just spin up an entirely new kind of service that "fixes" all that's wrong with the web, custom protocol and application stack.<p>"But the network effect!"<p>And that's something us old timers remember, we remember lots of great services spinning up and down and even when the web was just a handful of sites. The web earned its network by having better general utility than other things that were attempted, but why can't a new better service eventually earn it?<p>(In the meanwhile us hacker types will enjoy having a cool new playground to muck around on for a few years).
Scuttlebutt is my main social medium these days... oh, and in case you're looking for git hosting that's not wedded to GitHub via your comments and issues, Scuttlebutt plays really well with git: you push comments and code into your gossip cloud together.
> Smartphone manufacturers sell mesh-first mobile devices for the developing world<p>I hope puri.sm is listening. This is what I want my Librem 5 to be!
Unless I missed something, this article lacks a call to action (or multiple calls to action for different people). What can individual readers of this article do to help realize this plan? For example, if I have money, where can I give it?
If he figures out how to make a good mesh network and bootstrap it, then everything else is easy. That blog post was pretty much a long-winded way of saying "we don't have a mesh network that works."
Does anyone here have any experience with MANETs? I really like the idea, but is there any way to defend against malicious actors? Are there any protocols that are clear winners? And is the reason they haven't caught on solely that the big ISPs are against the idea?
The IPv4 perspective is a red herring. NATting was indeed necessitated by IP address scarcity, but a domestic installation that does NAT comes with ancillary benefits, like giving you, the home user, a single place to control access to your network.<p>In IPv6 it's nice to have an address space that's not only big enough to accommodate every device, but large enough to burn through addresses and treat them as disposable. But once IPv6 becomes widespread, there will need to be some rethinking of how to manage firewall rules between your own devices, and of how to segregate your portion of the network from the spurious (and sometimes malicious) traffic everywhere else (the implicit rule NAT gives you is sketched below).
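As a conceptual illustration only (not a real firewall, and not any particular tool's API), here is the default-deny, allow-established logic that a NAT box gives you implicitly and that an IPv6 deployment has to state explicitly; the addresses and ports are made up:<p><pre><code>
# Conceptual sketch of the "outbound-only" policy a home NAT implies.
# Not a real firewall, just the decision logic it embodies.

established = set()  # flows initiated from inside the network

def outbound(src, dst, sport, dport):
    # Remember the return path for a connection someone inside opened.
    established.add((dst, src, dport, sport))

def allow_inbound(src, dst, sport, dport):
    # Only admit packets that belong to a flow someone inside started.
    return (src, dst, sport, dport) in established

outbound("2001:db8::10", "2001:db8:ff::1", 50000, 443)
print(allow_inbound("2001:db8:ff::1", "2001:db8::10", 443, 50000))    # True
print(allow_inbound("2001:db8:abcd::6", "2001:db8::10", 443, 50000))  # False
</code></pre>
The point is that globally routable addresses don't have to mean globally reachable devices; the policy just has to be written down instead of falling out of address translation.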
People were sounding the same alarm in the '90s about AOL. History suggests that, at some point, a leaner, more innovative company will surpass Facebook.
If you really want to rescue the web, find a way to restrict adware, spyware, and the like. I upgraded the internet at my house to 1 Gbps (about 890 Mbps down and 920 Mbps up) and I hardly notice any speed difference surfing the web. Sad. Everything else is fast as hell, though.