I'm always glad to see people experimenting with different approaches to building and deploying apps. That said, the general idea of this Serverfree approach doesn't appeal to me. On any given day, I'll use 4 different devices. I need my data to synchronize seamlessly. I wouldn't use a program (in-browser or traditionally installed application) unless I can synchronize that data either by storing files in a Dropbox-like program or in the cloud. I don't want to have to remember which computer/browser combination I was working on.<p>Edit: forgot some words
Looking at this with an open mind, I'm curious what benefits running SQLite in WebAssembly with a proxied web worker API layer gives compared to using localStorage or something similar.<p>* Using SQL has clear benefits for writing an application. You can use existing stable tools for performing migrations.<p>* Using SQLite on a filesystem offers many advantages w.r.t. performance and reliability. Do these advantages translate when using WebAssembly SQLite over OPFS?<p>* How does SQLite / OPFS performance compare to reading / writing localStorage?<p>* From what I know about web workers, the browser thinks it is making HTTP requests to communicate with SubZero, while the worker proxies these requests to a local SubZero "server". What is the overhead cost of doing this, and what benefit does this give over having the browser communicate directly with SQLite?<p>* I remember seeing a demo of using [SQLite over HTTP](<a href="https://hn.algolia.com/?q=sqlite+http" rel="nofollow">https://hn.algolia.com/?q=sqlite+http</a>) a while back. I wonder if that can be implemented with web workers as an even simpler interface between the web and SQLite, and how that affects bundle size...
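The proxying the comment asks about can be sketched as a plain request router: the app issues ordinary HTTP-shaped requests, and a worker-side router answers `/api/*` paths from a local store instead of the network. A minimal sketch, assuming nothing about SubZero's actual API — `LocalDb` and `routeRequest` are illustrative names, and the `Map`-backed store stands in for WASM SQLite over OPFS:

```typescript
// Illustrative sketch only: LocalDb stands in for a WASM SQLite + OPFS store,
// and routeRequest stands in for a service-worker fetch handler.

type Row = Record<string, unknown>;

// Stand-in for the real persistence layer (SQLite compiled to WASM, OPFS-backed).
class LocalDb {
  private tables = new Map<string, Row[]>();
  insert(table: string, row: Row): void {
    const rows = this.tables.get(table) ?? [];
    rows.push(row);
    this.tables.set(table, rows);
  }
  select(table: string): Row[] {
    return this.tables.get(table) ?? [];
  }
}

// The "proxy" part: claim /api/* requests and answer them locally;
// anything else would fall through to a real network fetch().
function routeRequest(
  db: LocalDb,
  method: string,
  path: string,
  body?: Row,
): { status: number; data: Row[] } {
  const match = path.match(/^\/api\/(\w+)$/);
  if (!match) return { status: 404, data: [] }; // not ours: pass to the network
  const table = match[1];
  if (method === "POST" && body) {
    db.insert(table, body);
    return { status: 201, data: [body] };
  }
  return { status: 200, data: db.select(table) };
}
```

The overhead of this hop is roughly one structured-clone message per query between the page and the worker, which is typically far cheaper than a real network round trip; the benefit over talking to SQLite directly is that the app keeps an unchanged HTTP-style interface.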
I am thinking the Willow Protocol would make a good base for local-first. There would be no privileged “backend”, but some peers can provide automated services.<p><a href="https://willowprotocol.org/" rel="nofollow">https://willowprotocol.org/</a>
I believe what is being proposed is a static site where user data is persisted locally using the WASM sqlite + OPFS. I guess it is also organized like a typical web app, but the app logic and database logic run locally.<p>I was expecting something different because it started with phrases like "no servers at all" and "entirely without any servers", but there's a regular web server serving static files.<p>I'm not a fan of the term "serverfree", though, since there is a web server. Also, the app and database servers from classic web apps continue to exist, albeit in a logical, local form. If this term somehow catches on for this style of app it will just cause endless confusion. I suppose it isn't a lot worse than some existing terms we've gotten used to (like "serverless"), but I'm always going to advocate to not repeat the mistakes of the past.
These ideas are some of the founding principles of "local-first software": <a href="https://www.inkandswitch.com/local-first/" rel="nofollow">https://www.inkandswitch.com/local-first/</a><p>As I like to put it: local-first is the real serverless. Your user's device is the real edge.<p>I think the future of the web needs to be one where the server is optional; we need our data (whether personal or our company's) to be on our own devices.<p>We are all carrying around these massively powerful devices in our pockets; let's use that capability rather than offload everything to the cloud.<p>One of the things I find most exciting about local-first (and I'm fortunate enough to be working on it full time) is the sync tech that's being developed for it. I think 2024 is going to be the year local-first goes mainstream.
What we need are updates that only include security patches... FEATURE CHANGES SHOULD BE OPTIONAL. All software tends to decrease in quality over time.
> One morning, as I was contemplating ways to procrastinate on doing marketing for SubZero without feeling guilty, an idea struck me. "I know, I'll engage in some content marketing... but what do I need for that? Oh, a cool demo project!" And just like that, I found a way to spend a month doing content marketing writing code.<p>I absolutely don't understand the point of this. Just reading the intro, it reads like technology for its own sake, just because you can. But what is the value, and what are the downsides?
Show me where you store the data.<p>If you store it on your phone, then it's not showing up on your other devices. If you lose or break your phone, then your data is gone. There are very few applications for which that's acceptable - basically just your calculator app.<p>If you <i>don't</i> store it on your phone, then it's stored on <i>some kind</i> of server, <i>somewhere</i>. Do you own and control that server, or does someone else? How does the application consume and update the data?
"It depends".<p>On one hand, if I were to have the goal to make a bunch of money, and software just happens to be the means to an end, making a gated software portal where I control everything would suit me very well. You get nothing until I get the money, and I only maintain what I want to maintain. (pretty much the model every SaaS has)<p>On the other hand, if I know I have a very small customer base, and everyone is making a lot of money or because of my program, and I don't really care that much about the money above a certain number, I might as well distribute it as a static/stale build. You get a binary, or a virtual machine or something like that, and it just does everything. Maybe if piracy were a concern I would add some sort of hardware dongle, but I would also be aware that it's going to get cracked anyway and the only people that are annoyed/limited by it would be my actual paying customers.<p>On the other more different hand (third hand?): if my program has requirements about robustness, locality or longevity, I would make sure it depends on as few things as possible, make sure that it's documented well enough for future users and administrators to run it on future environments, and perhaps not sell the software in itself as much as I'd sell support. The risk and downside is that specialised and unique software tends to be quite annoying and costly to create while there isn't a lot of telemetry or feedback to figure out what's working well and what isn't, so that would drive up the price significantly. I'd say you're looking at two orders of magnitude vs. a SaaS thing.
Mh, a small note: in the early days of IT, DocUIs were the norm. WebUIs are actually a <i>limited</i> and <i>limiting</i> form of DocUI: limited in that the user can't easily bend them to his/her own needs and desires, limiting because the WebVM underneath, the whole stack, is NOT made for end-user programming the way classic systems were.<p>Just look at a modern Emacs with org-mode and elisp links. It's a DocUI, with some limits, but far simpler than a WebApp in a local WebVM. And it's fully local, with full filesystem access and so on.<p>I'm curious how many more DECADES will be needed to reinvent the wheel and rediscover that classic local DocUIs are far superior, and can be networked just as well technically. My own local, fast GMail is notmuch-emacs; it's not a standalone app, it's fully integrated into my desktop and extensible by myself with a few SLoC. Where it falls short, it's not because of the model but because of the small development base. If we invested a fraction of the effort put into the modern web, the classic desktop would outshine any other tech.
I want this too; there isn't a great FOSS way to do this currently besides Supabase or rolling your own, unfortunately. For PWAs, the gold standard for data integrity is: save/write locally, try a network update, and refresh the local copy upon success. And for network reads, falling back to the local cache when offline is great for UX. I haven't found good tooling for this yet, and I've been looking.
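The save-locally-then-sync pattern described above can be sketched in a few lines. This is a minimal illustration, not a real library: `Store`, `MemoryStore`, `saveWithSync`, and `pushRemote` are all hypothetical names, and `MemoryStore` stands in for IndexedDB/OPFS persistence:

```typescript
// Sketch of "local save, try network update, refresh local upon success".
// All names here are illustrative; MemoryStore stands in for real persistence.

interface Store {
  save(key: string, value: string): void;
  load(key: string): string | undefined;
}

class MemoryStore implements Store {
  private m = new Map<string, string>();
  save(k: string, v: string): void { this.m.set(k, v); }
  load(k: string): string | undefined { return this.m.get(k); }
}

async function saveWithSync(
  local: Store,
  key: string,
  value: string,
  // resolves with the server's canonical version of the value
  pushRemote: (k: string, v: string) => Promise<string>,
): Promise<string> {
  local.save(key, value); // 1. the local write always succeeds, even offline
  try {
    const canonical = await pushRemote(key, value); // 2. best-effort network update
    local.save(key, canonical); // 3. refresh the local copy on success
    return canonical;
  } catch {
    return value; // offline: the local copy is the source of truth for now
  }
}
```

The key property is step 1: data integrity never depends on the network call, and a failed push leaves the local write intact to be retried later.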
this is very close to an inventory control app I built at work (with the exception that eventually when the client is online it will sync data to the server).<p>I've often thought, if I had the time and capability ... take it a step further. No server sync at all. Clients form a peer-to-peer network and sync data between themselves. (perhaps bluetooth or something like Apple's Bonjour etc)<p>Actually something like that, plus an optional server sync when server is available is really even better. I'm thinking specifically of a use case in large warehouses that often have no internet connectivity but in which there are multiple users performing inventory who are duplicating work because they don't know a peer already inventoried a specific area and neither of them can sync to server because no wi-fi.<p>dang even better. something like a bit-torrent swarm with something like an admin certificate for releasing code patches, and user-level certs for syncing app data.
So is this article advocating storing everything in a browser SQLite database that then syncs with another server somewhere else? But to do that, doesn't it need to call a server somewhere? I'm trying to understand, as it seems there are still servers here?
I like the idea of a hybrid of client-side software that optionally uses third-party cloud storage for cross-device sync. NetNewsWire, Byword (markdown editor), and Scrivener are some examples. Older versions of 1Password also had this, but I don't know if it still does.<p>Some of these programs managed this better than others. NetNewsWire and 1Password seemed to just work. Byword and Scrivener had occasional sync conflicts that had to be resolved. In general, though, this seems like a nice system: if you (the user) are subscribed to cloud storage, then you get syncing without paying for an extra service. If not, you can still use the software without syncing.
Imho, good integration with existing file-clouds is a good approach. Then it's just serverless but with helpers to say "Store the config on Google Drive", and you get free sync of your config between devices.
I like the concept and I'm building a similar framework, but I think there's some confusion in the implementation.<p>Does your server.ts (which uses Express) run in Vite or in a worker (which would require a lot of adaptation / might not even be possible)? If it runs under Vite, that's a server. Then the distribution of your app is compromised: either people run it locally and have to start the server, or they need to spin up a server somewhere. How is it "serverfree"?
I go to fairly great lengths to do everything in the browser to avoid having to support any backends (for the tools, etc. I make; & hobby project gamedev). It would be a great thing if (perhaps legislation) could break the app-store model, then fully-fledged apps could be distributed as web sites (with their own localstorage, etc.)<p>I wonder if there's some legal way of saying, "the web is critical communication infrastructure and all core comms devices need to support X standards"
Offline-first isn't enough.<p>I've done user support with users in the mountains who don't have a reliable internet connection.<p>Being able to say 'don't worry, the app works offline, you can (optionally) sync when you're next in the city' is extremely rewarding, and vital for software to work for these people.<p>----<p>Offline-only is not enough either. Ideally users should be able to sync between devices when offline, and have the option to sync to the cloud when online
I'm lucky enough to have had tons of time to build something with these values.<p>A huge blocker I didn't grok at the beginning is API keys. Unless the app interacts with 0 services, at all, you need edge functions that essentially just add an API key header to a request from the client.<p>It offends me because I don't want people to have to trust me, but...there isn't anyone who will recommend otherwise. :/
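The edge function described above is usually tiny: it rebuilds the outgoing request with the secret attached server-side, so the key never ships to the browser. A minimal sketch, where the key, upstream URL, and header name are all assumptions:

```typescript
// Sketch of an API-key-injecting edge proxy. The header name (Authorization /
// Bearer) and upstream URL are assumptions; real third-party APIs vary.

interface ForwardedRequest {
  url: string;
  headers: Record<string, string>;
  body?: string;
}

// Pure helper so the forwarding logic is testable without a network:
// same path and body as the client's request, plus the secret header.
function withApiKey(
  key: string,
  upstream: string,
  path: string,
  init: { headers?: Record<string, string>; body?: string } = {},
): ForwardedRequest {
  return {
    url: `${upstream}${path}`,
    headers: { ...(init.headers ?? {}), Authorization: `Bearer ${key}` },
    body: init.body,
  };
}

// At the edge, a handler would read the key from its environment, build the
// forwarded request with withApiKey, and send it with fetch(); the browser
// only ever sees the edge endpoint, never the key.
```

Since the edge function sees every request, it is also the natural place for per-user rate limiting, which is the usual argument for why the trust can't be avoided entirely.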
I like the author's article, but he makes no mention of the countless people who have tried to popularize similar ideas. Some examples:
<a href="https://unhosted.org/" rel="nofollow">https://unhosted.org/</a>
<a href="https://nobackend.org/" rel="nofollow">https://nobackend.org/</a>
What about installation and updating/patching? Or is the intention here to still serve this package of web code, db, etc over the net (via cdn maybe), and then execute locally?
I was actually thinking of this the other day, but taking it a step further by actually distributing computation such as worker queues in a peer-to-peer fashion.
So is this basically a fully peer-to-peer application, like BitTorrent clients?<p>Or something like Bisq (<a href="https://bisq.network" rel="nofollow">https://bisq.network</a>), where the program runs locally, peer to peer, and hosts all user data locally, but still pings oracle servers for outside market price data?
I believe what the author is describing is called <i>A Desktop Application</i>. They were these crazy forms of web apps, that ran on a local computer, stored data locally, and didn't use a server. Or a web browser. Legend has it that they used little memory, were fast and snappy, and enabled native integration with all of an operating system's capabilities, widgets, etc, while still allowing a user to completely control what happens with the data the program used.<p>Porting this type of application could take a lot of work! So at some point, somebody invented a programming language called Java, which used a thing called a Virtual Machine, so that one application could run the same way on any computer that could run the Java Virtual Machine. They called this <i>A Portable Desktop Application</i>.<p>Unfortunately, this obscure language was somewhat difficult to use, so this paradigm was abandoned for simpler languages, and with it, so went the idea of <i>A Portable Desktop Application</i>.<p>Decades later, somebody reinvented the idea. But it required a much more limited application platform, running inside a kind of hacked-together document viewer, with a simplistic programming language designed to control how documents were displayed. It took about 1000x as much memory and CPU, but, with addition of about 50 different frameworks and 30 different new custom APIs, you could finally get close to the same functionality as the first solution.<p>Ah, technological progress...
Once upon a time, at a Google town hall about 7 or 8 years ago where the SVP over GCP was present, a man wiser than I asked the question: "In computing, the pendulum has already swung a couple of times between client-centric and server-centric. What are we doing to prepare ourselves for the next swing back to client-centric?"<p>The SVP responded as if the guy asking that question had just stepped out of an alien spacecraft from Alpha Centauri. At the time, in that room, it seemed incomprehensible to most present how anybody could possibly bask in the glory of the Google infrastructure and then want anything other than that.
TL;DR: Ship the entire app in the browser.<p>This may sound a bit snarky, but here is a serverfree app for your enjoyment as well:<p><html><p>serverfree</p></html>
> By now, everyone knows that serverless doesn't actually mean without servers; it just means using someone else's servers.<p>What? Is this some kind of confusing reference to lambda and competing providers? As far as I know most of computing is serverless.