So basically Deno has its own bundler that lets you skip a local build step, and code gets bundled dynamically per-route as users request it, right? This is very different from industry standards and likely raises new concerns for devs, none of which are addressed in the article, since it treats the system as a perfect solution. That makes sense, since it's a marketing page ("content marketing"). If it were a third-party article, I'd be interested in things like:

- How do you measure bundle size and make sure it stays small? (e.g. make sure that a PR doesn't accidentally import a massive dependency; see the sketch below)

- How do you measure bundling speed/performance? Does it add significant time to the first request? To subsequent requests? Is it linear, exponential, etc. with the number of LOC? And as in the previous point, how do we make sure there are no regressions here?

- How does this work, well, with absolutely anything else? What if I want my front-end in React? Vue? etc.
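For the first question, one hedged sketch of what such a regression guard could look like in CI: bundle in-memory with esbuild and fail if the output exceeds a budget. The entry point and the 250 kB number are placeholders, not anything from the article.

```ts
// check_bundle_size.ts — hypothetical CI guard, not part of Deno or Fresh.
import * as esbuild from "esbuild";
import process from "node:process";

const BUDGET_BYTES = 250 * 1024; // arbitrary example budget

const result = await esbuild.build({
  entryPoints: ["routes/index.tsx"], // placeholder entry point
  bundle: true,
  minify: true,
  write: false, // keep output in memory; we only want its size
});

const size = result.outputFiles.reduce((n, f) => n + f.contents.byteLength, 0);
if (size > BUDGET_BYTES) {
  console.error(`bundle is ${size} bytes, over the ${BUDGET_BYTES}-byte budget`);
  process.exit(1);
}
```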
I worked on JS infra for Google. One thing we found in this space is that when your apps get very large in terms of number of source files, there is a developer-impacting load-time cost to the unbundled approach.

That is, your browser loads the entry point file, then parses it for imports and loads the files referenced from there, then parses those for imports, and so on. This process is not free. In particular, even when modules are cached, the browser will still make a request with an If-Modified-Since header for each file, and even against localhost that overhead has a cost. The impact is greater if you are developing against some cloud dev server, because each check costs a network round-trip.

However, this may only come up when you have apps with many files, which Google apps tended to do for protobuf reasons.
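To illustrate the waterfall being described (file names made up): the browser cannot discover `b.ts` until `a.ts` has been fetched and parsed, and cannot discover `c.ts` until `b.ts` has. That is one round-trip per level of import depth, even when every file answers 304 Not Modified.

```ts
// a.ts (the entry point the browser loads first)
import { b } from "./b.ts";
console.log(b());

// b.ts (only discovered after a.ts is parsed)
import { c } from "./c.ts";
export const b = () => c() + 1;

// c.ts (only discovered after b.ts is parsed)
export const c = () => 41;
```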
In other languages, devs build better tools for solving existing problems faster or more easily. In no other language are build tools so broken that the best tool changes every few years.

In the JavaScript world, a significant chunk of energy is directed inwards, solving problems created by using JavaScript!
I am perplexed by the focus on this. Clearly there are excellent devs working on Deno, but what setups are you running where the actual build is holding your productivity back? Developing in node/ts or rails, I don't think it would move the needle in the slightest for me. It's simply not an issue, outside of my brain finding beauty in any kind of optimization.

Is that all this is?
The decoupling of the URLs that host your dependencies from the URLs that host your application feels like an important uptime measure currently. If the URLs that host your dependencies go down in an npm world, you can't build and deploy new code, but your app is still up. It seems that if the URLs that host your dependencies go down in a Deno world, your app goes down if those dependencies have not yet been cached (even on the server).

Am I missing something? This might not be terrible if it becomes standard to host your own mirror internally.
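For reference, this is what the coupling looks like in practice. The import below is a real deno.land std URL, and the failure mode is the one described above; the mitigation comment is one option along the "host your own mirror" lines, not the article's recommendation.

```ts
// If deno.land is unreachable and this module isn't already in the local
// cache, this import fails at startup and the app can't boot.
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";

serve(() => new Response("hello"));

// One mitigation: `deno vendor main.ts` copies remote deps into a local
// vendor/ directory you control, so your app no longer depends on the
// dependency host being up.
```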
So Deno is relying on native ESM imports in production code? Isn't that exactly what Vite _doesn't_ do, because of poor performance?

When you run the Vite dev server it uses ESM, but when you build it uses Rollup, because serving raw ESM is slow: with larger apps the client browser is going to make a bazillion requests. Wouldn't you rather traverse the dependency graph one time and bundle your code into modules, so that everyone who visits your site doesn't force their browser to do it over and over again? Sure, those dependencies will be cached between views or refreshes, but the first load will be slow as shit, and then you still need to "code-split", just now you're calling it "islands".
My understanding of this is that this is trading a build step for just in time (JIT) compilation, which seems, ok? But it seems to me that you've just moved the problem around and I'm sure there are additional trade-offs (as with anything).
The history here is a little misleading. Client-side bundling happened before node/npm. It's a performance optimization to reduce the number of requests the browser has to make. Typically people were just concatenating files. Concatenating was a painful dependency-management challenge for larger code bases. Subsequently there were module systems like requirejs that also sought to fix some of these problems and ran without a build step in dev. Browserify really changed the perspective here, and people started to think a build step wasn't sooo bad.

I do think, based on the requirejs code, that commonjs/browserify didn't really need to be compiled anyway.

Also fwiw, the technique mentioned here is how a colleague and I introduced Babel to a large company as well: we just transformed on request and cached behind a reverse proxy in dev. And fwiw, webpack basically does this these days anyway.
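A rough sketch of that transform-on-request-and-cache trick, using @babel/core's real transformFileAsync API; the paths, port, and preset are placeholders for whatever a project actually uses, and a real setup would sit behind a caching reverse proxy rather than this naive in-memory map.

```ts
// dev_transform_server.ts — illustrative only, not the commenter's actual code.
import http from "node:http";
import { transformFileAsync } from "@babel/core";

const cache = new Map<string, string>(); // naive cache, keyed by file path

http
  .createServer(async (req, res) => {
    const path = `./src${req.url}`; // e.g. GET /app.js -> ./src/app.js
    let code = cache.get(path);
    if (code === undefined) {
      // Transpile on first request, then serve the cached result after that.
      const result = await transformFileAsync(path, {
        presets: ["@babel/preset-env"],
      });
      code = result?.code ?? "";
      cache.set(path, code);
    }
    res.setHeader("Content-Type", "application/javascript");
    res.end(code);
  })
  .listen(8080);
```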
> What exactly needs to happen to make server-side JavaScript run in the browser?

That sounds like an oxymoron to me. I honestly have no idea what they mean by that. To me, a browser is client-side software, so saying you want to run server-side JS on it doesn't make any sense. They mention it several times in the article, but I simply can't follow.

Could someone with a deeper understanding ELI5 this to me?
Of course Deno has a build step. The difference is you don't have to configure it, and it happens on demand rather than ahead of time.

It's definitely an improvement, but the title is misleading.
IMO, this post doesn't discuss the tradeoffs of removing the build step. What a "build" is has just been obscured. When you deploy an app, you still need to convert TypeScript into JS, and then the JS needs to be turned into an optimal representation for V8 to process.

For example, Fresh has a "build process" whose cost is paid for by the user [1]. You want to do these things *before* the user hits your page, and that's the nice thing about CI/CD: you can ensure correctness and you can optimize code.

In the interest of losing the build step, worse UX is traded for developer experience (DX). Instead, I would recommend shifting the compute that makes sense to the build step, and then giving developers the option to do other work lazily at runtime [2].

[1]: https://github.com/denoland/fresh/blob/08d28438e10ef36ea5965efc712b3d785b0a2aec/src/server/bundle.ts#L15

[2]: https://vercel.com/docs/concepts/incremental-static-regeneration/overview
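A minimal sketch of what "shifting the compute to the build step" can look like, using esbuild's real JS API; the entry point and outdir are placeholders.

```ts
// build.ts — run once at deploy time in CI/CD, not per request.
import * as esbuild from "esbuild";

await esbuild.build({
  entryPoints: ["src/client.tsx"], // placeholder entry point
  bundle: true,
  minify: true,
  format: "esm",
  outdir: "dist",
});
// Every user then gets the pre-built artifact from dist/ instead of paying
// the bundling cost on the first request.
```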
I'm certainly not the foremost expert in JavaScript build systems, but this just seems wrong.

Reducing build times (or eliminating the build step) by moving things to runtime is a great idea for a debug build/mode. But why is it a good idea not to have a separate release build that optimizes for runtime performance?
<a href="https://github.com/denoland/deno/issues/1739">https://github.com/denoland/deno/issues/1739</a> just crossed 4 years. if they want people to do transpiling inside their own tool create an API so we can use our own tools rather than ones behind their black box.<p>Would be a much better use of their time than writing this nonsensical bs
> And to make your Fresh app more performant, all client-side JavaScript/TypeScript is cached after the first request for fast subsequent retrievals.

My understanding is that the client-side JS is the result of backend compilation. How does caching work if the backend is dynamically generating those JS files? A page can return different JSX based on what `getPosts()` returns. No?
They make you read an entire article about how bad build steps are, only to present you with the (no less appealing) alternative of JIT compilation with URLs. This does nothing to improve the "sea of dependencies" problem they spent so much time pointing out as a bad thing.
I'm not sure why so many people are interested in not having a build step. You'll still want a step that runs the TypeScript compiler to check for errors, a linter, maybe your tests, and other stuff.

Or do people just want to YOLO it and let it crash in prod?
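Even without a bundler, those checks can live in one explicit step using Deno's own CLI; a minimal sketch, where `main.ts` is a placeholder entry point:

```json
{
  "tasks": {
    "ci": "deno check main.ts && deno lint && deno test"
  }
}
```

With that in `deno.json`, running `deno task ci` locally or in CI gives you the type check, lint, and tests before anything ships.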
Deno didn't have a structure that caught my attention in its early stages, but now I see it going in the direction I need. I guess I need to start experimenting with a side project.
I think new features move into browsers pretty fast nowadays? The new stuff introduced is overly hipster, while the old stuff still needs a lot of polish. We keep getting the means to do all kinds of new things, and (for lack of better words) it progresses forwards; the sum of things adds up to greater things. The (almost) opposite approach is to look at what people are actually doing, then make a single thing that does that directly, without 100 weird steps, and update it to make it better, just like modules do.

Everything HTML does looks like it was slapped together in a weekend. If you look at the spec it is obvious a lot of work went in, but the default behavior never fails to disappoint. One example out of hundreds: we wanted a range selector and a slider; we got a slider, and they called it "range". How do I do a range now and make it look the same as the slider? Oh, I write both from scratch? lmao
Half of JSON's greatness is in how sad the XML tools are, but if I compare both to SQL I wonder how I get any work done at all! Imagine a form was just JSON: JSON in and JSON out. Dynamically creating form fields and populating them from a deeply nested JSON, allowing the user to add fields, then trying to get the JSON toothpaste back into the tube was a truly hilarious adventure. I eventually just set each field's value attribute to the value of the form field, then stored the HTML in the DB.
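For what it's worth, the round trip being described is roughly this; a naive sketch (all names made up) that also shows where the toothpaste escapes the tube: arrays come back as objects with string keys, so the rebuilt value isn't the original.

```ts
// Flatten a nested object into dot-path form-field names and values.
function flatten(obj: unknown, prefix = "", out: Record<string, string> = {}) {
  if (obj !== null && typeof obj === "object") {
    for (const [k, v] of Object.entries(obj as Record<string, unknown>)) {
      flatten(v, prefix ? `${prefix}.${k}` : k, out);
    }
  } else {
    out[prefix] = String(obj); // every leaf becomes a string, too
  }
  return out;
}

// Rebuild an object from the submitted fields — lossy in both shape and type.
function unflatten(fields: Record<string, string>) {
  const root: Record<string, unknown> = {};
  for (const [path, value] of Object.entries(fields)) {
    const keys = path.split(".");
    let node = root;
    for (const k of keys.slice(0, -1)) {
      node = (node[k] ??= {}) as Record<string, unknown>;
    }
    node[keys[keys.length - 1]] = value;
  }
  return root;
}

// flatten({ user: { tags: ["a", "b"] } }) -> { "user.tags.0": "a", "user.tags.1": "b" }
// unflatten of that -> { user: { tags: { "0": "a", "1": "b" } } } — not an array anymore.
```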
Did you know JS has an XPath implementation? Not that one could use it, but there it is. haha

I really think with some love we could just go back to writing html/js/css directly. Maybe it is just that I fail to see the point of nodejs.
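It's real: `document.evaluate` is the browser's built-in XPath API and has been around for ages. A tiny example:

```ts
// Snapshot every anchor that has an href, then walk the results.
const result = document.evaluate(
  "//a[@href]",
  document,
  null,
  XPathResult.ORDERED_NODE_SNAPSHOT_TYPE,
  null,
);
for (let i = 0; i < result.snapshotLength; i++) {
  console.log((result.snapshotItem(i) as HTMLAnchorElement).href);
}
```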
I tend to stick with script tags as much as I can. Really the problem is all the frameworks pushing people toward a build step. Their excuse is optimizing code size, but in most cases that matters little; I don't mind including all of Tailwind or Font Awesome.

So please, if you own a framework like this, make sure a script tag with a CDN link is easily copyable.
This article is spot on. As a developer, I don't want to see the build step. As soon as you expose the mechanics of the build and transpilation process to developers, you add a ton of complexity; for example, it opens up the possibility of transpiler and JS engine version incompatibilities. Devs should only need to concern themselves with one number: the version of the engine that runs their code. If they need to worry about the engine version and the transpiler version separately, it makes code less portable, because you can't just say "This library runs on Node.js version x.x." It sucks to come across a library which works on your engine version but relies on a newer version of TypeScript... It's like hoping for the planets to align sometimes.
The dumbest thing to me about people building JavaScript is that you burn all of the energy and labor of building with almost none of the meaningful benefits.

No one is building and ending up with bundles that reduce the bloat of the web; you can't tree-shake your way out of bad practices. Articles and real lived experience show us that the web is still bloated.

And why are we transpiling anything? If people want to flirt with building, I wish JavaScript engineers would just build an implementation that compiles to machine code or an intermediate representation.

Which is it? Do you want to be a scripting language, or a programming language that compiles to something? It's so gross to me.
Pretty sure we were building code to validate it and unit test it before releasing it.

This article is strictly about JavaScript. What about all the server-side rendering frameworks?

It's a huge assumption that everything is built one way.
Great clickbait title; it makes people wanna jump into the comments and say the OP is wrong.

I mean, if we wanna get really pedantic about it, then yes, there will always be a build step no matter what you do; one could argue saving the file and alt-tabbing to the browser is a build step. But that's not the point, is it? The idea is to lower that friction as much as possible, and JIT is perfect for that.
I thought this would be an article about adding import maps to Deno, which would be great.

I hope builders start adding them (at least to dev instances) to decrease the magic.

I was trying to use import maps, but they're not trivial to create, actually.

There are always problems with Node lagging behind browsers, though, which makes developing hard (no WebSocket support by default, for example, and the crypto module is also not included).
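For reference, a minimal import map as browsers support them today; the CDN URL is just an example, not a recommendation:

```html
<script type="importmap">
{
  "imports": {
    "lodash": "https://cdn.jsdelivr.net/npm/lodash-es@4.17.21/lodash.js"
  }
}
</script>
<script type="module">
  // The bare specifier "lodash" now resolves via the map above — no bundler.
  import { debounce } from "lodash";
</script>
```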
Putting JS in the backend was a tragic mistake born of laziness and ignorance. It should not be a surprise that everything else that followed suffers the consequences of these root traits as well. JS was barely acceptable when it was caged in browser-land, and we'd all be better off had it stayed there.

FE development has some unique challenges, but in my experience a lot of people who work in this domain try to find their own solutions to problems that have already been solved decades prior. There's a reason the build chains are fragile and a nightmare to configure, that package lists are out of date the moment they're published, and that it takes a sustained effort to keep a project viable even if you're not adding features or fixing bugs. It's absurd, and it's the status quo.

To take this into other areas of development (like BE, for example) simply because that's what you're familiar with... it really is a special kind of masochism.
So, if I'm using URLs for dependencies, effectively I can't code while I'm offline? I know it's not the norm, but there have been plenty of times I needed to work without internet.
Sure, these frameworks may do just-in-time transpilation and compression; that doesn't mean you don't have a build step.

Copying the code to the server becomes the build step.

Except now you have no chance to lint the code before shipping it.

"But I can lint it on my machine."

Good, then you have a build system, and you may as well just put the optimized stuff on the server, since server startup time depends on your code size at some point or another, and you pay for that.
Maybe they could add a new transpiled language as well, like Civet [0] or ReScript [1]. Not every project needs to be a C#/.NET clone in TypeScript :)

[0] https://civet.dev/

[1] https://rescript-lang.org/
Certainly, in terms of writing scripts, Deno seems nicer. Like, if I want to write TypeScript that just runs and does something, without having to carry around a node_modules folder, etc., Deno seems like it might be nice.
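For example, a whole "project" can be one file; the URL and regex here are illustrative:

```ts
// fetch_title.ts — a self-contained Deno script: no package.json, no node_modules.
// Run with: deno run --allow-net fetch_title.ts
const res = await fetch("https://example.com");
const html = await res.text();
console.log(html.match(/<title>(.*?)<\/title>/)?.[1]);
```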
Deno is the "these go to 11" of the Node.js world.<p>Creating a whole fork simply to not build TypeScript, reinvent a worse package management system and a useless security harness.