I met Aiden (the under-20 who started Million) a year or so ago, when he presented Million in front of a room full of 40+ grizzled JS devs. I personally don't see any reason to use Million.js: React is fast enough as it is if you memoize and use selectors correctly. Aiden said some similar things at the time (a solution in search of a problem that got unexpectedly popular), but I gotta say, he's a hype man for sure. I wish him luck; I think if he made a more compelling library, it would be a rocket ship with his marketing. I do think he should take some hints from the other post regarding the deceptive benchmarks and make sure he can back up his marketing materials, but 14k stars on GitHub for something that (to me) seems pretty useless is truly bananas skill.
Or people could structure their React components efficiently and be fine as is.

I'm sure I'll get a lot of downvotes for that.

But after 10 years of using React, most performance issues I've seen were really just poor state management in the wrong locations, not the virtual DOM holding things back.
> Instead of traversing every node, Million uses a compiler to directly update dynamic nodes, resulting in O(1) time complexity.

This sounds very hand-wavy. What does it mean to "use a compiler to directly update dynamic nodes"?
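My rough mental model of what that could mean, sketched in plain JS. To be clear, this is a guess at the "block virtual DOM" idea, with mock objects standing in for DOM nodes; nothing here is Million's actual code or API:

```javascript
// Hedged sketch: a "compiled" template records which nodes are dynamic
// at build time, so updating skips tree traversal entirely.
// Mock "DOM" nodes are plain objects here, not real elements.
function compileTemplate() {
  const title = { tag: "h1", text: "" };   // dynamic hole #0
  const count = { tag: "span", text: "" }; // dynamic hole #1
  const root = {
    tag: "div",
    children: [title, { tag: "p", text: "static text" }, count],
  };
  const holes = [title, count];
  // The returned patch touches only the recorded holes: cost is
  // O(number of dynamic holes), not O(total nodes) like a full diff.
  function patch(values) {
    values.forEach((v, i) => {
      const next = String(v);
      if (holes[i].text !== next) holes[i].text = next;
    });
  }
  return { root, patch };
}

const { root, patch } = compileTemplate();
patch(["Hello", 1]);
console.log(root.children[0].text); // "Hello"
patch(["Hello", 2]);                // only the <span> hole is touched
console.log(root.children[2].text); // "2"
```

If that's roughly what's happening, "O(1)" presumably means "constant in the size of the static tree," not literally constant overall, since you still pay per dynamic hole.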
OP lied about benchmarks in the past:

https://www.reddit.com/r/javascript/comments/x2iwim/askjs_millionjs_claim_of_11x_performance_increase/
Million's optimizations are only relevant if you're rendering a large number of identical stateless components (exactly like the JS Framework Benchmark).

Real-world applications are mostly deep trees of stateful components.
How does M3 compare with Svelte?

Seems like these are two conceptually similar things.

> React traverses the virtual DOM tree to update the UI, resulting in O(n) time complexity.

That's the worst case, on initial load. On most UI changes, nothing stops React from updating only local portions of the tree: the elements whose state actually changed.

Educated guess: in both Million's and React's cases, the major bottleneck is the browser's re-layout mechanism, not the JS side, I think.
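To illustrate the "local portions" point, here's a toy diff in plain JS. This is not React's actual reconciler, just a sketch of the bailout idea: when a subtree is referentially unchanged (as with memoized or reused elements), the whole subtree is skipped:

```javascript
// Hedged sketch: a diff that visits nodes only until it hits a
// referentially identical subtree, which it skips entirely.
// Typical updates then cost far less than O(total nodes).
let visited = 0;
function diff(prev, next) {
  if (prev === next) return; // bailout: identical subtree, skip it all
  visited++;
  const len = Math.max(prev.children.length, next.children.length);
  for (let i = 0; i < len; i++) diff(prev.children[i], next.children[i]);
}

const leaf = (text) => ({ text, children: [] });
// The sidebar object is reused between renders, like a memoized child.
const sidebar = { text: "sidebar", children: [leaf("a"), leaf("b"), leaf("c")] };
const prev = { text: "app", children: [sidebar, leaf("count: 1")] };
const next = { text: "app", children: [sidebar, leaf("count: 2")] };

diff(prev, next);
console.log(visited); // 2: the root plus the changed leaf; sidebar skipped
```

So the O(n) framing only really bites when nothing in the tree can bail out.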
I've built a real-time updating app using React and I'm struggling to see the benefit here. For the very core parts I'm already avoiding the React overhead by using useRef to maintain the same object. This reduces "hydration" and traversal to nil cost, with no new concepts to learn. Why would I use Million?
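Roughly the pattern I mean, minus React itself. This is plain JS with an identity check standing in for a memoized component's shallow compare; the `ref` object here just mimics what `useRef` hands back, it's not React's implementation:

```javascript
// Hedged sketch: keep one stable object and mutate it in place, so
// identity checks (like React.memo's shallow compare) see "no change"
// and skip re-rendering entirely.
const ref = { current: { x: 0, y: 0 } }; // stands in for useRef(...)
let renders = 0;
let lastSeen = null;

// A memoized consumer re-runs only when the object identity changes.
function maybeRender(obj) {
  if (obj === lastSeen) return; // identity check, like memo's bailout
  lastSeen = obj;
  renders++;
}

maybeRender(ref.current); // first render
ref.current.x = 42;       // mutate in place: same identity
maybeRender(ref.current); // skipped, no re-render triggered
console.log(renders);     // 1
```

The trade-off, of course, is that nothing reading `ref.current` re-renders on its own; you have to push updates to the DOM (or a canvas, etc.) yourself.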
I haven't used React in a while and never used SolidJS, but wouldn't SolidJS basically be an optimizing compiler for "React" as well? Technically SolidJS is a separate framework, and I don't know if it's a 100% drop-in replacement like this may be.

Edit: My comment is probably not accurate. Please ignore what I said.
This looks really cool! It's interesting that this compiler provides performance optimizations by looking for data diffs rather than diffs in the Virtual DOM. Is this intended to be an alternative to React Forget (still in development)?
Never heard of this, sounds super cool! Gratz on the milestone!

Theoretically, could this be merged into the main React project, or would that break something?
I'm not a React dev, so I can't comment on the project itself. Something I noticed on the blog post, though:
The image at the top of the page is served uncompressed at a whopping 18.5MB (9751px * 6132px)! Seems a bit extreme for what amounts to a simple logo and some text.
Kudos to the team, but why on earth should I choose React when we've now reached a point where it needs an optimization compiler? Seems silly, to be honest.