Very nice service, but how do you manage CDN subresource integrity (SRI) if the packages served are different for each individual user?

It's possible to check subresource integrity with ES6 module imports, but only if you know the hash in advance (https://stackoverflow.com/questions/45804660/is-it-possible-to-use-subresource-integrity-with-es6-module-imports).

Even webpack with webpack-subresource-integrity won't handle this case (https://www.npmjs.com/package/webpack-subresource-integrity).

Of course HTTPS is strong, but it's not a foolproof defense against man-in-the-middle attacks.
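For context, here's a minimal sketch of where SRI currently applies to module scripts; the URL is illustrative and the hash is a placeholder, not real Pika output:

```html
<!-- SRI covers only this top-level module script, and the hash has to be
     known when the page is authored. If the CDN serves a different build
     per user-agent, no single hash can match every response. -->
<script type="module"
        src="https://cdn.pika.dev/preact"
        integrity="sha384-PLACEHOLDER_HASH"
        crossorigin="anonymous"></script>
<!-- Any import statements inside that module are fetched with no
     integrity check at all. -->
```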
My company would buy into this if there were some kind of boilerplate contract you sold to a business/institution vs. the Patreon page you have right now.

Just FYI, you're missing out on some dollars because of that. For better or worse, the bean counters at my workplace won't approve anything less. I have a feeling I'm not alone.

If you can quickly whip up some boilerplate business checkout with an invoice, you'd make more than a few dollars today.
This is fantastic - exactly what I've been looking for but didn't know I was looking for it! I've been re-importing libraries by doing something weird like `import './three.js'; export default window.Three;` so I can use it as a normal module.

I love not having to use build tools for my personal projects anymore - everything feels so light and "old school". Here's my Minecraft-ish clone in native modules and WebGL2: https://github.com/mrspeaker/webgl2-voxels. No dot files, nothin' to build... just view source!
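For anyone who hasn't seen that workaround, here's roughly what it looks like next to a direct ESM import. File names and the CDN URL are illustrative, and it assumes the library both attaches a global (three.js's UMD build exposes `window.THREE`) and ships an ES build:

```js
// three-shim.js (the old workaround): load the classic script for its
// side effects, then re-export the global it attaches to window.
import './three.js';
export default window.THREE;

// Elsewhere, with an ESM-aware CDN, no shim file is needed at all:
import * as THREE from 'https://cdn.pika.dev/three';
```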
This is a very good development for frontend work. Build systems like webpack were useful technology in earlier days, but today they present a big hurdle for newer, less experienced developers entering the frontend space. I would love to see a future where we can again serve a frontend in development just by running a web server from a folder.

I do wonder how modular CSS fits into the picture of ES modules, though.
If you've got a site with decent levels of traffic, host libraries yourself rather than using a JS CDN.

Retrieving critical content from a 3rd-party CDN has a number of issues:

- A new TCP connection has to be created, with the added cost of TLS negotiation and its own slow-start phase

- If you're using HTTP/2, prioritisation only occurs over a single connection, so content from the CDN can't be prioritised against your other content
`curl -i https://cdn.pika.dev/preact` redirects me to a dist-es2019 package (I assume because it detects my user-agent supporting that) but isn't showing anything like a `Vary: User-Agent` header.

Won't this break for any situation in which users with different browsers share a proxy server?

(Also tried with Chrome, didn't see a Vary there either.)
The differential serving sounds like a neat idea. Naturally, everyone not using the newest version of Firefox or Safari will go to hell eventually, but until then it could really improve the web for a lot of people.
This is way cool. I recently started a new app and decided to see how far I could get without a build tool. My early impressions left me wanting to write a blog post, "ES Modules Make JavaScript Fun Again." The whole development cycle felt clean and simple.

Ultimately, though, I got hung up on dependencies. For a while I was just including things directly from node_modules/. But npm flattens things so that a library's location is not predictable (this crops up when an ES module dependency tries to look in its own node_modules/ directory for another ES module dependency, but that dependency has actually been flattened to the top level). So you're basically stuck downloading all your dependencies (and their dependencies) manually.

This isn't 100% a bad thing - it pushes you to use smaller dependencies with fewer sub-dependencies. You're also stuck using libraries that export an ES module. Pika could be just the ticket to bridge these gaps.
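A concrete sketch of the flattening problem described above, with hypothetical package names:

```js
// app.js - serving node_modules/ straight to the browser:
import { thing } from './node_modules/some-lib/index.js';

// Inside some-lib/index.js, the library resolves its own dependency with
// a relative path, expecting a nested install:
//
//   import helper from './node_modules/sub-dep/index.js';
//
// npm usually hoists ("flattens") sub-dep to the top-level node_modules/,
// so that relative URL 404s in the browser, even though Node's resolver
// would have found the package without trouble.
```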
This wouldn't work with a standard React project though, right? Because you still need to transpile JSX. You could transpile JSX in the browser during development, I guess (e.g. with Babel standalone), which is slower, but that's not something you want to ship.

I'd love to use something like this for teaching, tutorials, and even small projects, but there are some things I still need a transpiler for.

I also realize I could use the `htm` package instead of JSX, which gives a lot of benefits over JSX, including not requiring transpiling, but since it's not widely used by the wider ecosystem, I'd be a little hesitant to include it in my projects.
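For reference, the `htm` route mentioned above looks roughly like this (shown with Preact, which ships native ES modules; the same `htm.bind` pattern works with `React.createElement` - the CDN URLs are assumptions):

```js
import { h, render } from 'https://cdn.pika.dev/preact';
import htm from 'https://cdn.pika.dev/htm';

// Bind htm to the JSX factory; tagged template literals stand in for
// JSX, so no transpile step is needed.
const html = htm.bind(h);

function App({ name }) {
  return html`<h1>Hello, ${name}!</h1>`;
}

render(html`<${App} name="world" />`, document.body);
```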
<a href="https://www.pika.dev/search?q=jquery" rel="nofollow">https://www.pika.dev/search?q=jquery</a> -- so jQuery is not "modern" any more? That's quite surprising, giving for instance the dependency of the <a href="http://semantic-ui.com/" rel="nofollow">http://semantic-ui.com/</a> framework on jQuery (<a href="https://github.com/Semantic-Org/Semantic-UI/issues/1175" rel="nofollow">https://github.com/Semantic-Org/Semantic-UI/issues/1175</a>)
What is the business model? Where does the money come from to pay for development and hosting? This is the question I'm left with.

Nothing is free, and I didn't find this on Crunchbase.

Something is paying for it. Is it tracking people and selling the data?
Looks great, but I think the homepage should do more to convince me that I can trust it. Who runs it, how is it funded, is there any guarantee they won't run out of money and shut down, etc.
Can anyone explain how the differential serving works?

I get that they might have a User-Agent mapping to features. But how do they know which features are needed by the loaded modules?
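My guess (not confirmed against Pika's implementation) is that they don't analyse which features each module actually uses - they pre-build every package for a handful of ES targets and pick one per request based on the User-Agent, something like:

```js
// Hypothetical sketch of User-Agent based differential serving.
const TARGETS = ['es2019', 'es2018', 'es2015']; // assumed build targets

function pickTarget(userAgent) {
  // Map the browser version to the newest syntax level it supports.
  if (/Chrome\/7[3-9]|Chrome\/[89][0-9]/.test(userAgent)) return 'es2019';
  if (/Chrome\/6[0-9]|Firefox\/(5[89]|6[0-9])/.test(userAgent)) return 'es2018';
  return 'es2015'; // conservative fallback for anything unrecognised
}

// A request for /preact would then redirect to something like
// `/-/preact/dist-${pickTarget(req.headers['user-agent'])}/preact.js`.
```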
Pika CDN seems to facilitate user tracking by the CDN better than the current JS CDNs can (with simple browser privacy features that browsers should be doing already).

Also, it wasn't clear to me whether they support SRI or an equivalent supported by the browser. If they don't, it could also be a centralized vulnerability for user-targeted injection.

(Solution: the best sites will pay to serve their own JS.)
I love the idea of a more efficient CDN for JS (and code overall!), but it isn’t clear to me how this handles the multitude of versions. None of the examples seem to include versioning, which is a huge oversight IMO.
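If Pika follows the unpkg-style convention of pinning versions in the URL path - an assumption on my part, not something I've verified - it would look something like:

```js
// Hypothetical versioned imports; exact syntax unverified.
import { h, render } from 'https://cdn.pika.dev/preact@10.0.0';   // exact pin
import { Component } from 'https://cdn.pika.dev/preact@^10.0.0';  // semver range
```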
A future I see for this sort of thing is IPFS: all objects identified uniquely, but cacheable by multiple entities.
I built a repo like this but for require() (CommonJS), where a package's dependencies were sent along with the first request using HTTP/2 push. The only problem was that browsers didn't cache the pushed files and re-requested them. Hopefully browsers will fix this, or latency will be a huge problem with dependency trees several layers deep.
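For anyone curious, the kind of setup described is roughly this - a minimal sketch using Node's http2 module, with placeholder file and cert names:

```js
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('localhost-key.pem'),   // placeholder certs
  cert: fs.readFileSync('localhost-cert.pem'),
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/app.js') {
    // Push the dependency before the browser has parsed app.js and
    // discovered that it needs it.
    stream.pushStream({ ':path': '/dep.js' }, (err, pushStream) => {
      if (err) return;
      pushStream.respondWithFile('dep.js', { 'content-type': 'application/javascript' });
    });
    stream.respondWithFile('app.js', { 'content-type': 'application/javascript' });
  }
});

server.listen(8443);
```

The catch the commenter runs into: pushed responses live in a per-connection push cache rather than the regular HTTP cache, so later requests tend to fetch the same files again.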
Just a comment on the name: as a Python dev, when I saw Pika I immediately thought of the RabbitMQ Python package: https://pypi.org/project/pika/

It may or may not be an issue for this project; just bringing it up for visibility.
Their example doesn't work for me - it's just blank. Looks like CORS issues?

https://pika-cdn-example.glitch.me/
Isn't pika where you're eating things you're not supposed to? So by using this cdn your computer is eating things it's not supposed to?