Apparently some people are <i>trying</i> to open up as much attack surface as possible.<p>If you're implementing this... please <i>stop</i>. This <i>will</i> cause serious problems. There is no reason whatsoever to allow remote access to USB devices.
This spec is one of the most interesting features to come to the web. I can think of at least three ways I can use something like this to deliver interesting experiences to users, which would not otherwise be possible.<p>Yes, it increases the attack surface. This is a real danger, and should not be taken lightly. But no one is taking it lightly.<p>Another danger, which I haven't seen mentioned yet, is that it increases your fingerprint. If a website can list the Web USB devices currently connected, that's a way to deanonymize you.<p>But if you set aside those negative qualities, WebUSB, I think, is one of the more important features on the horizon.<p>For example, I could build and ship a light sensor. The idea is that you'd connect it to your laptop, and websites would be able to react to your environment's lighting + your current screen brightness. Think dynamic color schemes that always look good no matter what kind of monitor you're using, or whether it's night or day. This is only possible with hardware; no amount of software will make this feature available. And I can't think of any convenient way to do it other than WebUSB.
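The light-sensor idea above can be sketched without any real hardware. In this hypothetical, the ambient-light reading would come from the WebUSB sensor and the screen brightness from wherever the page can obtain it; the function name, thresholds, and theme names are made up for illustration.

```javascript
// Hypothetical theme picker for the light-sensor use case.
// ambientLux: reading from the (imagined) WebUSB light sensor.
// screenBrightness: 0..1, from wherever the page can learn it.
function pickTheme(ambientLux, screenBrightness) {
  // Rough "washout" score: bright surroundings plus a dim screen
  // make content hard to read.
  const washout = (ambientLux / 1000) * (1 - screenBrightness);
  if (ambientLux < 50) return 'dark';        // dim room: dark theme
  if (washout > 0.5) return 'high-contrast'; // sunlight on a dim screen
  return 'light';
}
```

The interesting part is that only the first input requires hardware; everything after the sensor read is ordinary page logic.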
So I see a lot of negative comments about the security implications of this. Presuming that the browser gates access to USB devices the same way it gates access to geolocation, what's the risk here? How is it different from installing 3rd party software on your OS?<p>I've read about the "inner platform" effect, and I don't know. I'm just not convinced it's a bad thing here. For all its warts, the web is far and away the best cross-platform application delivery system. Use this in conjunction with things like IPFS and browser-based persistent storage, and you can run signed versions of trusted, auditable code that does lots of awesome stuff. Without users having to futz with installers.<p>What am I missing?
`First, so that the device can protect itself from malicious sites it can provide a set of origins that are allowed to connect to it.`<p>Sounds kinda DRM-ey. The CORS system works because the owner of a URL is the person who can attach headers to responses. That doesn't sound the same in this case, as the owner of a device is the person who owns it, not the manufacturer who decides what metadata it broadcasts (though obviously working out what the UX should be is a hard problem)
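To make the objection concrete, here is a toy model of the allowed-origins gate the quoted sentence describes. The field name `allowedOrigins` and the fallback behavior are assumptions for illustration, not the real WebUSB descriptor format.

```javascript
// Toy model of the manufacturer-controlled origin allowlist.
// `device.allowedOrigins` stands in for metadata baked into the
// device at manufacture time (an assumption, not the spec's format).
function mayConnect(device, pageOrigin) {
  // No allowlist advertised: leave the decision to the user prompt alone.
  if (!device.allowedOrigins) return true;
  // Allowlist present: the manufacturer, not the owner, made this call.
  return device.allowedOrigins.includes(pageOrigin);
}
```

Note that nothing in this check involves the device's owner, which is exactly the asymmetry with CORS being pointed out above.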
It would be nice if this spec were implemented in major browsers, allowing JavaScript access to USB crypto tokens.<p>In my country (and, AFAIK, in many other places) those tokens are used to provide some government services which require a digital signature. A few years ago Java applets were widely used to provide that functionality; there was no other way for JavaScript to access a crypto token. It wasn't that smooth, but it worked. Now that Java applets have started to disappear from major browsers, websites require you to download a program which must be run; it listens on a local port, and the website uses a websocket to communicate with it. This process is much more complicated for the user, especially on non-Windows OSes.<p>If JavaScript were able to access USB crypto devices, the user experience improvement would be huge.
This is actually pretty exciting. I can see being able to plug in to an arbitrary computer and visit <a href="https://www.yourdevicemanufacturer.com/" rel="nofollow">https://www.yourdevicemanufacturer.com/</a> to manage your personal devices really taking off. No drivers or shitty apps to install or update? This will be huge.<p>I can see why people would be concerned about attack surface, but if you're plugging your devices in to foreign USB ports, then you're potentially 0wned already. At least this way, if it's done right with regard to permissions, the driver side will always be up to date. Plus Javascript is memory-safe.<p>Holy freakin ghost though, this should be HTTPS only.
What's next, WebFirewire? WebSATA? WebPCIe?<p>USB is enough of an attack surface locally. I don't think I'd ever trust a web page (or app, as the case may be) with access to my USB devices.
Though I agree with the concerns about security, I can personally think of a cool use case: <a href="http://knightos.org" rel="nofollow">http://knightos.org</a>. It'd be nice if users could just plug in a calculator and install KnightOS on it from any web browser. Ditto with <a href="https://packages.knightos.org" rel="nofollow">https://packages.knightos.org</a> - plug in a calculator and click a button to install the package.
Maybe the NSA thinks they don't have enough control over our machines? "If only we could control their USB devices too"...<p>Then the FBI asks/orders Yubico (or others) to help bypass USB dongles remotely?<p>I don't think I'll trust such a thing...
This is what happens when the generation that brought us front ends that require 450MB of memory to display a web page start aging and join committees.
This trend of shoving as much functionality into browsers as possible is really worrying. It's indicative of lazy architecting, because most of the time the complex features you desire are fully available using native OS APIs. There does not need to be a bridge for every feature in the browser. For apps using OS-level APIs, more often than not, users will need to grant some permission to access their device. The permission scopes in modern operating systems are very strict and hardened. It would be very difficult for an app to execute a "drive by" exploit without first requesting some form of permission. At the very least, the user needs to explicitly download the app onto their device (as opposed to simply viewing a webpage). The permission APIs of browsers, compared to the OS's, are less mature, operate in userspace, and are inherently less secure.<p>If an exploit exists in the browser, suddenly every device with that browser becomes vulnerable to a drive-by exploit, as opposed to every device that downloaded a single malicious app. The user may be completely unaware, because there is no download confirmation or permission granting involved in viewing a webpage (assuming the exploit bypasses the weak permissions API of the browser). A malicious actor could load the exploit into an iframe via an ad network and the user would have no idea.
A demo using the WebUSB API with Arduino:
<a href="https://github.com/webusb/arduino" rel="nofollow">https://github.com/webusb/arduino</a>
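For reference, the host side of a demo like this boils down to a short sequence of WebUSB calls. This sketch assumes the vendor ID (0x2341, Arduino), configuration, interface, and endpoint numbers used by the webusb/arduino example; verify them for your board. The `usb` object is passed in so the flow can be exercised outside a browser; in a page you'd pass `navigator.usb`.

```javascript
// Sketch of the host side of a WebUSB/Arduino demo.
// Assumed values (from the webusb/arduino example, check your board):
// vendor 0x2341, configuration 1, interface 2, OUT endpoint 4.
async function sendToArduino(usb, payload) {
  // Triggers the browser's device-picker prompt (needs a user gesture).
  const device = await usb.requestDevice({ filters: [{ vendorId: 0x2341 }] });
  await device.open();
  await device.selectConfiguration(1);
  await device.claimInterface(2);
  // Vendor-specific "host ready" signal used by the demo firmware.
  await device.controlTransferOut({
    requestType: 'class',
    recipient: 'interface',
    request: 0x22,
    value: 0x01,
    index: 2,
  });
  await device.transferOut(4, payload);
  return device;
}
```

In a real page this has to run from a user gesture (e.g. a click handler), and WebUSB is only exposed in secure (HTTPS) contexts.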
A lot of points about the dangers of exposing a USB device to the internet. I'm concerned about a future where all my USB devices require internet access. Is network chatter destined to increase until nothing we own will function without phoning home? It's a firewall headache.
I'm surprised there's no Chromium or Firefox fork which removes the microphone, webcam, JS local file access, and this new vector, to name just a few.<p>Alternatively, maybe Servo will be modular enough that I could build a custom version without many of these features.
The ethical, security-conscious part of me is saying "No, no no, no!"<p>The unethical, job-security-conscious part of me is pleading "yes yes yes!! Add more devices to the internet!".
If implemented right this could be fantastic, especially for devices like Chromebooks, but am I reading it right that this API basically brings hardware drivers to JavaScript?
Pretending that USB devices aren't already a target, that they're safe, etc., won't make the net safe.<p>Being able to interface USB with the net is a natural step in the internet of things.
It wouldn't surprise me if someone were to use this to effectively share a licensing dongle.<p>Although I'd imagine latency might be an issue.
Nothing says "I love you" more than malware persistently embedded in your USB devices.<p>Here's how hardware teams work (in my experience)<p>----<p>Manager: "Hmmm... we need firmware. Hey, who can update the USB stack for the next version of our USB toaster?"<p>Team: <i>shrug</i><p>Manager: "Figby! Don't you do Arduino stuff on the weekends? You do it. Be done next Friday."<p>Figby: "Meep?"<p>USB is hard enough to get working properly, now you need to add "resistance to attack from the host" to the feature set. Figby is in trouble because he's being crushed under four rocks now:<p>1. USB is hard to get working. The standard is pretty complicated, and speaking frankly here, the device class standards are pretty badly fucked up. Camel-by-committee class bad. You spend a lot of time getting edge cases to work. (Don't get me started on DFU, OMFG).<p>2. Figby probably has a bunch of legacy to deal with. If the USB stack he's working with started from a fine commercial framework, he's better off. But he's likely working with something horrible hatched by a former semi-hardware guy a while ago, who was fired because six months into the prior version of the project, management finally realized he could barely spell 'C'. Oh, and the code makes liberal use of #define, and the source doesn't use curly braces, but DO..OD and WHILE..ELIHW, and there is deep, <i>deep</i> misunderstanding of what 'volatile' actually does.<p>3. Figby also has to work with the host driver team. If they are not actively hiding from the hardware team they are either on the wrong side of the planet, or have absolutely no time to devote to USB issues. The new driver will arrive about three days after Figby's "golden master" date.<p>4. Figby has no time. Everything is feature work. If there are security bugs in the code (... and there <i>are</i>) they have been ignored or deferred as "won't fix" for several product versions.
If there are security reviews, they are cursory check-off meetings where people who don't know anything about security make collective shrugs around bugs that are embedded in the product at the level of DNA. The code is swiss-cheese and would take months to make a dent in. Besides, this is a hardware company; isn't the OS supposed to keep us safe?<p>... now this WebUSB thing lands on Figby's plate. "Marketing really really wants this feature so they can add another checkbox to the package." What are the chances that Figby's going to fix security issues in the product before just turning the thing on for the whole internet to see? Because Friday.<p>Figby adds command handlers and descriptors for WebUSB. He'll get to the security stuff later, when the rest of everything else works.<p>Yeah.<p>This is a horrible idea.
Promoting the browser to the OS level!? Read-only access for private-key loading may be useful, but beyond that this seems dangerous. Use the standard download-and-save method.
Yet another browser aperture that will require the usual multitude of post-implementation band-aid security fixes.<p>Sigh.<p>We get it.<p>We could make the browser the OS.<p>But the question is: should we? Not: could we?<p>I actually miss the days when the browser was an application to view and consume content, with a modicum of scripting to progressively enhance the experience.
<i>A device announces support for the WebUSB command set</i><p>Wat.<p><i>This document describes an API for direct access to Universal Serial Bus devices from web pages.</i><p>This isn't what any of this is about. Expose USB devices to the browser or not, <i>do not</i> add rogue standards for USB devices to comply with!