Note that this happens even when using a BlueCoat proxy in non-MITM mode. BlueCoat tries to "analyze" TLS connections, and rejects anything it doesn't understand. This exact issue occurred with TLS 1.2 back when BlueCoat only understood 1.1/1.0.<p>In this case, it doesn't sound like they're reverting it because of overall breakage, but rather because it <i>breaks the tool that would otherwise be used to control TLS 1.3 trials and other configuration</i>. Firefox had a similar issue, where they temporarily used more conservative settings for their updater than for the browser itself, to ensure that people could always obtain updates that might improve the situation.
Amazing how this was predicted coming on a year ago*<p>> At this point it's worth recalling the Law of the Internet: blame attaches to the last thing that changed.<p>> There's a lesson in all this: have one joint and keep it well oiled.<p>> When we try to add a fourth (TLS 1.3) in the next year, we'll have to add back the workaround, no doubt. In summary, this extensibility mechanism hasn't worked well because it's rarely used and that lets bugs thrive.<p>* <a href="https://www.imperialviolet.org/2016/05/16/agility.html" rel="nofollow">https://www.imperialviolet.org/2016/05/16/agility.html</a>
This is even crazier than people may think at first glance.<p>The TLS community knew there would be problems deploying TLS 1.3 because of version intolerance, because there always have been. That's why version negotiation was changed and a mechanism called GREASE was invented to avoid just such problems. But it seems BlueCoat has shown us that there's no way to anticipate all the breakage introduced by stupid vendors.<p>The takeaway message is this: avoid BlueCoat products at all costs. Companies like this are harming the Internet and its progress.
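For context, GREASE works by sprinkling reserved code points into the ClientHello that a compliant peer must silently ignore; an intolerant middlebox that chokes on unknown values then fails immediately, so the bug surfaces early instead of on the next protocol version. A minimal sketch of the reserved-value pattern (sixteen values with both bytes equal and low nibble 0xA, per the GREASE spec):

```python
# GREASE reserves sixteen two-byte code points of the form 0xNANA
# (0x0A0A, 0x1A1A, ..., 0xFAFA) for cipher suites, extensions,
# named groups, and versions. Compliant peers must ignore them.

def grease_values():
    """Return the sixteen reserved GREASE code points."""
    return [(0x0A + 0x10 * i) << 8 | (0x0A + 0x10 * i) for i in range(16)]

values = grease_values()
print([f"0x{v:04x}" for v in values])  # 0x0a0a, 0x1a1a, ..., 0xfafa
```

A client that randomly advertises one of these in every handshake ensures that "unknown value" code paths on the server side are exercised constantly rather than only at upgrade time.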
The title was editorialized. TLS 1.3 is still a working draft, and Chromium is just running a field trial of it.<p>A few days ago there were other issues with this that caused Chromium to stop working on *.google.com, so it's not just about middleboxes.<p><a href="https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=855434" rel="nofollow">https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=855434</a><p><a href="https://bugs.chromium.org/p/chromium/issues/detail?id=693943" rel="nofollow">https://bugs.chromium.org/p/chromium/issues/detail?id=693943</a>
From <a href="https://www.bluecoat.com/products-and-solutions/ssl-visibility-appliance" rel="nofollow">https://www.bluecoat.com/products-and-solutions/ssl-visibili...</a><p>> <i>"Enterprise class Blue Coat’s SSL Visibility Appliance is comprehensive, extensible solution that assures high-security encryption. While other vendors only support a handful of cipher-standards, the SSL Visibility Appliance provides timely and complete standards support, with over 70 cipher suites and key exchanges offered, and growing. Furthermore, unlike competitive offerings, this solution does not “downgrade” cryptography levels and weaken your organization’s security posture, putting it at greater risk. As the SSL/TLS standards evolve, so will the management and enforcement capabilities of the SSL Visibility Appliance."</i>
It sounds like if you run a web server, you should consider supporting only TLS 1.3, with no fallback to earlier versions, to ensure your visitors can't be subjected to interception by a third party (even if it is their own enterprise).
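With Python's stdlib `ssl` module (OpenSSL 1.1.1+ required for TLS 1.3), pinning a server to 1.3 only is a two-line change; a sketch, with placeholder certificate paths:

```python
import ssl

# Build a server-side context that negotiates only TLS 1.3.
# Clients behind a middlebox that can't complete a 1.3 handshake
# will fail loudly instead of being silently downgraded.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_3
context.maximum_version = ssl.TLSVersion.TLSv1_3
# context.load_cert_chain("server.crt", "server.key")  # placeholder paths
```

The trade-off, of course, is that any visitor whose client (or whose network's middlebox) can't do TLS 1.3 simply can't connect.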
Many a head-scratching web-application error investigation has ended in an "a-ha" moment upon noticing the `X-BlueCoat-Via` header in the logs. BlueCoat does things like issuing GETs against URLs that only have POST handlers, and it replays these random requests with its users' auth cookies even after the real user has left the site.
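A quick way to confirm whether those mystery requests came through a BlueCoat box is to filter your logs for that header. A sketch, assuming a log format where request headers end up in the line (the sample log lines are made up):

```python
def bluecoat_requests(log_lines):
    """Yield log lines from requests that passed through a BlueCoat
    proxy, identified by the X-BlueCoat-Via header it injects."""
    for line in log_lines:
        if "x-bluecoat-via" in line.lower():
            yield line

# Hypothetical log lines for illustration.
logs = [
    'POST /login 200 ua="Mozilla/5.0"',
    'GET /login 405 ua="Mozilla/5.0" X-BlueCoat-Via="abc123"',  # proxy replay
]
hits = list(bluecoat_requests(logs))
print(hits)  # only the second line matches
```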
There is massive hypocrisy in browser vendors getting hysterical about self-signed certs while letting MITM proxies operate with impunity, or worse, working with them.<p>Why isn't there an effort to detect MITM proxies and post equally scary warnings? Surely users have a right to know.<p>MITM is worse than self-signed certs, and if 'exceptions' can be found for MITM (corporate security, management, etc.), then the same exceptions should be found for self-signed certs for individuals, rather than creating dependencies on CA 'authorities'. This is just another instance of furthering corporate interests while sacrificing individuals.
I kinda hoped that TLS 1.3 had some magic in it so that those MITM proxies would no longer work, because the browser could recognize them and say: how about no.<p>Also, weren't there some security issues relating to the possibility of downgrading the encryption of a connection?
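On the second point: yes, and TLS 1.3 adds a downgrade defense for it. A server that supports 1.3 but is forced to negotiate 1.2 or lower embeds a fixed sentinel in the last 8 bytes of ServerHello.random; a 1.3-capable client that sees the sentinel knows something stripped the version and aborts. A sketch of the client-side check (the sentinel bytes are from the TLS 1.3 spec; everything else is illustrative):

```python
# TLS 1.3 downgrade sentinels: the last 8 bytes of ServerHello.random.
DOWNGRADE_TLS12 = bytes.fromhex("444f574e47524401")  # "DOWNGRD" + 0x01
DOWNGRADE_TLS11 = bytes.fromhex("444f574e47524400")  # "DOWNGRD" + 0x00

def downgrade_detected(server_random: bytes) -> bool:
    """Return True if the 32-byte server random carries a downgrade sentinel."""
    return server_random[-8:] in (DOWNGRADE_TLS12, DOWNGRADE_TLS11)

honest = bytes(32)                        # no sentinel in the tail
forced = bytes(24) + DOWNGRADE_TLS12      # server was pushed down to 1.2
print(downgrade_detected(honest), downgrade_detected(forced))  # False True
```

Note this only defends against downgrade attacks; it does nothing against a MITM proxy whose root CA the client already trusts, since that proxy terminates TLS outright and speaks its own handshake.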
Wouldn't it be better to allow enterprises to do version pinning (which I believe used to be supported in chrome enterprise), rather than remove TLS 1.3 for everyone?
Browsers should add a button that allows being proxied, combined with a campaign to educate people on the difference.<p>I think it's reasonable for a company to want to filter everything that comes through its pipe; if anything, it's a bit of a liability not to. But at the same time, non-technical people should understand that their connection is being decrypted and re-encrypted, and be educated on the consequences.<p>There are a few local coffee shops that terminate SSL, and when people see me closing my browser and laptop, or starting to tether through my phone because of the cert error, they tell me "oh, you just need to accept all those certs!"
Edit: oops, my mistake. Carry on.<p>> <i>Have some god damn ethics</i><p>Personal attacks are not allowed on HN. We ban accounts that do this, so please don't do it.<p>We detached this subthread from <a href="https://news.ycombinator.com/item?id=13750650" rel="nofollow">https://news.ycombinator.com/item?id=13750650</a> and marked it off-topic.