The original title of this submission was 'Linus to Nvidia: "Fuck You"', which was renamed by someone (the moderators? can the submitter change the title?) after a ton of people had upvoted and commented on it, and despite the link pointing not to the entire video but to a specific moment in it, for which the title of the entire video is probably not even an appropriate description.<p>So, when people read the comments on this submission in the future, please keep this in mind as a historical note. (This, humorously, is exactly the kind of situation that caused the complaint[1] that itself turned into a massive hullabaloo recently regarding what can be discussed on HN and what the policies regarding hell-banning are; to view the reference you will need showdead.)<p>[1]: <a href="https://news.ycombinator.com/item?id=4102013" rel="nofollow">https://news.ycombinator.com/item?id=4102013</a>
TLDR: At a Q&A, a woman asks Linus why Nvidia still isn't providing any support for Optimus on Linux. Linus responds that Nvidia is being really difficult about it without good reason, says "Fuck you, Nvidia!", and flips off the camera.<p>Her question actually starts a minute earlier than the link:<p><a href="http://www.youtube.com/watch?v=MShbP3OpASA&t=48m14s" rel="nofollow">http://www.youtube.com/watch?v=MShbP3OpASA&t=48m14s</a><p>PS - Bravo, Linus. This issue is a real PITA, and a bit incongruous considering the historically awesome driver support Nvidia has provided for Linux.
I'm behind the times here, but it took me a second to realize this was really Linus in the video. I still think of him as looking like the now-10-year-old photo gracing his Wikipedia article (<a href="http://commons.wikimedia.org/wiki/File:Linus_Torvalds.jpeg" rel="nofollow">http://commons.wikimedia.org/wiki/File:Linus_Torvalds.jpeg</a>), but he looks a <i>lot</i> older here, possibly accentuated by the business attire.
If you skip ahead to 1:00:30, a guy in the audience who works at Nvidia responds during the Q&A, ever so politely. Linus responds quite politely as well.<p><a href="http://www.youtube.com/watch?v=MShbP3OpASA&feature=youtu.be&t=60m30s" rel="nofollow">http://www.youtube.com/watch?v=MShbP3OpASA&feature=youtu...</a>
Aside from the FU, it's a nice and interesting talk with an engaging Q&A that spans decision making, being blunt by choice, micro-optimization, licensing, commercial interests, limited success on the desktop....
...anybody else pick up on the line where he basically says that WEB PROGRAMMING ISN'T [REAL] PROGRAMMING? I wonder what a community like HN, with so many people developing web apps, thinks about this... :)<p>Quote (~min 11:30): "I have never in my life done any web programming because I'm not interested. I think that kind of stuff... there's MIS people to do that for you, right? I'm interested in programming"<p>[edited some spelling bugs]
There is a bunch of hardware in the world that is inaccessible. I don't see this changing.<p>Reasons include: Keeping competitors away from what you think are valuable secrets and maintaining an advantage. Keeping people away from features that, if misused, could result in chip damage. Keeping security holes secret (e.g., badly designed DMA hardware that could be exploited if the flaws were known). Limiting access to known buggy features, or unfinished features that either don't work or that could leak damaging hints about strategic direction. You have purchased or licensed 3rd-party technology whose details you contractually cannot divulge. For interoperability with other products, you have embedded knowledge of them in the product under NDA.<p>More: It's expensive to document chips to the point that outside development can be done. Perhaps the documentation doesn't exist at all and would have to be reverse-engineered out of the chip design (yes, this happens). It's expensive to write drivers for multiple platforms, or even to get software into a state where it can be consumed by an outside party (just dumping a tree onto GitHub is /not/ a release). You feel that "forking" would result in a loss of control of your own product (and would dramatically increase the cost of future releases, lest you break things). You regularly rev chips and cover the changes transparently in the software layer, and this would /not/ be transparent if you released product details (thus increasing the cost of revisions).<p>More (the slimy side): You have misappropriated technology and divulging it would be harmful to you. There are design errors or bugs verging on malfeasance that could expose you to litigation. You have lied about the product's capabilities and a release would reveal this (whereupon, litigation).<p>Or, it's a pain in the ass, the market is significantly less than 1 percent of your total, and you have a horizontal skyscraper of engineers already behind schedule.
"Good faith and being nice" doesn't pay the bills.<p>[I have also heard, from other parts of the industry, that the company in question is hard to deal with].
Let the memes begin! Here's an image template of Linus giving the finger: <a href="https://skitch.com/cpinto/ebemj/linus" rel="nofollow">https://skitch.com/cpinto/ebemj/linus</a>
We desperately need open-source chip cores. An open community would build them with open-source EDA tools, and the final tapeout would be physically produced by a foundry. I know open-source EDA tools are still decades behind proprietary ones, but we won't get anywhere if we don't start somewhere.
A few observations:<p>1. One of the highly distinctive characteristics of being a Free Software project leader is having the freedom to speak your mind. What Linus does (hacking the Linux kernel) and who pays him to do it (presently the Linux Foundation) are pretty loosely linked. The primary objective of LF is to fund Linux development, and Linus is pretty much the guy to get that done. If LF didn't pay him for it, someone else would. He can state his opinions on relevant technical matters with few if any fears of repercussions. I'm looking forward to next week's press releases from Nvidia.<p>2. Linus addresses what Nvidia are doing wrong at a few points, both directly and indirectly.<p>Around 15 minutes in he talks about what Free Software provides in the way of <i>developer</i> freedoms: you can focus on what <i>you</i> are interested in and what <i>you</i> are good at. In Linus's case, issues such as maintaining Linux-related websites, init, QA, and Linux distributions are things he fundamentally doesn't care about (while other bits, such as eventually creating a useful revision control system, he does). Free Software lets you focus on your own core competencies.<p>He also makes the point, around 35 minutes in, that it's very important that people know how he feels about things. Including how he feels about support received from hardware vendors.<p>More specifically, for hardware manufacturers, playing nice and closely with the kernel development community leads to both better product performance and better customer relations. The woman asking the Nvidia question clearly wasn't happy with her Nvidia experience. I've learned, in assessing hardware compatibility, to treat any Nvidia componentry as at best a red flag if not a show-stopper. I'll actively go out of my way to avoid their products (Intel have gone out of their way to ensure compatibility and open specs; my most recent purchases centered on Intel chipsets, in particular for graphics).
Playing well with devs also means that issues are addressed in a timely manner, compromises can be reached, and in general communications are open and positive. I don't know the full backstory on the Nvidia front (though searching the LKML mailing list should turn up some bits).<p>3. ... and yes, the HN moderators fubared this one.
The title you gave it doesn't make much sense considering what this post/part of the speech and the following discussion are really about - Nvidia's lack of support for Linux and Linus' reaction.<p>You should have titled it "Linus Torvalds Angry at Nvidia, Flips the Finger at Speech" or something.<p>That would have been more descriptive. Just a heads up for next time! ;)
<a href="http://youtu.be/MShbP3OpASA?t=20m50s" rel="nofollow">http://youtu.be/MShbP3OpASA?t=20m50s</a>
"If you think like a computer, writing C actually makes sense."
At one point in the video he says he's proud that Linux is the only OS (I'm presuming he meant kernel) that is in mobile and desktop/server systems. Correct me if I'm wrong, but I was under the impression that OSX and iOS are both based on essentially the same Darwin/XNU kernel?
This was a really interesting talk. Does anyone know where to find other lectures (or whatever this is technically called) by other influential programmers/computer scientists?
Optimus has to be one of the worst products nVidia has pushed. There's the Linux incompatibility, and there's the little problem where you are always tied to the vastly inferior integrated graphics. Games that I should be able to play at 60+ FPS with highest settings and resolution play at 20-30 FPS because I can't use the graphics card to the fullest. Then I need to do obscure hacks to get some games to even recognize the nVidia graphics card because the game developer thought it would be a great idea to wait until you downloaded and installed a 7GB game and launched it to tell you, "Oh, you're one of those Optimusers, well fuck you." Just google "sonic generations optimus" and you'll see what I mean.