I really like that somebody is paying attention to the user experience of these things instead of focusing on pointless spec wars. Those 3-second lags all over the place are subtly annoying every time. I'd much rather the manufacturers fix that than do more resolution bumps that I don't have the bandwidth to get or the screen size to notice, on the 3 shows that actually use it, 2 of which I'm not particularly interested in anyway.
Why does the article make a big deal about the Faraday cage but not explain *why* it's being used? Is the rest of the office too noisy, and they want to test TVs with a single wifi configuration?
What, if any, are the bad consequences of instant-on for TVs?

Do they have to sleep less deeply than current TVs, consuming more standby power?

Will they be emitting more RFI in standby? My several-year-old Samsung gives off quite a bit of junk in the 2m, 1.25m, and 70cm ham bands when on. Almost all of that goes away in standby. I'd be displeased if I got a new TV and it gave off that interference when "off" for the sake of turning on instantly instead of in 10-20 seconds.
Am I the only person who thinks televisions don't suck? This all seems to be about smart TV functionality, which strikes me as superfluous anyway. I would rather buy a TV based on stuff like picture quality and size and buy something to plug into the TV for media consumption.
Maybe this team could focus on the Netflix software itself on TVs. They recently removed the ability to view all movies in a genre and totally removed all sub-genres. The main reason I bought my TV was for Netflix, so now I don't have much use for it or their service.