I'm very confused by Nvidia's numbering scheme. It used to be generation-model, so my 1080 is generation 10, and "80" is an arbitrary model number where higher is better. It's the logical successor to the 9-80.

The 20-x cards I suppose are OK, a big jump to signify a big change in architecture.

But now we have... 16-60? Why 16? Is this the successor to the 1060? And it's a "Ti", but there isn't a non-Ti 1660?

I'm confused.
The NVENC hardware encoder in Turing is actually comparable in quality to x264 at the fast/veryfast presets. This card will be quite interesting to Twitch streamers on a limited budget, since it opens up the possibility of streaming with essentially no CPU impact.
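If you want to check that claim on your own footage, here's a minimal Python sketch that runs the same clip through NVENC and through x264 veryfast via ffmpeg, so you can compare quality and CPU load side by side. It assumes an ffmpeg build with NVENC support; the input filename and the bitrate are placeholders.

```python
import subprocess

# Placeholder source clip -- substitute your own recording.
SOURCE = "gameplay.mkv"

# Encode the same clip twice: once on the GPU (h264_nvenc), once on the CPU
# (libx264 at the veryfast preset), at the same target bitrate.
encoders = {
    "out_nvenc.mp4": ["-c:v", "h264_nvenc", "-b:v", "6000k"],                 # Turing NVENC
    "out_x264.mp4":  ["-c:v", "libx264", "-preset", "veryfast", "-b:v", "6000k"],  # CPU reference
}

for out_name, video_args in encoders.items():
    cmd = ["ffmpeg", "-y", "-i", SOURCE, *video_args, "-c:a", "copy", out_name]
    subprocess.run(cmd, check=True)
```

Watching CPU usage during the libx264 run versus the NVENC run makes the "no CPU impact" point pretty concrete, and you can eyeball the two outputs for quality.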
The question is: with prices returning to sanity, is a 1660 or a 2060 the better bang for your buck? It seems like about a $100 difference, but the 2060 still seems like an amazing value today.
Why do they benchmark at 1080p? Can't every card under the sun run games at 60 fps at such a low resolution?

And no 4K? Wtf? Shouldn't that be the standard by now?
Looks like 1440p gaming on a reasonable budget is finally here. Now we just need some good (non-curved) Nvidia-approved adaptive-sync monitors at 32" 1440p IPS.
For the price, I'm kind of skeptical that it will outperform a previous-generation GeForce 1060, which can be purchased for $259 to $299 in various places.

I just got a 1060 that has 3x DisplayPort 1.4 outputs + 1x HDMI 2.0 output; it can drive four 4K displays at 60 Hz.

The 1060 is probably somewhat more power-hungry and runs hotter under load.