Oh dear, what were they thinking with that name? First they released the Maxwell-based *GTX Titan X*, then replaced it with the Pascal-based *Nvidia Titan X*, which nearly everyone called the "Titan XP" to disambiguate the confusingly similar names, and now Nvidia goes and uses that universally accepted nickname as the actual product name for a different product.
Damn, the 2016 Titan X was confusing because the 2015 Titan X had the same name, so people nicknamed the 2016 one the "Titan XP", and they fucking went ahead and one-upped the confusion by making an actual Titan Xp.

Bravo, Nvidia, lol.

I had to check whether my Hacker News app had actually refreshed the post list since April Fools'.
For comparison, the 1080 Ti is ~11.3 TFLOPS + 11 GB RAM @ $700 vs. the Titan Xp at ~12.1 TFLOPS + 12 GB RAM @ $1200: roughly 7% more compute and 9% more memory for ~70% more money.

https://en.wikipedia.org/wiki/GeForce_10_series#GeForce_10_.2810xx.29_series
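A quick back-of-the-envelope check of that math, sketched in Python (the TFLOPS, VRAM, and prices are just the launch figures quoted above):

    # Rough perf-per-dollar comparison using the figures quoted in this thread
    cards = {
        "GTX 1080 Ti": {"tflops": 11.3, "vram_gb": 11, "price_usd": 700},
        "Titan Xp":    {"tflops": 12.1, "vram_gb": 12, "price_usd": 1200},
    }

    for name, c in cards.items():
        print(f"{name}: {c['tflops'] / c['price_usd'] * 1000:.1f} GFLOPS per dollar")

    ti, xp = cards["GTX 1080 Ti"], cards["Titan Xp"]
    print(f"compute: +{(xp['tflops'] / ti['tflops'] - 1) * 100:.0f}%")        # ~ +7%
    print(f"memory:  +{(xp['vram_gb'] / ti['vram_gb'] - 1) * 100:.0f}%")      # ~ +9%
    print(f"price:   +{(xp['price_usd'] / ti['price_usd'] - 1) * 100:.0f}%")  # ~ +71%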
> Currently Mac users are limited to Maxwell GPUs from the company's 9-series cards, but next week we'll be able to finally experience Pascal, albeit a $1200 Pascal model, on the Mac.

> We have reached out to Nvidia for a statement about compatibility down the line with lesser 10-series cards, and I'm happy to report that Nvidia states that all Pascal-based GPUs will be Mac-enabled via upcoming drivers. This means that you will be able to use a GTX 1080, for instance, on a Mac system via an eGPU setup, or with a Hackintosh build.

https://9to5mac.com/2017/04/06/nvidia-titan-xp-beta-pascal-drivers-mac/
Is there anything in the Titan Xp I would benefit from for ML/DL/AI compared to the 1080 Ti (other than the extra 1 GB)? I am considering getting an 8-core Ryzen with 1-2x 1080 Ti and am wondering whether the Titan Xp has something that would render the 1080 Ti obsolete for training models.
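For what it's worth, in most frameworks the only practical difference you'd notice is the VRAM ceiling (max batch size / model size). A minimal sketch, assuming PyTorch with CUDA support is installed, to check what the framework actually sees on a given card:

    # Minimal sketch: report the VRAM the framework sees (assumes PyTorch + CUDA)
    import torch

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB VRAM")
    else:
        print("No CUDA device visible")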
Regarding the new Nvidia-provided Mac driver: does this have any influence on Vulkan or modern OpenGL support? Or would that require changes in macOS itself (which presumably will never happen)?
A few more details: http://www.tomshardware.com/news/nvidia-titan-xp-graphics-card-gp102,34079.html
Not sure why anyone would spend double the money when you can get about the same performance from a GTX 1080 Ti. The gain looks marginal IMO; optimizing the code on a GTX 1080 Ti (CUDA and/or shader assembly) would probably yield very satisfactory results and definitely better perf/buck.
Why not more memory? Especially with the 1080 Ti at 11 GB and half the price, it would've made sense to push this to at least 16 GB, or even 24 GB, to distinguish it.
Is there any point (for games) in having that much memory when you can really only address about 9 GB per frame at 60 Hz (the Titan Xp's bandwidth is ~550 GB/s)?

I mean, it's certainly better than the Titan X (Maxwell), which could only address less than half its memory while running at 60 Hz.

It just seems like an effort to inflate the price of the product without adding much value.
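That per-frame figure follows directly from the bandwidth; here's a minimal sketch of the arithmetic (using the ~550 GB/s quoted above for the Titan Xp, and assuming the commonly cited ~336.5 GB/s for the Maxwell Titan X):

    # How much memory can be touched per frame at 60 Hz, given peak bandwidth
    fps = 60
    bandwidth_gb_s = {
        "Titan Xp": 550.0,            # figure quoted above (spec is ~547.7 GB/s)
        "Titan X (Maxwell)": 336.5,   # commonly cited peak bandwidth
    }

    for name, bw in bandwidth_gb_s.items():
        print(f"{name}: {bw / fps:.1f} GB per frame at {fps} Hz")
    # Titan Xp: ~9.2 GB/frame; Maxwell Titan X: ~5.6 GB/frame (less than half of its 12 GB)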