People have been wishcasting the end of dedicated GPUs for like 20 years now.<p>It's nice that they're slowly improving, that every Apple Silicon Mac has a half-decent integrated GPU, that ML upscaling is pretty good if you want "4K". But they really don't come close to an NVIDIA card at 200-400W.
Hm, so in the long run, instead of a motherboard you'd get a desktop that's one big integrated brick, with at most an external PSU and screen? Or the other extreme, like my desktop, where the Nvidia card isn't so much a card as a computer inside the computer, with its own GPU, memory, etc., and its own nvtop listing processes and resource usage the way [hb]top does for "the larger computer"?<p>Such "migrations" are a classic: we merge some components into a "package", then split a "package" back into discrete components, and so on. Personally I'm more interested in easy repair and custom component selection than in crappy glued-together stuff held by plastic clips, with no standards, that forces you to scrap the whole car over a punctured tyre.<p>BTW, most systems assembled today have a uselessly powerful CPU, too little RAM, and a poor storage choice, so they last 3-4 years in comfort instead of 8-10. That's one of the biggest issues for us.
Maybe they'll be replaced by GPGPUs, but I doubt they're going extinct. Upscaling is not a replacement for native 4K rendering; it's a crutch to cover for slowing advances.
> Sound cards and network adaptors were an integral part of custom PC builds for years, but those eventually got swept away as motherboards improved and started to integrate those features.<p>If audio and network technology had needed to keep up with demand the way GPUs do, and had been able to, you'd see the same results. You need both the demand and the potential to be there at scale in order to drive that kind of advancement.<p>If we get to a world where high-fidelity graphics demand is a niche, similar to audio, then I could see this argument having merit. I don't expect that will happen in any way we could reliably predict in 2024.
Okay, so GPUs will go away, replaced by AI, which... runs on GPUs? Even if the target application changes from games to AI, we still need massively parallel processors (aka GPUs) and serial processors (aka CPUs) if we want to keep growing performance.
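To make that concrete, here's a toy sketch (my own illustrative example, sizes made up): a neural-net layer is just a big matrix multiply, i.e. a huge pile of independent multiply-adds, which is exactly the shape of work wide parallel hardware exists for.

```python
# Toy comparison (illustrative only): the same layer computed element-at-a-time
# (the shape of work a single scalar core does) vs. as one big matmul (the form
# that maps onto thousands of GPU lanes, or at least a multithreaded BLAS here).
import time
import numpy as np

batch, d_in, d_out = 64, 512, 512                       # made-up layer sizes
x = np.random.randn(batch, d_in).astype(np.float32)
w = np.random.randn(d_in, d_out).astype(np.float32)

def layer_serial(x, w):
    """Compute one output element at a time."""
    out = np.empty((x.shape[0], w.shape[1]), dtype=np.float32)
    for i in range(x.shape[0]):
        for j in range(w.shape[1]):
            out[i, j] = x[i] @ w[:, j]
    return out

def layer_parallel(x, w):
    """Same math, expressed as a single matrix multiply."""
    return x @ w

t0 = time.perf_counter(); a = layer_serial(x, w);   t1 = time.perf_counter()
t2 = time.perf_counter(); b = layer_parallel(x, w); t3 = time.perf_counter()
assert np.allclose(a, b, atol=1e-3)
print(f"element-at-a-time: {t1 - t0:.4f}s, single matmul: {t3 - t2:.4f}s")
```

The matmul form wins hugely even on a CPU (partly interpreter overhead, admittedly), and the gap only grows on a GPU; that's the whole reason the same wide-parallel silicon serves both graphics and AI.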
The APU offerings from both AMD and Intel have been improving pretty rapidly recently, but they're still pretty low-end by dGPU standards. I can certainly see them causing the death of dGPUs in laptops, but it's difficult to imagine a scenario where they're competitive with mid- to high-range dGPUs in desktops. I can't see either AMD or Intel trying to cram an iGPU that large onto one of their chips; it would be an extremely niche product and would be badly bandwidth-starved.
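To put rough numbers on "bandwidth starved", a back-of-envelope sketch; the dual-channel DDR5-6000 and 192-bit 21 Gbps GDDR6X figures are my own assumptions for a typical desktop platform and a mid-range card (RTX 4070-ish), not from the article:

```python
# Back-of-envelope peak memory bandwidth: a hypothetical big iGPU on a desktop
# socket (shared DDR5) vs. a mid-range dGPU with its own GDDR6X.

def ddr_bandwidth_gbs(mt_per_s, channels, bus_bits=64):
    """Peak DRAM bandwidth in GB/s: transfers/s * channels * bytes per transfer."""
    return mt_per_s * 1e6 * channels * (bus_bits / 8) / 1e9

def gddr_bandwidth_gbs(gbps_per_pin, bus_bits):
    """Peak GDDR bandwidth in GB/s from per-pin data rate and total bus width."""
    return gbps_per_pin * bus_bits / 8

# Dual-channel DDR5-6000 on a typical desktop board -- and the CPU cores
# share this with the iGPU.
print(f"iGPU (shared DDR5-6000 x2):        ~{ddr_bandwidth_gbs(6000, 2):.0f} GB/s")

# A mid-range card: 192-bit GDDR6X at ~21 Gbps per pin, dedicated to the GPU.
print(f"dGPU (192-bit GDDR6X @ 21 Gbps):   ~{gddr_bandwidth_gbs(21, 192):.0f} GB/s")
```

Roughly ~96 GB/s shared with the CPU cores versus ~500 GB/s dedicated to the GPU: a desktop-class iGPU hanging off standard DIMM slots would spend most of its time waiting on memory.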
Graphics cards are already dead for me (as a casual gamer). I recently bought a used notebook with a 12th-gen Intel Core processor and it can run an impressive list of games.
I've been calling it[1] for a while now; this is history rhyming with itself for anyone paying attention to computing history.<p>As unsettling as video cards going the way of the dodo might be, it will definitely be cheaper for the general consumer if integrated audio and NICs are any indication.<p>[1]: <a href="https://news.ycombinator.com/item?id=40236186">https://news.ycombinator.com/item?id=40236186</a>
"AI" upscaling. Are we entering a new era where all games look kinda the same because of the default "AI" settings?<p>Like when Unity was new and I guess everyone used the same example shaders to build on...
We have a whole subheader on how AI is supposedly going to kill dGPUs, but the paragraph explains nearly nothing as to why. It doesn't even begin to contemplate the idea that if AI does continue to grow in ubiquity, there's even more demand for compute that handles that sort of workload efficiently, and we know GPUs serve that purpose better than CPUs in the vast majority of cases. It even mentions that AI is being used for graphics purposes in gaming. It ignores all of this to suppose that the distant second in dGPU sales not releasing an already niche card on some arbitrary schedule is indicative of the market as a whole.<p>It mentions sound cards, but a lot of what killed sound cards is that the only people buying them after integrated audio became a thing were audio enthusiasts, and audio enthusiasts have moved nearly universally to external DACs via USB. They're just as specific a product, serve the same purpose, and exist across the whole price range that sound cards did and far beyond, for people who want to light that much money on fire. I'd wager the DAC market is far larger today than the sound card market was even before integrated audio became a thing. Things may have changed so it's no longer something you're slotting in as an expansion card, but it's still an additional purchase for a premium purpose. I don't see dGPUs going anywhere for the exact same reason, barring physics-defying technological advances.
Well, it seems to me that games from 10 years ago run on today's integrated GPUs just fine. But that's not the case with TODAY's games, which barely run on a dedicated GPU from a few years ago.<p>So until game developers completely cease to target the powerful dedicated GPUs of today and willingly limit themselves to technology 10 years in the past, we won't see the death of the dedicated GPU.<p>The only place where that could happen would be Soviet Russia, by decree of the central committee of the party. In a competition-based capitalist society, it will happen... never. Someone will always use the latest and greatest GPU, and if you don't do it too, your company falls behind. Hence: not a chance.
> Sound cards and network adaptors were an integral part of custom PC builds for years, but those eventually got swept away as motherboards improved and started to integrate those features.<p>Yeah, for better or worse.