I wonder when the PC world will transition to unified memory and more monolithic architectures as seen in the Apple M-series chips, but also in the NVIDIA GB200 "superchip", the AMD MI300 accelerator, the Xbox and PlayStation consoles, and even in some mobile phone system-on-a-chip designs. It feels like the PC is the "last holdout" of discrete upgradeable components with relatively low bandwidth between them.<p>Sooner or later, AI will need to run on the edge, and that'll require RAM bandwidths measured in multiple terabytes per second, as well as "tensor" compute integrated closely with CPUs.<p>Sure, a lot of people see LLMs as "useless toys" or "overhyped" now, but people said that about the Internet too. What it took to make everything revolve around the Internet, instead of it being just a fad, was <i>broadband</i>. Once everyone had fast always-on Internet at home <i>and in their mobile devices</i>, nobody could argue that the Internet wasn't useful. Build it, and the products will come!<p>If every gaming PC had the same spec as a GB200 or MI300, then games could do real-time voice interaction with "intelligent" NPCs with low latency. You could <i>talk</i> to characters, and they could talk back. Not just talk, but argue, haggle, and debate!<p><i>"No, no, no, the dragon is too powerful! ... I don't care if your sword is a unique artefact, your arm is weak!"</i><p>I feel like this is the same kind of step-change as floppy drives to hard drives, or dialup to fibre. It'll take time. People will argue that "you don't need it" or "it's for enterprise use, not for consumers", but I have faster Internet going to my apartment than my entire <i>continent</i> had 30 years ago.