> The 4.5-mm-square chip, developed using Korean tech giant Samsung Electronics Co.'s 28 nanometer process, has 625 times less power consumption compared with global AI chip giant Nvidia's A-100 GPU, which requires 250 watts of power to process LLMs, the ministry explained.

> processes GPT-2 with an ultra-low power consumption of 400 milliwatts and a high speed of 0.4 seconds

Not sure what the point of comparing the two is; an A100 will get you a lot more speed than 2.5 tokens/sec. GPT-2 is just a 1.5B-parameter model, and a Pi 4 would get you more tokens per second with CPU-only inference.

Still, I'm sure there are improvements to be made, and the direction is fantastic to see, especially after Coral TPUs have proven completely useless for LLM and Whisper acceleration. Hopefully it ends up as something vaguely affordable.
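A quick back-of-the-envelope check of the quoted figures, assuming the "0.4 seconds" is time per token (that's the reading that gives 2.5 tokens/sec):

```python
# Figures quoted from the article
a100_power_w = 250.0   # Nvidia A100 power draw cited
chip_power_w = 0.400   # 400 mW claimed for the new chip
sec_per_token = 0.4    # assumption: "0.4 seconds" means per token

power_ratio = a100_power_w / chip_power_w
tokens_per_sec = 1.0 / sec_per_token

print(power_ratio)     # -> 625.0, matching the "625 times less" claim
print(tokens_per_sec)  # -> 2.5 tokens/sec
```

So the headline power ratio checks out arithmetically, but it says nothing about throughput, which is why the comparison is misleading on its own.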