As you may have seen, Fortnite runs at 60 fps on the latest Honor View 20 at a resolution of 1080 x 2310, with passive cooling and zero noise (the game also runs at a stable 30+ fps on last year's now-$300 phones).(1)

Meanwhile, on the latest Intel CPU (with its integrated, so-called HD Graphics GPU) at a resolution of 1280 x 720, the game runs at between 30 and 50 fps, with a considerable amount of noise and heat.(2)

So, my question is: how can a $600 phone ($300 in a year or so) outperform a $1,000+ laptop by that much? It just beats me...

Are Intel/AMD chips that bad? Are Huawei/Qualcomm chips that good?

1. https://www.soyacincau.com/2019/01/23/honor-view-20-fortnite-60-fps/
2. https://www.laptopmag.com/articles/play-fortnite-intel-hd-graphics
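As a sanity check on the raw numbers quoted above, here is a rough back-of-envelope comparison of pixel throughput (a sketch only; the 40 fps midpoint for the Intel iGPU is my assumption, taken from the quoted 30-50 fps range):

```python
# Pixel throughput implied by the resolutions and frame rates above.
phone_px_per_s = 1080 * 2310 * 60    # Honor View 20 at 60 fps
laptop_px_per_s = 1280 * 720 * 40    # Intel iGPU, assumed 40 fps midpoint

print(phone_px_per_s)                              # 149688000
print(laptop_px_per_s)                             # 36864000
print(round(phone_px_per_s / laptop_px_per_s, 1))  # 4.1
```

By these figures the phone pushes roughly four times as many pixels per second, which makes the gap even starker than the fps numbers alone suggest (though, as the replies note, the two versions are not rendering the same workload per pixel).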
From my understanding, the post-processing is completely different: the anti-aliasing used, the HDR pipeline, the shaders overall, etc. [1] goes into detail on what exactly might differ.

It's not that the mobile phone is in any way superior to the PC; one can compare TDPs and work out the difference in raw performance. It's more that the rendering techniques that give "diminishing returns" (in some people's opinion) are exponentially more resource-heavy: for example, lightmaps (almost free, even on mobile) vs. dynamic shadows (which can be insanely expensive).

[1] https://docs.unrealengine.com/en-us/Platforms/Mobile/Performance
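To make the lightmap-vs-dynamic-shadow point concrete, here is an illustrative per-pixel cost sketch (the numbers are assumptions for illustration, not measurements; 16-tap PCF is one common shadow-filtering choice):

```python
# Illustrative cost comparison: a baked lightmap costs roughly one extra
# texture fetch per shaded pixel, while a dynamic shadow needs an extra
# depth-only render pass of the scene PLUS several shadow-map samples
# per pixel (e.g. 16-tap percentage-closer filtering).
pixels = 1280 * 720                 # 720p frame

lightmap_fetches = pixels * 1       # one lightmap sample per pixel
pcf_taps = 16                       # assumed filter kernel size
shadow_fetches = pixels * pcf_taps  # filtering taps only; the depth
                                    # pre-pass adds vertex/raster work on top

print(shadow_fetches // lightmap_fetches)  # 16
```

So even before counting the extra geometry pass, the dynamic technique does an order of magnitude more memory traffic per pixel for a visual difference many players would call marginal.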
The game is using lower-resolution textures on the phone, and probably has a lot of things turned down beyond the settings available in the desktop version.

Edit: the version running on the phone is, basically, incredibly optimized.
The same reason the PlayStation and Xbox can provide better graphics than a monster PC despite having modest specs: when you are targeting a single model of CPU/GPU, you can do all sorts of optimizations.