Qualcomm, a leading player in the semiconductor industry, has unveiled its latest PC chip, the Snapdragon X Elite processor, aiming to compete head-on with industry giants AMD and Intel. While much was initially unknown about the integrated GPU, recent tests have shed light on its gaming capabilities.
At a closed-door event, Qualcomm provided two laptops equipped with the X Elite chip for limited testing. One featured a 15.6-inch 4K display with an 80W power limit on the chip, while the other housed a 14.5-inch 2880 x 1800 screen with a CPU power limit of just 23W. Reviewers, including the tech YouTube channel Geekerwan, provided valuable insights into the chip's gaming performance.
Though details about the GPU remain somewhat elusive, Qualcomm has cited a figure of "up to 4.6 TFLOPS" to highlight its shader throughput. Assuming that figure refers to FP32 operations without dual-issue optimizations, it places the GPU roughly in the realm of a GeForce GTX 1650 Super.
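To see why that comparison holds, theoretical FP32 throughput is conventionally computed as two floating-point operations (a fused multiply-add) per shader per clock. A quick sketch using the GTX 1650 Super's published specs (1,280 CUDA cores, roughly 1.725 GHz boost):

```python
# Theoretical FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000  # GFLOPS -> TFLOPS

# GTX 1650 Super: 1280 CUDA cores at ~1.725 GHz boost clock.
print(round(fp32_tflops(1280, 1.725), 2))  # ~4.42 TFLOPS
```

That lands just under Qualcomm's "up to 4.6 TFLOPS" claim, which is why the GTX 1650 Super is a reasonable point of reference, at least on paper.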
In addition to processing power, a robust cache system and ample global memory bandwidth are critical. While information regarding the former is still pending, both test laptops utilized LPDDR5X-8533, a high-speed, low-power RAM, ensuring strong support for the GPU.
Geekerwan conducted a comparative test, pitting the 80W X Elite against a Ryzen 7 7840HS CPU coupled with a Radeon 780M GPU, using the 3DMark Wild Life Extreme benchmark. Unlike some 3DMark tests, Wild Life Extreme is cross-platform, running on Android, iOS, and Windows on Arm. It renders internally at 4K before scaling to the display, making it a reasonably demanding assessment.
The X Elite averaged 44.8 frames per second (fps), surpassing the AMD chip's 30.7 fps by an impressive 46%, though it's worth noting that the AMD processor operates under a significantly lower power limit. Running 3DMark Wild Life Extreme on both Qualcomm laptops showed the 23W version achieving around 39 fps, a notable result in its own right.
For comparison, a Core i7-9700K paired with a GeForce RTX 4070 Ti averaged 263 fps. That system is nearly seven times faster, but the 23W X Elite draws only a fraction of the combined power of the 9700K and 4070 Ti.
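The relative-performance figures quoted above follow directly from the frame rates. A minimal check of the arithmetic, using the averages reported in the tests:

```python
# Reported Wild Life Extreme averages (fps) from the article.
x_elite_80w, radeon_780m = 44.8, 30.7
x_elite_23w, rtx_4070ti = 39.0, 263.0

# 80W X Elite vs. Radeon 780M: percentage advantage.
print(round((x_elite_80w / radeon_780m - 1) * 100))  # 46 (% faster)

# RTX 4070 Ti system vs. 23W X Elite: speed ratio.
print(round(rtx_4070ti / x_elite_23w, 1))  # 6.7 (nearly seven times faster)
```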
Subsequently, Geekerwan conducted a brief test of Control, Remedy’s third-person psychological shooter, at 1080p with low-quality settings. The 23W laptop achieved frame rates averaging around the mid-40 fps range, occasionally exceeding 50 fps. This is particularly commendable for an integrated GPU, especially considering the additional load of x86 emulation required to run Control.
Games that have been ported to Windows on Arm are likely to perform significantly better compared to those requiring emulation, a factor worth considering for potential buyers of an X Elite-powered laptop in the coming year. It’s worth mentioning that, despite Control’s impressive ray tracing capabilities, the limited testing revealed that Qualcomm’s new GPU does not currently support it.
While future updates may address this limitation, whether through driver enhancements or improved x86 emulation for handling the DirectX RT API, the current performance of the X Elite is already a promising indicator. Performing on par with, and occasionally surpassing, a Radeon 780M is noteworthy, especially considering the latter's 12 RDNA 3 compute units. That GPU powers devices like the Asus ROG Ally and Ayaneo Air 1S gaming handhelds, as well as the integrated graphics in the latest Framework AMD mainboard.
Qualcomm’s new Snapdragon X Elite processor may well mark a significant advancement in handheld gaming PCs, and could potentially deliver gaming-capable thin and light laptops, especially given its impressive performance at 23W.
Although it’s still early days for this newcomer, much will hinge on the resources Qualcomm dedicates to driver and emulation development. Nevertheless, its entrance into the PC chip arena has undeniably injected fresh competition, challenging the established dominance of AMD and Intel.