Nvidia may announce new GPUs at CES, some of which we already know about, but according to a press release from Nvidia itself, the company plans to take the wraps off several affordable mobile options, including the MX550, the MX570 and one that caught us by surprise, the RTX 2050. These GPUs are slated to appear in laptops releasing sometime in the spring of 2022 (autumn 2022 for the southern hemisphere folks).
An RTX 2050? A Turing-generation GPU, we hear you ask? No, it isn't. Despite the name, the RTX 2050 isn't a Turing part as we'd expect, but an Ampere-generation product based on the GA107 GPU, the same silicon slated to appear in upcoming RTX 3050 cards. The chip crunch is leading to some rather odd products that otherwise wouldn't see the light of day.
Normally, we'd criticize Nvidia for renaming a three-year-old product and re-releasing it, but that's not what's happening here. Nvidia is actually doing the reverse: taking a newer-generation Ampere GPU and slapping RTX 20-series branding on it. It's a rather odd move, and why Nvidia isn't calling it the RTX 3040 is puzzling.
According to specifications received by Anandtech, the RTX 2050 will feature 2,048 CUDA cores and a boost clock of up to 1,477 MHz. The weak point is its memory configuration: 4GB of 14 Gbps GDDR6 over a narrow 64-bit bus, which points towards the 2050 being a significantly cut-down version of the GA107 GPU. Given that the 2050 carries RTX branding, users will expect some level of ray tracing performance, but even with DLSS, its ray tracing capabilities are likely to be poor. At least the 30-45W TDP looks good, suggesting the card won't require a powerful cooling solution.
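To see why that 64-bit bus is the bottleneck, the peak memory bandwidth implied by those specs is simple back-of-the-envelope arithmetic: the per-pin data rate times the bus width, divided by eight bits per byte. A quick sketch (the function name is ours, just for illustration):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps)
    multiplied by bus width (bits), divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 2050 as reported: 14 Gbps GDDR6 on a 64-bit bus
print(memory_bandwidth_gbs(14, 64))  # 112.0 GB/s
```

That works out to 112 GB/s, a modest figure for anything wearing RTX branding.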
By the time spring rolls around, we'll have next-generation integrated graphics to compare to the RTX 2050. Putting aside manufacturing agreements and the marketing value of RTX branding, Intel's 12th Gen mobile CPUs with Iris Xe graphics and AMD's Rembrandt APUs could leave entry-level discrete GPUs looking pretty average. We'll wait and see how these products perform in the real world before passing judgment.