Nvidia is using a last-gen encoder in its GTX 1650 to keep the price down

GTX 1650 lacks Turing NVENC

In an odd play, Nvidia has omitted Turing's NVENC from its shiny new budget 16-series offering, the GTX 1650, which is otherwise built around the Turing microarchitecture. Each of Nvidia's graphics card architectures, on which many of the best graphics cards are based, going all the way back to Kepler, has included a new encoder block. So seeing the older Volta NVENC instead of the new Turing NVENC in the 1650 comes as a bit of a surprise, particularly when you consider that both of the other cards in the 16-series (the 1660 and 1660 Ti) include the newer version.

NVENC is shorthand for Nvidia encoder, the company's proprietary hardware block for offloading video encoding work, freeing up processing power for everything else. The gap in encoding performance between the Volta and Turing versions is somewhere around 15 percent—not a massive gulf, but definitely a consideration for streamers or content producers waffling between the 1650 and one of its big brothers in the Turing-architecture-minus-RT-and-Tensor-Cores family.
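For anyone weighing that encoder gap in practice, NVENC is typically exposed through tools like FFmpeg. The sketch below is a minimal example, assuming an FFmpeg build compiled with NVENC support and an Nvidia GPU with a current driver installed; the file names are placeholders:

```shell
# Offload H.264 encoding to the GPU's NVENC block instead of the CPU.
# Assumes ffmpeg was built with NVENC support and an Nvidia driver is present.
ffmpeg -i gameplay.mkv -c:v h264_nvenc -b:v 6M -c:a copy stream.mp4

# Check which NVENC encoders this ffmpeg build exposes (e.g. h264_nvenc, hevc_nvenc).
ffmpeg -hide_banner -encoders | grep nvenc
```

Whether a stream lands on the Volta or Turing encoder depends entirely on the card's silicon; the command line is the same either way.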

Nvidia hasn't officially indicated why it opted for this retrograde inclusion, but a couple of possible explanations spring to mind. The most obvious is that it's a cost-saving effort; in the budget space, shaving even a few dollars off production costs can lead to significant profit on the back end. That could mean either a higher profit margin per unit for Nvidia or a way to compress retail prices to compete with AMD's budget GPUs and boost sales volume. With its RTX offerings underperforming and its stock slumping even as chip stocks surge, it's not a stretch to imagine the company is trying to reduce costs across every possible channel.

The other possibility is that Nvidia is attempting to reduce the transistor count on the 1650 to decrease power draw. As our friends at Tom's Hardware reported, the 1650 is "an efficient graphics card well-suited to environments sensitive to power consumption." While only a fraction of that efficiency can be attributed directly to swapping out the NVENC block, this is another category where Nvidia has proven itself beholden to the "every little bit counts" philosophy of design.

Alan Bradley
Alan's been a journalist for over a decade, covering news, games, and hardware. He loves new technology, Formula 1 race cars, and the glitter of C-beams in the dark near the Tannhäuser Gate. Find him @chapelzero on Twitter for lengthy conversation about CRPGs of the early 90s and to debate the merits of the serial comma.