Nvidia is using a last-gen encoder in its GTX 1650 to keep the price down
The 1650 is the odd man out in Nvidia's 16-series product stack.
In an odd play, Nvidia has omitted Turing's NVENC from its shiny new budget 16-series offering, the GTX 1650, which is otherwise built around the Turing microarchitecture. Each of Nvidia's graphics card architectures, on which many of the best graphics cards are based, going all the way back to Kepler, has included a new encoder block. So seeing the older Volta NVENC instead of the new Turing NVENC in the 1650 comes as a bit of a surprise, particularly when you consider that both of the other cards in the 16-series (the 1660 and 1660 Ti) include the newer version.
NVENC is short for Nvidia Encoder, the company's proprietary hardware block for offloading video encoding, freeing up processing power for other tasks. The gap in encoding performance between the Volta and Turing versions is somewhere around 15 percent. That's not a massive gulf, but it's definitely a consideration for streamers or content producers waffling between the 1650 and one of its big brothers in the Turing-architecture-minus-RT-and-Tensor-Cores family.
Nvidia hasn't officially said why it opted for this retrograde inclusion, but a couple of possible explanations spring to mind. The most obvious is cost savings; in the budget space, shaving even a few dollars off production costs can make a significant difference on the back end. That could mean either a higher profit margin per unit for Nvidia or room to compress retail prices to compete with AMD's budget GPUs and boost sales volume. With its RTX offerings underperforming and its stock slumping even as chip stocks surge, it's not a stretch to imagine the company is trying to cut costs through every possible channel.
The other possibility is that Nvidia is trimming the transistor count on the 1650 to decrease power draw. As our friends at Tom's Hardware reported, the 1650 is "an efficient graphics card well-suited to environments sensitive to power consumption." While only a fraction of that efficiency can be attributed directly to the swapped-out NVENC block, this is another category where Nvidia has proven itself beholden to the "every little bit counts" philosophy of design.