While there's a certain appeal to Stadia's near-hardwareless design, cloud gaming means giving up control over your game performance. Instead of hand-picking the best possible graphics card, processor, or monitor your budget allows, Google Stadia offers three resolution settings for your devices (minus 4K on PC, currently), and will automatically lower those settings if it detects bandwidth instability.
There are a few things you can do to get the most out of Stadia other than up your internet package, but how will it compare to PC gaming, or console gaming for that matter? That's a simple question with a complicated answer. I covered a lot of that in my Stadia review, but my testing process went beyond what I covered there, getting baseline readings on certain games on local PC, limiting bandwidth to simulate different internet speeds, and so on.
Local PC and internet specs
CPU: AMD Ryzen 7 2700X
GPU: Nvidia GeForce GTX 1080 Ti
Motherboard: Asus ROG Crosshair VII Hero (Wifi)
Memory: G.Skill TridentZ RGB 16GB (2 x 8GB) DDR4-3200
Storage: Intel 760p 1TB NVMe PCIe SSD; Seagate 2TB HDD 7200 RPM
PSU: EVGA SuperNOVA 750W 80+ Gold
Cooler: NZXT Kraken X52
Case: NZXT H700i
Monitor: ASUS MG248QR 24-inch 1080p HD 1ms 144Hz
Keyboard: Asus ROG Strix Flare
Mouse: Logitech G403
Modem: Netgear CM600
Router: Asus ROG Rapture GT-AC2900
Internet: Cable, Spectrum, 400+Mbps wired, around 115Mbps wireless
Television: TCL Roku 50-inch LED 4-series 4K UHD TV with HDR
What I found rings true for other cloud gaming platforms, like GeForce Now and Shadow: singleplayer games are mostly fine to play through the cloud, but any cloud gaming platform is going to be a non-starter for people who only play multiplayer games, even with a good connection. Here's a deeper look at side-by-side performance of Stadia vs. a local PC.
There were a few limitations to my tests: 4K isn't available via Stadia on PC; I could not connect the Stadia controller to my PC; and I could not connect my Chromecast to an Ethernet cable due to the layout of my home. (The only cable outlet is in the bedroom, which is on the opposite side of my home from the TV with the connected Chromecast.) However, I was able to conduct latency tests both on my TV and PC at the maximum wireless bandwidth possible, 150Mbps, as well as compare the results to local PC performance. While Google Stadia recommends a 35Mbps minimum for 4K gaming, we recommend at least 100Mbps (and 50Mbps for 1080p) if you want smooth play with a stable resolution.
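Those bandwidth thresholds boil down to a simple check. Here's a minimal sketch using only the figures named above; the function and dictionary names are my own, not anything Google publishes:

```python
# Bandwidth guidance from this article (Mbps). Google's stated minimum for
# 4K streaming is 35Mbps, but our recommendations for stable play are higher.
OUR_RECOMMENDED = {"1080p": 50, "4K": 100}

def is_connection_sufficient(resolution: str, measured_mbps: float) -> bool:
    """True if measured bandwidth meets our recommendation for a resolution."""
    return measured_mbps >= OUR_RECOMMENDED[resolution]

print(is_connection_sufficient("4K", 115))  # True: ~115Mbps wireless clears the 100Mbps bar
print(is_connection_sufficient("4K", 35))   # False: Google's 35Mbps floor falls short of our bar
```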
All the testing below was performed on a 5GHz wireless connection, and latency testing was filmed at 240 fps, counting the frames between the key/button press and the action appearing on screen to get the latency in milliseconds. The native latency of my TV (which is quite low) and monitor factors into each result, but most of it comes from my internet connection and how many 'hops' it takes to connect my home internet to Google's servers and back again.
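The frame-counting arithmetic works out like this. A quick sketch, assuming a 240 fps capture (the function name is mine, and the 15-frame example is back-calculated from the roughly 63 ms local PC result):

```python
# Convert a frame count from 240 fps slow-motion footage into input latency.
# Each captured frame spans 1000 / 240 ≈ 4.17 ms.
CAPTURE_FPS = 240

def latency_ms(frames_between_press_and_action: int) -> float:
    """Milliseconds of input latency for a given number of captured frames."""
    return frames_between_press_and_action * 1000 / CAPTURE_FPS

# Example: ~15 frames between the button press and the on-screen action
# corresponds to the ~63 ms measured for Tomb Raider on the local PC.
print(round(latency_ms(15), 1))  # 62.5
```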
| Game | Local PC 1080p | Stadia PC 1080p | Stadia TV 1080p | Stadia TV 4K |
| --- | --- | --- | --- | --- |
| Shadow of the Tomb Raider | 63 ms | 125 ms | 213 ms | 279 ms |
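To put those numbers in perspective, here's a quick sketch that computes how much latency each Stadia configuration adds over local play, using the Tomb Raider values from the table above:

```python
# Input latency measurements (ms) for Shadow of the Tomb Raider,
# taken from the table above.
measurements = {
    "Local PC 1080p": 63,
    "Stadia PC 1080p": 125,
    "Stadia TV 1080p": 213,
    "Stadia TV 4K": 279,
}

baseline = measurements["Local PC 1080p"]
for setup, ms in measurements.items():
    # Added latency relative to playing on the local machine.
    print(f"{setup}: {ms} ms (+{ms - baseline} ms over local PC)")
```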
Local PC latency (1080p, 100-150Mbps, keyboard/mouse)
When it comes to latency, there is no contest: the local PC beats Stadia every time. With a local installation, and especially with wired peripherals, you're shaving dozens, sometimes hundreds, of milliseconds off your input latency; there's a direct path between your inputs and the game installed on your machine, so your PC doesn't take long to react. Stadia, by contrast, takes inputs from a wireless controller connected over your wi-fi, processes them on its remote servers, and then feeds the image back through your Chromecast.
For comparison, I established a baseline for input latency on my local machine. As you can see from the videos below, not only is the performance of each game free of skips and stutters (aside from a brief moment or two), but even with the latency tests slowed down, input lag is virtually undetectable.
Much of that is due to the hardware in my machine, and it's possible the games could perform a little better still with a current-gen processor and graphics card, but since I'm only running a 1080p monitor, what I have is more than sufficient. And my internet speed helps keep packet loss, latency jitter, and other forms of lag at bay.
Destiny 2 latency (high graphics) - 83ms
83 ms of latency between my finger pressing all the way down on the left mouse button and the muzzle flash emitting from the rifle.
Shadow of the Tomb Raider latency (high graphics) - 63 ms
63 ms of latency between my finger hitting the actuation point on my spacebar and Lara Croft bending her knees to jump into the air.
Both games played near-flawlessly. Any sudden camera movement was my hand jerking the mouse to the side. The low input latency in both games is a clear indicator that my system isn't bottlenecked or suffering from any serious hardware limitation.
Stadia PC latency (1080p, 100-150Mbps, keyboard/mouse)
You can see a visual difference in input latency between local PC and Stadia on PC, even though it's small. If you're not paying close attention while playing on Stadia, whether on TV or PC, you won't notice the minuscule increase in latency, but if you go back and forth between the two, you will.
Going from local PC to Stadia on PC not only increased the latency; there was a noticeable difference in visual quality, too. A lot of that comes down to the fact that Stadia is a real-time video stream that constantly adjusts itself to your bandwidth, and you have no control over that. You also don't get the same granular control over graphics quality with Stadia as you do on a PC. Settings like texture quality, shadow quality, and texture filtering are all missing from Stadia, replaced with a simple three-choice resolution setting: 4K, 1080p, and 720p.
These settings are more geared toward controlling the amount of data your network uses in any gaming session, not necessarily controlling the resolution, although playing in 1080p on TV is noticeably duller than 4K.
Destiny 2 - 150 ms
150 ms is nearly double the input lag of playing the same game on a local PC. Generally, multiplayer games will end up with more input latency because the connection still needs to ping the game servers run by the developers/publishers.
Shadow of the Tomb Raider - 125 ms
At 125 ms, Tomb Raider's input latency also roughly doubled when played over the internet on PC. It's kind of amazing (in both a good and bad way) how much input latency cloud gaming can add.
Notice the decrease in visual quality from playing on a local machine to a browser. While there isn't much of a difference in gameplay, it's obvious that these games are being streamed over the cloud rather than played locally.
Also, note the rubberbanding in Tomb Raider. The sudden, jerky movements you see are not my hand moving the mouse. That's the game randomly changing angles.
Stadia TV latency (4K, 100-150Mbps, Stadia controller)
Destiny 2 - 167 ms
Making the move from PC to TV didn't make a huge difference in Destiny 2's case, even at 4K. But even though the input latency only went up to 167 ms in this test, there was still noticeable lag during my game, like someone trying to pull a heavy object with a rope, more stuttering than rubberbanding.
Shadow of the Tomb Raider - 279 ms
279 ms is more than double the input latency of playing Stadia on PC at 1080p. (And it's not much better at 1080p on TV, as you'll see below.) I'm not sure what caused this massive increase, only that it was consistent every time I ran the test. The one variable that might make a difference is the 'performance versus quality' option, which is only available when playing Tomb Raider on TV. I had mine set to quality, which doesn't make much sense as a choice for a game running on a cloud server anyway. It should be able to do both, like a PC game.
I didn't experience the same rubberbanding here that I did on PC, however. Despite the massive input latency, the game ran consistently and smoothly.
Stadia TV latency (1080p, 100-150Mbps, Stadia controller)
Destiny 2 - 163 ms
As I already mentioned, Destiny 2 input latency clocked in at 163 ms on 1080p—not that different from 4K. The game did perform slightly better, although it still felt like it was struggling at times to keep up with my inputs.
Shadow of the Tomb Raider - 213 ms
213 ms is a much better number than at 4K, but it's still a massive jump from Stadia on PC, particularly when compared to gaming on a local machine configured like mine.
Again, rubberbanding wasn't an issue here and gameplay felt much smoother than Destiny 2 at both resolutions.
Of course, Google Stadia has more games than just the two tested here, so individual input latency results of each of those games will vary. But this should give you a good idea of what to expect in terms of how the input latency changes between a local PC, Stadia on PC, and Stadia on your TV. You'll need to keep your own monitor and TV's latency in mind, and enable any special 'gaming modes' if those options are available to get the best possible results.
What's missing: Graphics settings and more
Our guide to how to use Stadia on PC lays out your current options, like compatible controllers. But one of the most important things to know about how Stadia compares to PC gaming is what it's missing.
In the Stadia games we've tested, there are no PC-style graphics options. On PC, Shadow of the Tomb Raider and Red Dead Redemption 2 have extensive options that make the games look dramatically better than on consoles. Stadia has no custom resolution options beyond its three presets, so aspect ratios other than 16:9 are off the table. Mod support is nonexistent, of course. And while it's possible to take screenshots with a button on the Stadia controller, there's currently no way to save and share those screenshots.
Stadia is currently limited to 1080p, 60 fps on PC, and 4K, 60 fps on the Chromecast Ultra.