There’s a delay between the moment you click your mouse in a game and the moment the result actually appears on screen. This is known as end-to-end system latency. Is there a difference between Intel and AMD gaming laptops? I’ve tested two laptops, both with and without Optimus enabled, to find out.
What Is End-To-End System Latency?
Here’s how Nvidia defines end-to-end latency:
Basically it includes all links in the chain between the input device (mouse) and output (display screen). As we can see, there are many steps in between that could be dependent on the underlying hardware, which is why I want to compare Intel and AMD laptops.
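As a rough mental model (my own simplification, not Nvidia’s exact breakdown, and with purely illustrative numbers), the total latency is simply the sum of each stage’s delay in the chain:

```python
# Rough mental model: end-to-end latency as a sum of per-stage delays.
# Stage names and millisecond values are illustrative, not measured.
stages_ms = {
    "peripheral (mouse, USB polling)": 1.0,
    "game / CPU simulation": 5.0,
    "render queue": 3.0,
    "GPU render": 7.0,
    "display (scanout + pixel response)": 10.0,
}

total_ms = sum(stages_ms.values())
print(f"Total end-to-end latency: {total_ms:.1f} ms")
```

Any stage in that chain can vary with the underlying hardware, which is why the processor and the display path (Optimus on or off) can both matter.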
What Is Optimus?
I also want to find out how much of a delay Optimus adds. The diagram below shows how Optimus works in laptops.
Basically, when the Nvidia discrete graphics is rendering a game, the frames are first routed through the processor’s integrated graphics before reaching the display.
Some laptops have a MUX switch, which gives you the option to disable Optimus after a reboot. Upon doing this, the Nvidia discrete graphics will be connected directly to the display.
Because Optimus adds an extra link to the chain, I’m expecting the total system latency with it enabled to be higher.
With that understanding in mind, here are two main goals for the latency testing in this article:
- What is the total system latency difference between Intel and AMD gaming laptops?
- How much of an overhead does Optimus add to each?
I’ve done the testing using Nvidia’s LDAT (Latency Display Analysis Tool):
Basically this sits on the screen and measures the latency of the entire laptop.
The LDAT tool is placed on a section of the screen in a game that will change in luminance when a mouse click occurs. In this test, I’ve used CS:GO at 1080p with all settings set to minimum using the instructions provided by Nvidia in their LDAT user guide.
The LDAT tool contains a luminance sensor and a physical button which initiates a mouse click. It connects to the laptop via USB cable, and software running on the laptop measures the amount of time between the “mouse click” and when the content on screen changes in luminance. In the CS:GO example above, the mouse click fires the weapon in game, and the measured section of the screen changes from black.
The software makes it easy to perform multiple tests. I have tested each laptop configuration 100 times and taken the averages.
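The two numbers I’ll refer to throughout are the average and the standard deviation of those 100 samples. As a quick sketch (with made-up sample values, not my actual LDAT data), this is all the summary amounts to:

```python
import statistics

# Illustrative sample of click-to-photon times in ms (not real test data).
samples_ms = [30.1, 31.2, 30.8, 29.9, 31.5, 30.4, 30.7, 31.0]

mean_ms = statistics.mean(samples_ms)
stdev_ms = statistics.stdev(samples_ms)  # sample standard deviation

print(f"average: {mean_ms:.2f} ms, std dev: {stdev_ms:.2f} ms")
```

A lower standard deviation means the click-to-photon time varies less from shot to shot, which is why I treat it as a measure of consistency.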
All testing was done with the XMG Neo 15, a Tongfang chassis also sold as the Eluktronics Mech-15 G3, in its Intel and AMD configurations. Both laptops were tested with the exact same 16GB DDR4-3200 CL22 dual channel memory kit, both have RTX 3070 graphics with the same power limits (125-140W), and both have the same 1440p 165Hz screen (BOE0974).
Both laptops were also tested with their highest performance modes with fans set to maximum speed with a cooling pad for best results.
It’s important to note that despite both laptops having the same model of panel, there will still be a small difference in response times between the two units. I could have used the same external monitor to rule out this difference; however, that would bypass Optimus and not achieve our 2nd goal above.
Total System Latency In CS:GO
These are the total end-to-end system latency times from the testing. We’ll start with Optimus enabled, followed by Optimus disabled.
With Optimus enabled, the AMD laptop is 0.5ms faster on average. What I found more interesting was that the standard deviation on the AMD laptop was lower, so out of the 100 test samples, the AMD laptop was offering more consistent results.
Let’s see how things change with Optimus disabled.
The total latency drops with Optimus disabled, and although both laptops now have the same average latency of 28.2ms, the AMD system still has a lower standard deviation and thus more consistent results.
It’s important to note that not all laptops offer the ability to disable Optimus, but lower latency appears to be another benefit of being able to turn it off. I’ve already shown that disabling Optimus can boost FPS in games; now we know it can improve overall latency too.
An Easier-To-View Summary
I’ve summarized the total end-to-end system latency from each configuration in the table below for easier viewing:
| | AMD Ryzen 7 5800H | Intel i7-10875H |
|---|---|---|
| Optimus enabled | 30.5ms | 31.0ms |
| Optimus disabled | 28.2ms | 28.2ms |
Optimus enabled is adding more latency. This makes sense, given the display signal from the Nvidia discrete graphics must first go via the integrated graphics before reaching the screen.
With Optimus disabled, the integrated graphics are bypassed and the Nvidia GPU outputs directly to the screen.
Both laptops were averaging 28.2ms with Optimus disabled, however the Intel system was slightly slower when both had Optimus enabled.
Based on these results, we can say:
- Total AMD laptop latency increases by 8.16% with Optimus enabled
- Total Intel laptop latency increases by 9.93% with Optimus enabled
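These percentages follow directly from the averages: both laptops measured 28.2ms with Optimus disabled, so the Optimus-enabled averages can be back-calculated as a quick sanity check (rounded to one decimal place):

```python
# Back-calculating the Optimus-enabled averages from the measured
# Optimus-disabled average (28.2 ms) and the stated percentage increases.
disabled_ms = 28.2

for cpu, increase_pct in [("AMD Ryzen 7 5800H", 8.16),
                          ("Intel i7-10875H", 9.93)]:
    enabled_ms = disabled_ms * (1 + increase_pct / 100)
    print(f"{cpu}: ~{enabled_ms:.1f} ms with Optimus enabled")
```

This also lines up with the roughly 0.5ms gap between the two laptops observed with Optimus enabled.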
Although I have made an effort to use two laptops that are as close to identical as possible apart from their Intel and AMD processors, the sample size is too small to draw sweeping conclusions.
Given both laptops scored the same with Optimus disabled, it may be the case that the difference in processor is otherwise too insignificant to matter. It could also change based on the game; I chose CS:GO as it is known to be fairly processor dependent.
With Optimus enabled, it may be the case that an Intel + Optimus setup adds more overhead compared to AMD + Optimus. At least this is what this data points to, but again I need to test a lot more laptops before making such claims.
Regardless, I think these are still interesting results, and I will likely start testing this on all gaming laptops going forward. Once I have a large sample size it may be easier to identify trends.