I’ve been sharing game benchmarks on laptops and PC hardware for years, but why do I always use graphs instead of showing actual game play?
Here’s what it comes down to: Graphs are a more efficient way of displaying lots of information in an accurate manner that can be fairly compared.
Let’s break this down.
Graphs are a more efficient way of displaying information
That is, after all, the whole point of a graph: to concisely display information. It's much easier and faster for the viewer to take away the results from a graph than from watching game play.
Take Cyberpunk 2077 for example. I test this game at 6 different setting presets on all gaming laptops. Each setting preset is tested 3 times for 90 seconds each, so once you include reloading the game after each test, we’re looking at about 30 minutes of testing time to produce the graph below:
I honestly doubt anyone wants to sit through ~30 minutes of game play to see the same results. Even if I included all 6 setting presets on screen side by side, that's still 270 seconds if I were to show all three test runs. Now, you could argue that showing just one test run is fine. That's still 90 seconds, longer than it takes to digest the graph above, but sure, much more reasonable than half an hour.
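The time figures above follow from simple arithmetic. As a quick sketch (the numbers are the ones stated above; nothing else is assumed):

```python
# Testing time for one game on one laptop, using the figures above.
presets = 6            # setting presets tested per game
runs_per_preset = 3    # test passes per preset
seconds_per_run = 90   # length of each pass

# Raw capture time, before reloading the game between passes.
capture_seconds = presets * runs_per_preset * seconds_per_run
print(capture_seconds / 60)  # 27.0 minutes of capture alone, ~30 with reloads

# Even with all 6 presets shown side by side, you still sit through every run.
side_by_side_seconds = runs_per_preset * seconds_per_run
print(side_by_side_seconds)  # 270 seconds
```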
Graph data is easier to fairly compare
There's a reason I take averages from 3+ test passes: there's always some run-to-run variance. How much will vary by game and may even depend on the hardware to some degree.
You're not going to accurately pick up this sort of thing watching game play. Good luck watching three side-by-side videos and doing the math on the constantly changing average and 1% low FPS values.
Showing averaged values in graphs makes comparisons more accurate, as we're looking at data from multiple test runs.
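To make the averaging concrete, here's a minimal Python sketch of how average and 1% low FPS could be combined across runs. The function names are my own, and the 1% low definition used here (mean FPS of the slowest 1% of frames) is one common convention; capture tools differ in exactly how they compute this.

```python
def run_metrics(frame_times_ms):
    """Average FPS and 1% low FPS for a single run's frame times (in ms)."""
    fps = [1000.0 / t for t in frame_times_ms]
    avg_fps = sum(fps) / len(fps)
    # One common definition: mean FPS of the slowest 1% of frames.
    worst = sorted(fps)[: max(1, len(fps) // 100)]
    low_1pct = sum(worst) / len(worst)
    return avg_fps, low_1pct

def combined(runs):
    """Mean of each metric over several passes of the same test."""
    metrics = [run_metrics(r) for r in runs]
    avg = sum(m[0] for m in metrics) / len(metrics)
    low = sum(m[1] for m in metrics) / len(metrics)
    return avg, low
```

A single pass with one bad stutter can drag its 1% low well under its average, which is exactly the kind of detail that's hard to eyeball across multiple side-by-side videos but trivial to read off a graph.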
Recording game play can affect results
The act of screen recording requires system resources. By capturing game play, you may inadvertently be taking CPU/GPU/RAM/storage resources away from the game that it could otherwise use to perform better, which gives inaccurate results.
Many screen recording options, such as Nvidia ShadowPlay, use a surprisingly small amount of resources these days, and the impact is often negligible; however, it still adds overhead compared to not recording at all.
Bypassing this with an external capture device might actually make things more misleading, especially on laptops. It's an entirely different topic, but in short: many laptops use Nvidia Optimus to switch between integrated and discrete graphics, and Optimus can be bypassed if an external display output connects directly to the laptop's discrete GPU. This is the same reason that using an external screen on a gaming laptop can significantly boost FPS.
Other common concerns
These are some of the arguments I often hear against graphs in favor of seeing game play.
Watching game play lets me experience how the game will perform
Given that YouTube videos are capped at 60 FPS at best, while 30 FPS is still more common, you're not actually experiencing the game through the video in the same way as playing it yourself.
This is especially true with high refresh rate panels and games that can hit higher frame rates.
But you can just fake the graphs!
Some people argue that the data in my graphs can be faked, as opposed to seeing the game play for themselves. This is a pretty weak argument in my opinion.
Firstly, if my data were faked, there wouldn't be countless other people (both reviewers and viewers alike) reporting similar results over the years. Yes, collecting the data takes a lot of time, but it's essentially my job and I enjoy doing it. It's not difficult, I have access to the hardware, and there's no reason to fake results and lose the trust I've built with viewers over the years. In short, there's nothing to gain.
Secondly, it would be quite easy to fake game play as well. Just because you can see a screen capture doesn't mean you know all the details of the hardware it's running on. Sure, it would be harder to fake if I were literally pointing a camera at a laptop, but the earlier points still stand.
At the end of the day, in either case there’s a level of trust you’re giving to the reviewer, so let the data they’ve produced over time speak for itself.
Your results don’t match mine, so you must be faking them!
It’s important to understand that you’re often not going to be able to directly compare your test results with mine, or any other reviewer for that matter.
Take Fortnite for example, the performance in that game will depend entirely on where in the map the test takes place and what other players were doing during the test run. Ideally, a reviewer will attempt to test in the same place so that results are comparable within their own content. Personally I do this using the replay feature so that the same scenario can be consistently tested.
This means that unless you know where in the game I generate my replays, you cannot directly compare your results with mine, or my results with any other reviewer's.
I have considered screen recording the test runs I do through all of the games I benchmark and providing them somewhere on this website, which would help others compare against my results, but I haven't completed this yet.
You can only really compare directly with others by using the built-in benchmarks that some games provide, as these attempt to perform the same reproducible test run.
This is why I choose to test a mixture of games: some with built-in benchmarks, so viewers can attempt to see how their hardware compares, and some with general game play, as this often better reflects actual performance than many built-in benchmarks do.
Additionally, when I test all setting levels in a game, these are run from lowest up to highest in the same session. I often get people saying that their laptop performs better in, say, the Shadow of the Tomb Raider benchmark at highest settings after one or two test passes. By the time I'm getting to highest settings, the test has already been run multiple times at lower settings and warmed up the laptop, as opposed to running it from a cold start. This matters much more for laptops than for a desktop PC, as thermal and power limits are more constrained, and it can definitely affect results.
I test all laptops this way while most others do not, so for this reason alone I'd only consider my benchmark data comparable within my own content. I feel this better represents how you'd actually play a game, after heat saturation kicks in, not just for the first few minutes from a cold start.
Is game play useless?
I’m not saying people that show game play are wrong or that there’s no place for that sort of content. I am just giving the reasons why I personally prefer to show my data visually through graphs.
In my opinion, the main benefit of watching game play is so that you can see how the frame counters, clock speeds, power limits and resource usage metrics look in real time with an overlay such as MSI Afterburner.
Personally I prefer to test a single game when collecting thermal data for comparable results, and again I show these in graph format, but I can see why the above information would be appealing.
Regardless, for the reasons outlined above, I will continue using graphs to display the game data that I collect. If you still want to see actual game play, then my videos are not the content you’re after.