If you missed the first post, which was an overview of the different levels of optimisation a game can have, then you can find it here:
99% of all graphics cards are made by the duopoly of AMD and NVIDIA (NV). As well as controlling graphics hardware, both companies have expanded into software, creating a middle layer that sits between the graphics card hardware and the game software. Both companies have a similar box of tricks, and I'll explain a little of what they each offer.
While they are similar in many regards, the major difference at the moment is how much influence each company can have both after a game is released and, more importantly, during a game's development. Current graphics card poster child Watch_Dogs is the game in focus today.
NVIDIA’s armoury:
- GeForce Experience: a program providing one-click optimisation for all games running DirectX 9, 10, or 11.
- HBAO+ Ambient Occlusion: an improved way of rendering the soft shadows that objects cast on their surroundings, meaning more realistic shadows.
- TXAA Anti-Aliasing: a newer anti-aliasing technique that smooths motion and reduces jagged or flickering object edges.
- G-Sync monitors: monitors that synchronise their refresh rate to the GPU's output, rather than refreshing at a constant rate. This aims to remove any instance of screen tearing and display stutter (a toy sketch of the idea follows this list).
- GameWorks: a package of tools for developers to use during the middle stages of building games. This is one I'll be coming back to later.
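As an aside, the idea behind G-Sync is easy to show in code. Below is a toy C++ simulation, using invented frame times, of how a fixed 60 Hz refresh forces a finished frame to wait (stutter) or be swapped mid-scan (tearing), while an adaptive refresh simply displays it the moment it's ready. This illustrates the scheduling concept only, not NVIDIA's actual hardware.

```cpp
#include <cmath>
#include <iostream>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // fixed 60 Hz monitor
    // Deliberately uneven GPU frame times, in milliseconds (invented).
    const double frame_ms[] = {14.0, 22.0, 16.0, 30.0, 15.0};

    double gpu_clock = 0.0;
    for (double ft : frame_ms) {
        gpu_clock += ft;  // the moment the GPU finishes this frame

        // Fixed refresh: the monitor scans out on its own schedule, so the
        // finished frame must wait for the next scan (or tear mid-scan).
        double next_scan = (std::floor(gpu_clock / refresh_ms) + 1.0) * refresh_ms;

        // Adaptive refresh: the monitor refreshes when the frame is ready.
        std::cout << "frame ready at " << gpu_clock
                  << " ms: fixed refresh shows it at " << next_scan
                  << " ms (idle " << next_scan - gpu_clock
                  << " ms), adaptive refresh shows it at " << gpu_clock << " ms\n";
    }
}
```

The worse the frame times line up with the fixed schedule, the bigger the idle gaps get; adaptive refresh makes them vanish by definition.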
GeForce Experience uses cloud preference data to identify choices for optimisation. The first image that conjured for me was people effectively optimising their games by majority vote. Though that's not quite how it works, it's an interesting idea.
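For illustration, here is what literal "optimisation by majority vote" might look like: pick whichever preset most owners of the same card chose. This is a toy C++ sketch with invented numbers, not how GeForce Experience actually works, as noted above.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <utility>

int main() {
    // Hypothetical crowd data: (GPU model, settings preset) -> user count.
    const std::map<std::pair<std::string, std::string>, int> votes = {
        {{"GTX 770", "High"},   5200},
        {{"GTX 770", "Ultra"},  1800},
        {{"GTX 770", "Medium"},  900},
    };

    const std::string my_gpu = "GTX 770";
    std::string best_preset;
    int best_count = 0;

    // Pick the preset most users of this GPU model chose.
    for (const auto& [key, count] : votes) {
        if (key.first == my_gpu && count > best_count) {
            best_count = count;
            best_preset = key.second;
        }
    }
    std::cout << "Suggested preset for " << my_gpu << ": " << best_preset
              << " (" << best_count << " users)\n";
}
```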
My main criticism here is that players can optimise their game and graphics card settings on first launch, or even straight after purchase, before ever seeing the game's default presentation. To me this doesn't make much sense: it's like listening to music only ever through a strong preset EQ, without ever hearing it unprocessed. Yes, it's a matter of personal preference, but it's also something that could easily go wrong.
I can imagine people unfamiliar with the intricacies of graphics cards and visual settings (i.e. most people who don't build their own PCs, and therefore the people most likely to need a one-click optimisation program) using the program by default, under the assumption that it makes their game look and run as well as possible. However, if there were then problems with the game, whether through an incorrectly applied setting in the program, a driver/graphics card conflict, or an idiosyncrasy of the user's hardware setup, those problems could be attributed to the game itself, or the game could be negatively reviewed for poorer performance than expected.
Back to the music analogy: imagine someone saying an album didn't sound good to them, when it only sounded bad because their EQ settings made it sound different from what the artist intended. That doesn't completely invalidate their opinion of the album, but it does mean their review should be taken with a large pinch of salt.
AMD’s armoury:
- AMD OverDrive: resource-management software that can change fan speeds, automatically overclock components, and set memory profiles to optimise RAM usage for gaming.
- AMD Dual-Core Optimiser: bypasses Windows timing APIs (APIs are the interfaces through which different programs talk to each other) to use dual-core CPUs more efficiently in games.
- CrossFire: a hardware system that lets multiple GPUs work together. This means someone can install two graphics cards in the same PC, and CrossFire can use both simultaneously.
- AMD Gaming Evolved (otherwise known as Raptr): a software package including one-click optimisation, driver management, game recording, and a social community with achievements and play-tracking. Part of Catalyst Control Centre, which controls advanced graphical features and in-depth visual settings. This can be seen as AMD's equivalent of GeForce Experience.
- Mantle: a low-level graphics API and future alternative to DirectX. Mantle aims to remove as many layers between the programmer and the hardware as possible, reducing stress on the CPU so the whole system isn't slowed down. It also focuses on multi-core PCs, sharing instructions and commands efficiently between all cores; a sketch of that multi-core idea follows this list. The first game using Mantle is Battlefield 4, while future games include Mirror's Edge 2, Star Wars Battlefront 3, and Dragon Age: Inquisition.
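To give a feel for what "sharing commands between all cores" means in practice, here is a minimal C++ sketch of parallel command-buffer recording, the model Mantle popularised. Every type and function name here is hypothetical, invented purely for illustration; this is not Mantle's real API.

```cpp
#include <thread>
#include <vector>

// Hypothetical stand-ins for graphics-API objects (not real Mantle types).
struct CommandBuffer { /* recorded GPU commands would live here */ };

CommandBuffer record_commands_for_chunk(int chunk) {
    CommandBuffer cb;
    // ... record the draw calls for this chunk of the scene ...
    (void)chunk;
    return cb;
}

void submit_to_gpu(const std::vector<CommandBuffer>& buffers) {
    // ... one cheap submission instead of thousands of driver calls ...
    (void)buffers;
}

int main() {
    const int num_threads = 4;  // one recording thread per CPU core
    std::vector<CommandBuffer> buffers(num_threads);
    std::vector<std::thread> workers;

    // Record command buffers on all cores at once; with a thicker API like
    // DirectX 9/11, this work would mostly funnel through a single thread.
    for (int i = 0; i < num_threads; ++i)
        workers.emplace_back([&buffers, i] {
            buffers[i] = record_commands_for_chunk(i);
        });
    for (auto& w : workers) w.join();

    submit_to_gpu(buffers);  // the GPU consumes all buffers in one batch
}
```

The design point is that the expensive part (recording commands) scales across all cores, while the driver-facing part (submission) happens once, which is where the reduced CPU stress comes from.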
Flaws
As you would expect, all of the software mentioned here is proprietary. In some cases, such as the Dual-Core Optimiser, this isn't much of a problem, as other software can do the same job. Driver management, while convenient in theory, may conflict with other maintenance programs or schedules, much as two anti-virus suites on one computer can conflict if one considers the other to be malware. Someone wanting to try CrossFire, on the other hand, may need to replace or upgrade the majority of their PC to have compatible parts.
Mantle is the most problematic aspect, as games using Mantle show significant performance increases on PCs using AMD hardware and graphics cards. While systems with Intel CPUs can still benefit from Mantle, AMD cards, especially when combined with CrossFire, receive much greater performance gains.
This could threaten NVIDIA, as in theory a mid-range AMD card could be boosted by Mantle up to the level of a high-end NVIDIA card, while people looking for the highest possible framerate would be drawn to AMD cards. However, Mantle features are application-dependent, not linked to specific hardware; implementing Mantle therefore relies on a game being developed with Mantle in mind from the start. This puts it parallel with NVIDIA's GameWorks, rather than elevating it to a class of its own.
The main question arising from all of this is: how do proprietary middleware and tools actually affect games?
Small differences between games across the two systems are relatively common, and are usually not a big deal; they are often caused by a new graphical improvement or technology not yet working fully on the other company's hardware. For example, Tomb Raider (2013) faced some criticism on release because AMD's TressFX hair and fur physics system was faulty on NVIDIA-based PCs. While this did mean a slight graphical imbalance, it wasn't a major flaw: the game was still fully playable, just missing finer details.
Watch_Dogs, on the other hand, has been facing almost every possible graphical controversy since E3 2012, including the expectation of performance unfairly skewed towards NVIDIA systems.
Watch_Dogs and GameWorks
Watch_Dogs is strongly connected to NVIDIA: built with heavy input from NV developers and GameWorks, using all of the NV box of tricks such as TXAA and HBAO+ ambient occlusion, and even given away free with some NV cards.
The pervasive use of GameWorks in developing Watch_Dogs has caused controversy amid claims that Ubisoft and NVIDIA's close relationship has shut AMD out entirely, crippling performance on AMD systems. AMD officials even claimed, in a pre-release article from Forbes, that developers using GameWorks can be prevented from using any AMD-suggested optimisations.
NVIDIA refuted that argument, claiming nothing in GameWorks prevents AMD optimisation. This is technically correct, in that AMD can still try to optimise the game. However, the code behind NVIDIA's extra features like TXAA and HBAO+ is not shared with AMD's developers.
This is like NVIDIA making a cake and leaving AMD to bake it correctly without knowing the ingredients, then arguing that nothing was wrong because they'd given AMD access to the cake. Technically correct, but not much good for the people who actually want to eat the cake.
After Watch_Dogs was actually released, the issues proved less one-sided than these articles suggested, largely because the game performed so erratically even on strong systems that performance was difficult to compare. The trend appears to be that at lower settings, GameWorks meant a performance boost even for AMD; at high settings, however, excessive video RAM requirements caused by poor optimisation cancelled out the effects of GameWorks.
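To see how higher settings can blow past a video RAM budget, here is a rough back-of-envelope in C++. The texture count, sizes, and format are all invented assumptions (and real engines compress their textures), so the point is the scaling, not the exact figures.

```cpp
#include <iostream>

int main() {
    // All numbers below are assumptions for illustration only.
    const int texture_count = 400;        // textures resident at once (assumed)
    const double mip_overhead = 4.0 / 3;  // a full mip chain adds roughly 33%
    const int resolutions[] = {512, 1024, 2048};  // e.g. Medium / High / Ultra

    for (int res : resolutions) {
        // 4 bytes per pixel (uncompressed RGBA8): per-texture size,
        // then the total across every resident texture.
        double bytes = static_cast<double>(res) * res * 4
                       * mip_overhead * texture_count;
        std::cout << res << "x" << res << " textures: "
                  << bytes / (1024.0 * 1024 * 1024) << " GB\n";
    }
}
```

Doubling texture resolution quadruples memory use, which is how an "Ultra" texture setting can swamp the 2-3 GB cards that were common when Watch_Dogs shipped, regardless of what GameWorks features are doing.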
So… well done to NVIDIA, AMD, or neither?
The lesson from Watch_Dogs appears to be that throwing every possible graphical toy at a game doesn't help if the game hasn't been optimised properly. That, or: if too many cooks spoil the broth, too few cooks don't help much either.