While AMD seems to have made up with Slightly Mad Studios, at least if this tweet from Taylor is anything to go by, the company is facing yet another supposedly GameWorks-related struggle with CD Projekt Red’s freshly released RPG The Witcher 3. The game makes use of several GameWorks technologies, most notably HBAO+ and HairWorks. The latter, which adds tens of thousands of tessellated hair strands to characters, dramatically decreases frame rate performance on AMD graphics cards, sometimes by as much as 50 percent.
I got bitten by this just the other day. I’m currently enjoying my time with The Witcher 3 – go out and buy it, it’s worth your money – but the first few hours of the game were plagued by stutter and sudden frame rate drops. I was stumped, because the drops didn’t occur out in the open world, but only when the head of the player character, Geralt, came close to the camera or was in focus during a cutscene. It didn’t make any sense, since I have one of the fancier Radeon R9 270X models, which should handle the game at the highest settings just fine.
It wasn’t until a friend asked “uh, you’ve got NVIDIA HairWorks turned off, right?” that I found the culprit. Turns out, it was set to “Geralt only”. Turning it off completely solved all my performance problems. It simply hadn’t registered with me that this feature is pretty much entirely tied to NVIDIA cards.
While I would prefer all these technologies to be open, the cold and harsh truth is that in this case, they give NVIDIA an edge, and I don’t blame them for keeping them closed – we’re not talking crucial communication protocols or internet standards, but an API to render hair. I do blame the developers of The Witcher 3 for not warning me about this. Better yet: automatically disable and/or hide NVIDIA-specific options for Radeon owners altogether. Doing so seems like a no-brainer way to prevent disgruntled customers. Not a big deal – but still.