While AMD seems to have made up with Slightly Mad Studios, at least if this tweet from Taylor is anything to go by, the company is facing yet another supposedly GameWorks-related struggle with CD Projekt Red’s freshly released RPG The Witcher 3. The game makes use of several GameWorks technologies, most notably HBAO+ and HairWorks. The latter, which adds tens of thousands of tessellated hair strands to characters, dramatically reduces frame rates on AMD graphics cards, sometimes by as much as 50 percent.
I got bitten by this just the other day. I’m currently enjoying my time with The Witcher 3 – go out and buy it, it’s worth your money – but the first few hours of the game were plagued by stutter and sudden framerate drops. I was stumped, because the drops didn’t occur out in the open world, but only when the head of the player character, a guy named Geralt, came close to the camera or was in focus during a cutscene. It didn’t make any sense, since I have one of the fancier Radeon R9 270X models, which should handle the game at the highest settings just fine.
It wasn’t until a friend asked, “uh, you’ve got NVIDIA HairWorks turned off, right?” that I checked. Turns out, it was set to “Geralt only”. Turning it off completely solved all the performance problems. It simply hadn’t registered with me that this feature is pretty much entirely tied to NVIDIA cards.
While I would prefer all these technologies to be open, the cold, harsh truth is that in this case they give NVIDIA an edge, and I don’t blame the company for keeping them closed – we’re not talking about crucial communication protocols or internet standards, but an API to render hair. I do blame the developers of The Witcher 3 for not warning me about this. Better yet: automatically disable and/or hide NVIDIA-specific options for Radeon owners altogether. It seems like a no-brainer way to prevent disgruntled customers. Not a big deal – but still.
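That suggestion – hiding vendor-specific toggles from people whose hardware doesn’t match – could be sketched roughly like this. To be clear, this is purely illustrative: the function name, the option list, and the vendor strings are my own assumptions, not anything from the game’s actual settings code.

```python
# Illustrative sketch only: a hypothetical settings-menu filter that hides
# vendor-specific graphics options from mismatched GPUs. Option names and
# vendor strings are assumptions for the example, not real game code.

VENDOR_SPECIFIC_OPTIONS = {
    "HairWorks": "NVIDIA",  # NVIDIA GameWorks feature
    "HBAO+": "NVIDIA",      # also a GameWorks technology
    "TressFX": "AMD",       # AMD's comparable hair-simulation tech
}

def visible_options(all_options, gpu_vendor):
    """Return only options that are vendor-neutral or match this GPU's vendor."""
    return [
        opt for opt in all_options
        # Vendor-neutral options default to the current vendor, so they pass.
        if VENDOR_SPECIFIC_OPTIONS.get(opt, gpu_vendor) == gpu_vendor
    ]

menu = ["Resolution", "HairWorks", "HBAO+", "TressFX", "VSync"]
print(visible_options(menu, "AMD"))     # ['Resolution', 'TressFX', 'VSync']
print(visible_options(menu, "NVIDIA"))  # ['Resolution', 'HairWorks', 'HBAO+', 'VSync']
```

A gentler variant would grey the options out with a warning instead of hiding them, so curious Radeon owners can still try HairWorks knowing the cost.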
There was Batman: AA for nVidia, then Deus Ex: HR for AMD, then Tomb Raider for AMD, and all the others; now it’s TW3 for nVidia. Yeah, oh, life is so sad for poor old AMD. How is that TressFX working out on my nVidia card? Oh wa-
Game tech lock-ins, it takes one to know one.
I have a similar opinion. At the end of the day, Nvidia provides a feature which works better on their cards than on AMD’s. This is a feature that ADDS things to the games that use it. If these guys decided to implement this feature rather than leave it out just because AMD cards aren’t as good at it, then good for them. Yes, it does suck a bit from a consumer perspective, but I personally would rather have a game that works spectacularly on certain hardware than one that’s mediocre on everything. Additionally, it’s an optional feature, so it’s not like it will always affect performance on AMD cards.