Sidenote: What Would AMD Need to Win Me Over to a Radeon? and a Note on Patches

First, the regularly scheduled content note – both of the MMOs I discuss here just received patches that are, each in its own way, very major. I haven’t played enough of either to have fully formed thoughts, but here’s my bullet list:

FFXIV:
-I GOT A HOUSE, FINALLY
-Bozjan Southern Front is cool, but Resistance Weapon quests are a bit disjointed
-Cool first music track from the 5.3 trailer is in Bozja!

WoW:
-The ICC tower in the login screen tilts out of symmetry, and now that it’s been pointed out to me, it bugs me
-The level squish is fine so far at level 50 (no content problems personally – I soloed some Mythic Nighthold on my Druid and it felt about the same as the night before the patch)
-I’d already been goofing with the new character creation options on beta, but they’re still really cool!
-Not a huge amount of class changes felt thus far, but I’ve only really played Balance Druid
-Feels really weird having prepatch now without a release date for the expansion or any actual new content (yet)

Anyways, with that stuff said, expect more deep dive posts on these topics later this week!

So with AMD’s next big announcement two weeks out, I wanted to explore a topic I’ve been grazing in most of my GPU posts over the last month and change – whether AMD can compete with Nvidia, why I think that competition poses some challenges for them, and what they would need to do to shift me personally towards their products.

In the simplest sense, let me put it this way – in the price tier I usually buy graphics cards in, AMD has been a non-entity for around 5 years now. The Radeon R9 Fury X (2015) and R9 290X (2013) were the last majorly competitive AMD cards. Both challenged the top of Nvidia’s stack at the time in a great way (prompting Nvidia to release a Titan to maintain the “fastest” crown) and were great offerings, hampered only by the incredibly loud stock cooling on the R9 290X (the Fury X dodged this by having a liquid cooler out of the box!). Since then, AMD has competed largely as they did in the CPU space until Ryzen – with cheaper cards that are “good enough” at their respective price points and can often trade blows with the Nvidia offerings there.

Unlike with AMD’s CPU division, however, it is hard to pinpoint an exact cause for the fall of the Radeon Technologies Group. There’s a long list of failings: a lack of marketing to win back mindshare from Nvidia; a mainstream-focused strategy that leaves AMD out of halo-product comparisons or makes their best cards look weak; leadership struggles in the Vega era under the overambitious, underdelivering Raja Koduri (who now seems to be repeating the same trajectory with Intel’s graphics unit); and a shortage of R&D funding and meaningful support during the Ryzen retooling, which led to long-lasting driver issues and difficult, delayed product launches. On top of all that sits Nvidia’s childish insistence on one-upping every Radeon release with some news – cut prices on the RTX 2060, the RTX Super cards, moving the release date of the RTX 3070 to the day after AMD’s Radeon announcement at the end of this month!

All of these contribute to a myriad of issues that lead a lot of enthusiasts to pass on AMD, even in cases where their cards do make sense. Rather than trying to speak for a massive audience with a variety of reasons for the consumer choices they make, I’ll speak solely to mine.

Again, no offerings in my price class: I treat my PC hardware as a hobby and an investment, and if I can justify getting and using a higher-end part, I will. That means that when I buy graphics cards, I usually go straight to the halo product and only step down from there if the purchase stops making sense (for example, I was convinced I wanted a 3090 until I saw that it offered a ~10% performance uplift over the 3080 for roughly twice the price!). The 3080 I’ve identified as the GPU of choice for my next build runs between $700 and $900 depending on the manufacturer and out-of-the-box settings. In that price tier, AMD currently has…nothing. Their best card as of today is the RX 5700 XT, a great card that competes with the Geforce RTX 2070 but is typically beaten by the 2070 Super and above, despite most models being priced closer to the Super. This actually leads to a second problem…
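
As a quick back-of-envelope check on that call, here’s a tiny Python sketch of the perf-per-dollar math, assuming the $699 and $1,499 launch MSRPs and taking the ~10% average uplift at face value:

```python
# Rough value comparison between the RTX 3080 and RTX 3090 using launch
# MSRPs and an assumed ~10% average performance uplift for the 3090.
cards = {
    "RTX 3080": {"price": 699, "relative_perf": 1.00},
    "RTX 3090": {"price": 1499, "relative_perf": 1.10},
}

for name, c in cards.items():
    value = c["relative_perf"] / c["price"]
    print(f"{name}: {value * 1000:.2f} perf per $1,000")
```

The 3080 lands at roughly double the performance per dollar, which is why the 3090 fell out of the running for me.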

Rarely competitive in the price tier where they place their best offerings: The Radeon VII launched at the start of 2019 for $700, at a time when the Geforce RTX 2080 was the primary Nvidia card at that price. It lost to the 2080 in most benchmarks. It also typically lost to the Geforce GTX 1080 Ti from two years prior, which could be found used for less and new at right around the same price. When the RX 5700 XT launched in July 2019, it was priced to compete with the RTX 2070, a favorable comparison on most fronts. However, Nvidia simply launched their RTX Super lineup, and the 2070 Super, while more expensive at launch, took back the performance crown in that tier. There were still plenty of good reasons to buy the RX 5700 XT – it was and is a good graphics card. But it didn’t steal the show – it traded blows with the 2070 and couldn’t beat the Super, so its fate was sealed.

Driver Issues Galore: This one is always a hot button for some, but it needs to be said. If you don’t own an AMD card and look into the AMD subreddit, or most enthusiast hardware communities, you can see pages upon pages of driver complaints – everything from performance regressions to hard-lock crashes. My experience when I last owned a Radeon was mostly good, but I did hit driver conflicts and crashes more often than I have with Nvidia. The thing with any such issue is that ultimately it’s a series of anecdotes – I know people who had crashes with Nvidia, and the whole “Vista Capable” lawsuit against Microsoft over Windows Vista instability revealed that the number one cause of reported crashes was Nvidia drivers! Still, when you compare the fan communities for both products, one of them gets more regular driver releases, anecdotally reports better experiences, and generally reports fewer issues – and it’s Nvidia’s. AMD’s fanbase, rabid as it is in support and defense of the company, also reports more driver issues. That doesn’t mean an AMD card is always going to crash or that their drivers always suck, but from the outside, the experience doesn’t look ideal!

Lack of Software Features: This one is more of an Nvidia win than an AMD loss – AMD does offer some interesting and compelling software for their cards, and even has the dignity not to force you to log in and enable telemetry to use most of it! However, Nvidia simply has far more software development underway. RTX cards can use AI to clean up your webcam, keep you in frame, filter microphone noise for cleaner audio, and do background replacement and effects. Geforce cards have an outstanding video encoder with broad support from a variety of software tools (you could use Shadowplay, but that sucks since you have to log in – luckily the encoder it uses is accessible in OBS and most other streaming and capture applications). Nvidia was first to real-time hybrid rendering with raytracing, first to AI-based resolution scaling, first on desktop to variable rate shading, first to offer driver-level dynamic resolution scaling (letting an overpowered GPU render above your monitor’s resolution and scale the output down to fit), first to enhanced screenshot modes through Ansel, first to driver-level reshading filters, and first to bring auto-detected, optimized per-game settings to gamers. Depending on what and how you play, many of these may not matter, but AMD has struggled to match the breadth of features Nvidia offers.
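
On the encoder point, NVENC is exposed by common tools – here’s a minimal sketch using FFmpeg through Python, assuming an FFmpeg build with NVENC support is on your PATH and an Nvidia GPU is present (the file names are made up; AMD’s rough equivalent in FFmpeg is h264_amf):

```python
import subprocess

# Re-encode a capture with the GPU's NVENC H.264 encoder via FFmpeg.
# Assumes ffmpeg is on PATH and was built with NVENC support; the file
# names here are hypothetical.
cmd = [
    "ffmpeg", "-y",
    "-i", "gameplay_raw.mkv",  # input capture (assumed name)
    "-c:v", "h264_nvenc",      # hardware H.264 encode (AMD: h264_amf)
    "-b:v", "6M",              # 6 Mbit/s target bitrate
    "-c:a", "copy",            # pass the audio through untouched
    "gameplay_nvenc.mp4",
]
subprocess.run(cmd, check=True)
```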

To be fair on this point, AMD beat Nvidia to reduced-latency rendering, adaptive sync support, and multi-display as a single virtual display, and Nvidia still doesn’t have anything on offer for power management as good as Radeon Chill.

Awful reference cooler designs: This one is easy on the face of it (don’t buy a Radeon directly from AMD), but it carries over into board partner designs too. AMD makes the worst coolers on the planet for their GPUs, which is a shame because the boxed coolers with Ryzen CPUs are quite good. The company has an obsession with obnoxiously loud blower designs whose insufficient heatsink surface area leaves the fan running extra hard for prolonged periods. They also don’t enforce board quality specifications with partners as well as Nvidia does, which leads to things like the Asus RX 5700 designs having insufficient mounting pressure on the GPU die, causing overheating. Even then, you can’t just blame the board partners, because the reference RX 5700 XT’s cooler pressure was so bad that adding washers to the cooler screws and replacing the included thermal pad with paste would lower temperatures by a substantial margin!

Overall architectural inefficiencies: This gets at a deeper issue – Radeon technology has fallen behind since its golden years. It took until Vega for the Radeon series to get tile-based rendering, an efficient rendering technique that mobile phone GPUs have used for a long time and that Nvidia added to the Maxwell-based Geforce 900 series in 2014. AMD’s GCN architecture was designed around compute efficiency, which led to some bizarre problems where GPU occupancy suffered because the card was built as a compute card that happened to be capable of doing gaming graphics, rather than as a gaming-first design (to their credit, the move to RDNA is intended to reverse that, and the RX 5700 XT shows clear improvement from this work). Ironically, the Ampere design in the current Geforce cards may be pushing Nvidia towards this same problem, visible in how its advantage over Turing shrinks at lower resolutions in some benchmarks.
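
To illustrate what tile-based rendering actually does, here’s a minimal Python sketch of the binning step – triangles get assigned to the screen tiles their bounding boxes overlap, so each tile can later be shaded out of fast on-chip memory. The tile size and triangles are illustrative, not any vendor’s real values:

```python
# Minimal sketch of binning in a tile-based renderer: split the screen
# into tiles and record which triangles touch each one, so every tile
# can later be shaded entirely out of on-chip memory.
TILE = 32                  # tile edge in pixels (illustrative)
WIDTH, HEIGHT = 1920, 1080

def tiles_for_triangle(tri):
    """Yield (tx, ty) tiles overlapped by the triangle's bounding box."""
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    x0, x1 = max(min(xs), 0), min(max(xs), WIDTH - 1)
    y0, y1 = max(min(ys), 0), min(max(ys), HEIGHT - 1)
    for ty in range(y0 // TILE, y1 // TILE + 1):
        for tx in range(x0 // TILE, x1 // TILE + 1):
            yield (tx, ty)

# Two made-up triangles in screen space.
triangles = [
    [(100, 100), (300, 120), (180, 400)],
    [(1500, 800), (1700, 900), (1600, 1050)],
]

bins = {}
for i, tri in enumerate(triangles):
    for tile in tiles_for_triangle(tri):
        bins.setdefault(tile, []).append(i)

# A real GPU would rasterize each tile's list against its on-chip buffer;
# here we just print the per-tile work lists.
for tile, ids in sorted(bins.items()):
    print(f"tile {tile}: triangles {ids}")
```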

AMD also has a habit of implementing strong efficiency features and then leaving it to developers to write code specifically for them, instead of shipping driver-level solutions. A great example is Primitive Shaders on Vega. Designed to keep GPU occupancy high, the feature was supposed to offer a massive improvement in geometry handling. However, AMD could not get it working in the driver and found that a marquee feature they had shipped required API changes that never fully came, leaving Vega performance at a serious deficit compared to where the cards could have been with it enabled. Vega as a product had a lot of these problems in general – its tile-based rendering was bugged and not working for many people at launch, the High Bandwidth Cache feature had to be enabled manually, and these issues reduced GPU occupancy, meaning the missing performance came down to the GPU sitting around waiting for work.
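
To put a number on why occupancy matters, here’s a toy model (every figure invented purely for illustration) of how delivered frame rate scales with the fraction of time the shader array actually has work queued:

```python
# Toy model: if the front end starves the shader array, delivered frame
# rate scales with the busy fraction. All numbers are made up.
peak_fps = 100        # what the GPU could deliver if never starved
busy_fraction = 0.70  # share of frame time with work actually queued

# Each frame takes (busy time / busy_fraction) of wall-clock time, so:
delivered_fps = peak_fps * busy_fraction
print(f"{delivered_fps:.0f} FPS delivered out of {peak_fps} FPS potential")
```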

So given these issues, what would it take for me to buy an RX 6000 GPU? Well, a few things are top of mind:

Working launch drivers: Give me a solid driver at launch – that’s number one.
Leading performance per dollar with results that meet or exceed the RTX 3080: If AMD can hit an average of 95% of the RTX 3080 at something like $600, even if the tech nerd shit inside isn’t as good as it could be, that is a strong case to make (see the sketch after this list)!
Better stock cooling: The 3-fan model AMD has shown off looks nice, and provided it doesn’t have the problems the Radeon VII design (also a 3-fan cooler) had, I could support that.
Proper DXR Raytracing: This one comes down to wariness about what Nvidia has done more than fear of AMD failing at it. The RX 6000 series does have raytracing; if the games that currently support raytracing on RTX cards also support it on the Radeons, that’s a win.
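
And here’s the quick value check behind that performance-per-dollar item, with the Radeon’s price and performance as purely hypothetical wished-for figures against the 3080’s $699 MSRP:

```python
# Hypothetical Radeon at 95% of a 3080's performance for $600, compared
# against the RTX 3080 at its $699 MSRP. The Radeon figures are the
# wished-for numbers above, not announced specs.
radeon   = {"price": 600, "relative_perf": 0.95}
rtx_3080 = {"price": 699, "relative_perf": 1.00}

radeon_value = radeon["relative_perf"] / radeon["price"]
nvidia_value = rtx_3080["relative_perf"] / rtx_3080["price"]
print(f"Radeon perf per dollar: {radeon_value / nvidia_value:.0%} of the 3080's")
```

Under those assumptions the Radeon comes out around 11% ahead on perf per dollar – exactly the kind of margin that would make the case.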

Realistically, that’s it – if they deliver on these, I would absolutely get on board with a Radeon card! Right now, I’m waiting until mid-November, when EVGA is supposed to be offering a pre-waterblocked RTX 3080, and that card is my current target for my next system. If, however, AMD comes to the table with a strong offer (and I can get a waterblock for it later), then they could well win me over.

Either way, it’s an exciting time in technology and keeps getting more interesting!

2 thoughts on “Sidenote: What Would AMD Need to Win Me Over to a Radeon? and a Note on Patches”

  1. I think AMD has realized that while F1 LOOKS really lucrative and sexy, the real money’s in NASCAR, and they never really liked champagne anyway 🙂

    Radeon usually ticks all my boxes in that I don’t play those games that “need” (questionable term) the F1 graphics card, but I suspect even if I did I’d settle for something more “sensible” that would fulfill the role for the lifetime of that game and then some. I build five-year builds, and my ugly NASCAR card usually outlasts them, as long as I don’t unreasonably expect F1 performance for the entire period.

    Super flawed analogy, I know. My brief experience with an F1 of two years past kind of made me less interested in maintaining that relationship. That said, the wife got herself a nice F1 in a really good deal (current, even) and four years later it worked great in my machine when my NASCAR started flaking out (still think it was a driver issue). And now she’s using it still, so in SOME cases, the F1 also fulfills the long-term reliability role.

    I gotta stop watching car shows on TV.


  2. ICC itself leans a bit to the left if you go to the same spot in Icecrown. The original reddit image had the line in the wrong spot, as well. Someone put the image in Photoshop and used the centering (?) tool and showed that the image was as symmetric as possible. Torghast being off is understandable since we’re looking at it through the lens of the broken sky. I wouldn’t expect that to be accurate, but more like the view of something through a surface of water.

    As far as the AMD graphics cards go, I just want them to put pressure on Nvidia as they are with Intel on CPUs. I’ll personally likely stay with Nvidia cards since I have a GSync monitor.

