The Five Best AMD GPUs of All Time

The Radeon brand was established by ATI Technologies back in 2000, and it has endured through numerous architectural overhauls, paradigm shifts, and even AMD's acquisition of ATI in 2006. The history of Radeon graphics cards is fairly messy, as rival Nvidia GPUs have almost always put up a tough fight. AMD has been the underdog for most of Radeon's 23-year history, though its offerings routinely rank among the best graphics cards.

But being the underdog makes winning that much more impressive, and we're not looking at the here and now. Instead, we're counting down AMD's five best graphics cards ever (in our opinion), starting from the bottom and ending at the best and most history-defining Radeon GPU ever made. While we've listed individual GPUs, we're also considering each architecture's whole family, and then picking the best representative from each.

5. Radeon RX 480 8GB

(Image credit: Future)

2016 was a surprisingly positive year for AMD, considering how rough the previous two had been. AMD was quite far removed from its last great GPU, the R9 290X, which offered excellent performance at a competitive price. However, Nvidia struck back hard in 2014 with its Maxwell-based GTX 900-series, and AMD just couldn't catch up. The Radeon 300-series cards were just 200-series cards with a new name, and the Fury-series cards were failed flagships.

There was an attitude of reform in the air (especially after AMD just barely staved off bankruptcy), and in 2015 AMD decided to create the Radeon Technologies Group, a more autonomous graphics division led by Raja Koduri. RTG's first graphics card ended up being the RX 480, based on the Polaris architecture, which had been in development before the reorganization.
This graphics card was not a flagship; instead, it was a cost-effective midrange card with a balanced blend of price, performance, and power.

A midrange graphics card is usually not very exciting, but the RX 480 was different. AMD got PC gamers excited with its "The Uprising" marketing campaign, also known as the "Radeon Rebellion." AMD hired marketing agency Brand & Deliver to make ads with slogans like "VR isn't just for the 1%," "the gaming revolution will be streamed," and even "don't silence us, silence the GPU." It was very indicative of the direction RTG and Koduri wanted to take Radeon: making it the brand for gamers who wanted high-end graphics cards but couldn't afford them.

(Image credit: AMD)

When the RX 480 launched, there were two models: one with 4GB of VRAM at $200 and one with 8GB for $250. The 8GB model was clearly the better choice, as it had double the memory (and more bandwidth) for just $50 extra. However, on launch day it wasn't clear that the RX 480 8GB was much of a revolution at all. It was about as fast as the existing Radeon R9 390 and GeForce GTX 970, both of which cost just $280. Then the much more efficient and slightly faster GTX 1060 6GB launched a month later, raining on AMD's parade.

But in the long run, the RX 480 8GB (the 4GB model wasn't nearly as popular) aged quite well. Although initially slower than the 1060 6GB by a decent margin, a series of driver updates pushed performance closer to the 1060 6GB. By the time the RX 580 (a refreshed 480) came out in 2017, the gap had about closed.
As an extension of the 480 with higher performance for the same price, the 580 was also quite good.

Although the RX 480 didn't live up to its strange, communist-themed marketing campaign, it had a very long life. It only recently stopped receiving regular driver updates, a run of over seven years. But in those seven years, none of the talk about VR really came to fruition, as VR is still largely restricted to the high end. The AMD reference design also had issues, drawing well over the specified 75W from the PCIe x16 slot; custom cards were considerably better. Nevertheless, the card is still remembered fondly by much of the PC community even today, and it enjoys a positive reputation.

4. Radeon RX 6800 XT

(Image credit: AMD)

AMD has always had a very hard time making flagship GPUs, sometimes skipping them entirely. ATI made nearly as many flagship Radeon cards as AMD has, and ATI was only on its own for six years, whereas AMD has owned the brand for 17. In 2020, it had been three years since AMD had made a flagship gaming GPU (the Radeon VII doesn't really count), and it was finally time for a comeback.

After the failure of RX Vega in 2017, AMD decided to take its time with its next flagship. First, AMD laid the groundwork by jumping to TSMC's 7nm node and introducing a new RDNA graphics architecture in 2019 with the RX 5000-series, which topped out with the upper-midrange RX 5700 XT, based on the Navi 10 GPU.

The combination of the 7nm node and the more gaming-focused RDNA 1 architecture put AMD on good footing, and AMD focused next on creating what was essentially a larger Navi 10 with a newer architecture, RDNA 2. That GPU was codenamed Navi 21, aka Big Navi, and it was AMD's first flagship in over three years when it launched in 2020.
Although the top-end flagship was the RX 6900 XT (and later the RX 6950 XT), it was the much more affordable RX 6800 XT that really mattered.

(Image credit: AMD)

The 6800 XT was a different beast from any AMD graphics card before it. It could hit well over 2GHz out of the box, far higher than Nvidia GPUs of the time, which typically topped out just below 2GHz. It also came with 16GB of GDDR6 memory and a ton of graphics cores. Perhaps most importantly, AMD's Infinity Cache proved incredibly useful, packing 128MB of L3 cache onto the GPU and allowing the 256-bit bus to behave more like a 384-bit interface in terms of effective bandwidth.

The RX 6800 XT's main competition was Nvidia's GeForce RTX 3080 10GB, and the RX 6800 XT was roughly as fast while using less power, and it cost $650 instead of $700. AMD did add support for ray tracing with RDNA 2, though it was clearly more focused on rasterization performance. It was Radeon's Ryzen moment... sort of.

The problem with any GPU in 2020 (and 2021, and even into early 2022) was that you simply couldn't buy them, or if you could, they'd often cost twice as much as they should. It was technologically impressive that AMD had finally caught up to Nvidia in nearly every way, but it didn't matter all that much when nobody could buy the 6800 XT anyway.

The GPU shortage crisis didn't last forever, though. It ended by mid-2022, with the death of Ethereum mining, and prices began a steady decline. Eventually, RTX 30-series GPU prices leveled out roughly $50 to $100 above MSRP, but RX 6000-series cards just kept getting cheaper.
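As an aside, the Infinity Cache claim above (a 256-bit bus acting like a 384-bit one) can be sanity-checked with a back-of-the-envelope model. This is a simplified sketch, not AMD's actual methodology: it assumes cache hits are effectively free and every miss goes to GDDR6 at the raw bus rate, and the one-third hit rate shown is hypothetical, chosen only to illustrate the 1.5x amplification.

```python
# Idealized effective-bandwidth model for an on-die cache.
# Assumption: hits are served "for free" and misses go to VRAM,
# so amplification = 1 / (1 - hit_rate).

def effective_bandwidth(raw_gb_s: float, hit_rate: float) -> float:
    """Effective memory bandwidth in GB/s given a cache hit rate."""
    return raw_gb_s / (1.0 - hit_rate)

# RX 6800 XT: 256-bit bus x 16 Gbps GDDR6 = 512 GB/s raw.
raw = 256 / 8 * 16  # GB/s

# A hypothetical one-third hit rate already makes 512 GB/s behave like
# a 384-bit-class (768 GB/s) interface.
print(effective_bandwidth(raw, 1 / 3))  # ~768 GB/s
```

In this toy model, even a modest hit rate closes the gap to a wider bus, which is the basic idea behind trading bus width for a large last-level cache.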
They hit MSRP, and then they dropped well below it. Today, the cheapest RX 6800 XT cards cost around $480, while the cheapest RTX 3080 10GB GPUs only ever hit around $700 (a few briefly hit $600, but that was very short-lived). Although the 6800 XT was behind in ray tracing and upscaling tech, it undoubtedly had a big advantage in bang for the buck. Even the new RX 7800 XT is more of a lateral move from the 6800 XT than a major improvement, albeit with better power efficiency.

3. Radeon HD 7970

(Image credit: Amazon)

In 2010, AMD nearly overtook Nvidia in GPU market share, a record the company hasn't come close to matching since. AMD's momentary success was largely down to the failure of Nvidia's Fermi-based GTX 400-series cards, which launched in 2010, and as the decade wore on, AMD had to deal with a revitalized Nvidia. AMD wasn't exactly doing well financially, so this could become a big problem fast.

Historically, AMD (and ATI before it) had always been able to rely on getting to cutting-edge nodes sooner than Nvidia. This strategy didn't always work perfectly, but it at least gave Radeon GPUs a good advantage. AMD was moving to 28nm after its 40nm HD 6000-series, but Nvidia's upcoming GTX 600-series would also be on 28nm. That was one advantage AMD no longer had.

But it wasn't all doom and gloom for Radeon. AMD was also revamping its graphics architecture, swapping out Terascale for Graphics Core Next (GCN). GCN debuted in the HD 7000-series, led by the flagship HD 7970 in early 2012. It was significantly faster than the previous-generation flagships, the HD 6970 and GTX 580, which was good, though that extra performance did come at a price.

Unfortunately for AMD, Nvidia's GTX 680 was even better.
It was slightly faster, cheaper, more efficient, and smaller. Technologically and economically, AMD was beaten on the four most important points. AMD's moment in the sun was over, and Nvidia was right back in the first place it had held with the GTX 200-series.

(Image credit: Future)

Except AMD didn't take its loss lying down. Sure, there was nothing it could do about power efficiency or die size without a radical redesign, but the HD 7970 wasn't that far behind the GTX 680. If AMD could just make a 7970 with a higher clock speed, it could at least claim to have one of the world's fastest GPUs.

So, in mid-2012, AMD launched the HD 7970 GHz Edition, featuring a 1GHz base clock, 75MHz higher than the original. While the 7970 GHz did indeed close the gap with the 680 (and maybe even exceeded it), much of the performance gain came from driver updates that also applied to the original 7970. Perhaps if AMD hadn't rushed the HD 7000-series so much, the GTX 680 wouldn't have been such a big problem.

In the end, it was actually the original 7970 that proved to be the most appealing high-end card: it was much faster than it had been at launch, it could easily hit 1GHz with an overclock, and it was much cheaper than both the 7970 GHz Edition and the 680. Although 2012 was messy for AMD, it did end up winning the generation overall, if only by a hair.

2. Radeon HD 5870

(Image credit: Future)

In 2006, AMD acquired ATI, the company behind Radeon. AMD sought to marry its Athlon CPUs with Radeon GPUs, but that would take a while. Of course, ATI was in the middle of developing more desktop graphics cards, and now AMD had a say in the matter. AMD decided to price its 2007 HD 3000-series quite aggressively, especially the flagship HD 3870.
This would herald a change in direction for AMD and ATI.

Starting with the HD 4000-series, AMD pursued its "small die strategy." AMD believed it could nearly match Nvidia's big flagship GPUs with smaller GPUs (big and small referring to the size of the silicon die), which would save lots of money in both development and manufacturing. Those savings could be used to make small-die GPUs much cheaper than Nvidia's flagships, allowing AMD to take plenty of market share and break Nvidia's grip on the market.

The HD 4000-series in 2008 was largely a proof of concept, as the HD 4870 couldn't quite catch the GTX 280, but it came awfully close. The HD 5000-series in 2009 was a better realization of AMD's new approach to gaming GPUs, with the HD 5870 leading as the flagship. The 5870 easily beat Nvidia's GTX 285, but that was just a refreshed 280. Surely, Nvidia's next flagship would bring AMD to its knees.

Except Nvidia gave the world the GTX 400-series, powered by the Fermi architecture, widely regarded as Nvidia's worst disaster ever. The GTX 480 was faster than the HD 5870 in most games, but it consumed an incredible amount of power for the time and cost $500, versus the 5870's $350 price tag. This was the small die strategy in action, going exactly according to AMD's plan.

The HD 5000-series was so efficient that AMD was even able to make a dual-GPU graphics card, the HD 5970, which essentially packed two 5870s. Of course, this card relied on CrossFire, which was never all that reliable, but it was still powerful when it did work, and it didn't require a nuclear power plant to operate.

Thanks to the small die strategy, AMD also nearly overtook Nvidia in market share for the first time ever.
According to Jon Peddie Research, AMD hit an all-time high of 44.5% GPU market share in Q2 2010, months after the launch of the HD 5000-series. AMD never got any further than that, but for a brief time it was remarkably close to achieving majority market share.

Unfortunately, there was one part of the small die strategy that didn't really work: money. Although AMD had gained lots of market share and sold lots of GPUs, it wasn't really turning a profit. Nvidia, on the other hand, was raking in cash even with the infamous GTX 400-series, and that forced AMD to abandon its aggressive pricing. The HD 5870 may be the best bang-for-buck flagship GPU there ever was, and we'll probably never see anything like it again.

1. Radeon 9700 Pro

(Image credit: VGA Museum)

While the Radeon 9700 Pro is undoubtedly the best Radeon GPU ever made, technically it isn't AMD's best, since it was made before the company acquired ATI. Still, not mentioning this legendary GPU would be wrong, because it is arguably the ancestor of modern flagship gaming GPUs.

Throughout the late '90s and early 2000s, the field of companies making desktop gaming GPUs sharply declined. This was largely because Nvidia was so successful, and the company's GeForce 256, with hardware-accelerated transform and lighting, highlighted this. ATI was the only company able to stand up to Nvidia, though its brand-new Radeon 7000 and 8000 cards weren't exactly strong competition.

ATI decided to do something radical and completely change the game. Most high-end graphics cards back then used a graphics processor roughly 100mm2 to 140mm2 in size. Instead of making a normally sized GPU, ATI planned out the biggest graphics chip the world had ever seen: R300.
Coming in at 215mm2, it was over twice the size of the company's previous R200-based Radeon 8000 GPUs, and 50% larger than Nvidia's GeForce4 Ti while containing 75% more transistors.

Such a massive disparity in size also meant an equally large disparity in raw horsepower. The 2002 contest between the flagship Radeon 9700 Pro and Nvidia's puny GeForce4 Ti 4600 was always going to be a massacre. The 9700 Pro achieved what is quite possibly the most complete victory a flagship GPU has ever scored against another bona fide flagship, beating the Ti 4600 in virtually everything by a large margin. The 9700 Pro, though only remembered by PC veterans and historians these days, is synonymous with unbeatable performance.

But beyond achieving an incredible victory, ATI had proven that big GPUs were the way forward for flagships, and there was room to grow. Nvidia started making 200mm2 GPUs in 2003, and then both companies were making 300mm2 chips in 2004. By 2007, Nvidia's flagship GeForce cards were nearly 500mm2, which is still quite large by today's standards.

Although Nvidia claims to have invented the first GPU with the GeForce 256 (a very dubious claim), ATI arguably launched the first recognizably modern high-end gaming graphics card. The opportunity to launch such a product comes along extremely rarely, and it's an achievement Radeon (and AMD too, sort of) gets to claim for itself.