GPU prices could spike again as rumors indicate AMD wants to prioritize AI – what could that mean for gamers?

AMD may nix the Radeon RX 8800 and 8900 so it can join Nvidia in fuelling the AI rush


AMD may scrap the high-end options of its next generation of Radeon gaming GPUs to divert scarce resources into building GPUs for AI and high-performance computing (HPC) instead – a segment that’s undergoing something of a boom.

When AMD launches its RDNA 4 family of GPUs, possibly next year, there won’t be an AMD Radeon RX 8800 or 8900, according to TechSpot. This would give its rival Nvidia a clear run at manufacturing the best GPUs for the high-end gaming market, but could also constrain supply and send prices spiking.

The line-up will resemble the RDNA 1 family of AMD GPUs, according to sources speaking to the publication, where the most powerful entry was the RX 5700 XT. Subsequent generations included higher-end models such as the 6800, 6900, and 6950 in RDNA 2, and the 7800 and 7900 in last year’s RDNA 3 series.

AMD wants in on the AI boom

The rationale is simple. There’s a rush for hardware and components to service generative AI workloads – alongside a limited supply of resources and manufacturing capacity – and AMD wants to get in on the action.

Indeed, this is a segment in which there’s currently a shortage, with chipmaking giant TSMC lacking the capacity to ramp up production for vendors like Nvidia to meet industry demand.

Nvidia’s A100 and H100 chips currently lead the way in an AI server market that’s reportedly set to surge to $150 billion by 2027, and AMD is hoping to claim a share of it. The appeal is clear: Nvidia is enjoying profit margins of 823% on its H100 GPUs, according to sister site Tom’s Hardware.

Rather than diverting semiconductors into its high-end consumer GPUs, the firm will focus on field-programmable gate arrays (FPGAs) and general-purpose graphics processing units (GPGPUs), according to Brits and Chips - Eng.


The circuitry of the former is highly suited to machine learning and deep learning, while the latter are GPUs that also handle computational workloads normally undertaken by the best CPUs. Both are well placed to meet the rising demand for GPUs for AI.

Constrained supply, however, could mean a return to the GPU shortages and price spikes we last saw in 2020. With fewer options, gamers may find themselves paying over the odds when building PCs, for example.

