When you look at some of the best graphics cards of today, it’s easy to forget that Nvidia and AMD (and more recently, Intel) weren’t always the only players in the GPU game. While both AMD and Nvidia have committed their fair share of GPU blunders, they’re not the only two brands behind some of the worst GPUs of all time.
Let’s take a look at some of the graphics cards that will make you appreciate the current GPU landscape — and yes, that even includes current cards that are borderline mistakes. (Hello, RTX 4060 Ti.) Here are the GPUs that got it terribly, terribly wrong, even though each had something interesting or innovative to bring to the table.
We’re focused on the past here, mainly around brands that have faded from the limelight. Make sure to check out our other roundups for more modern examples:
Arc Alchemist wasn’t Intel’s first venture into discrete GPUs, and neither was the DG1. Long before either of those projects came the Intel i740, and it’s a GPU that makes all of Intel’s other attempts look just that much better.
In the mid to late 1990s, Intel jumped on the 3D graphics acceleration bandwagon. The burgeoning PC gaming market, with titles like Doom, Quake, and Tomb Raider, was really starting to bring into focus just how much 3D graphics would matter in the future. Perhaps this was what tempted Intel to stray out of its primary domain — which, even back then, was making some of the best processors — and try to make discrete GPUs.
The Intel740, also known as i740, was released in early 1998. It was a 350nm GPU that relied on the now long-forgotten AGP interface, which looked promising compared to PCI (mind the difference — not PCIe) back in those days. In fact, it was one of the first GPUs to utilize AGP, which later proved to play a part in its downfall.
It clocked at a modest 66MHz and had 2-8MB of VRAM across a 64-bit bus. Those specs sound laughable by today’s standards, and even back then, they were a bit off the mark. The amount of VRAM was lower than what some of Intel’s competitors offered, but the AGP interface was meant to compensate: instead of keeping textures in local VRAM, the i740 stored them in system memory and fetched them over AGP. Unfortunately, this only ate into main RAM and tied up the processor, dragging down CPU performance — and the GPU’s own performance suffered from the same convoluted process.
Despite a lot of hype, the Intel740 fell flat. Although it was meant to be a solution for rendering 3D graphics, it often failed to handle them well, delivering artifacts and low visual clarity instead. Rumors of its poor performance spread quickly. Although Intel mostly targeted pre-built PC manufacturers (OEMs) with this GPU, it was soon forgotten, as gaming enthusiasts knew to stay away from the i740.
The graphics market was very volatile back then and evolved rapidly, so a flop like that must have been a real setback for Intel. However, after a couple more attempts at discrete GPUs, it switched to integrated graphics, where it found success in the years to come.
Before we settled into the current landscape of AMD, Nvidia, and Intel, the GPU market had a few more names vying for attention. One such company was S3, which rose to fame very quickly in the early-to-mid 1990s. Much like Intel, S3 capitalized on the 3D graphics boom and designed graphics chips that offered 3D acceleration. In the end, the S3 ViRGE became known as a “3D decelerator,” and is now remembered as one of the worst GPUs of all time.
Upon launch, the S3 ViRGE was marketed as the “world’s first integrated 3D graphics accelerator.” It was, indeed, one of the first such chipsets designed for the mainstream market. It supported around 2MB to 4MB of memory across a 64-bit bus and had a core clock of 55MHz. It could render both 2D and 3D graphics and offered resolutions of up to 800 x 600 in 3D. While it did a decent enough job in 2D, it failed to impress in 3D — and 3D was the chip’s entire purpose and marketing pitch.
When faced with relatively simple 3D rendering, the S3 ViRGE was actually a little faster than the best CPU-based solutions of the time. However, when it came to the increasingly complex rendering required by 3D games, including tasks like bilinear filtering, the GPU actually proved slower than software rendering (which essentially meant using the CPU for graphics). This is what earned it the mocking title of “world’s first 3D decelerator” — users preferred to turn off the 3D acceleration and just let the CPU handle rendering instead.
Word of the chip’s poor 3D performance quickly got around, and the gaming market’s rapid shift from 2D to 3D didn’t help. S3 attempted to fix what went wrong with later chips, such as the ViRGE/DX and the ViRGE/GX, but it faced fierce competitors in Nvidia, ATI (later AMD), and 3dfx. Ultimately, S3 couldn’t compete in the growing 3D market, although it kept making chips for the midrange segment.
Meet the GeForce FX 5800 Ultra — the first (and only?) GPU that Nvidia made a spoof video about. Yes, Nvidia itself made a two-minute video mocking this GPU, though only after the card had shipped and become known as the “Dustbuster” of graphics cards.
Nvidia had big plans for the FX series. It was meant to be a big leap into the DirectX 9 era, which was a significant transition for PC gaming. This GPU arrived at a time when Nvidia was already the market leader, although ATI Technologies was close behind with its Radeon line. Nvidia’s stumble with the FX series was an unexpected setback, but as we now know, ATI’s (and later AMD’s) moment on top didn’t last long, and Nvidia now controls the majority of the market, perhaps to the detriment of PC gamers.
The FX 5800 Ultra was manufactured on a 130nm process with a 500MHz core clock and 128MB of GDDR2 memory running at 500MHz (1GHz effective) across a 128-bit interface. Nvidia decked it out with the CineFX architecture to enhance cinematic rendering and built it with the goal of making it efficient at DirectX 9 shader processing.
On paper, it sounded great. In reality, it was decidedly not. It did well enough in DirectX 8 games but struggled with certain DX9 titles, and ATI’s Radeon 9700 Pro was an enticing alternative that didn’t have the same issues. However, the main problem with the FX 5800 Ultra was the noise.
Nvidia implemented an innovative cooling solution on this GPU called FX Flow. It was meant to keep the GPU, which ran hot by nature, at a comfortable temperature even during heavy gaming. However, the tiny fan that powered the contraption had to spin at a very high speed to keep up. The result was some of the loudest noise a consumer GPU has ever been known to produce.
Nvidia didn’t stick to this cooling model for long. Most of its partners reverted to traditional cooling methods for the FX 5900 XT and 5900 Ultra, and we haven’t seen anything like it ever since.
3dfx was once a formidable rival to Nvidia and ATI. It rose to fame in the mid-1990s, and like several other GPU makers of that era, it rode the wave of 3D graphics until it crashed and burned; Nvidia eventually bought most of its assets in 2000. While the company’s decline can’t be attributed to a single card, it had some interesting solutions that failed in the mainstream market, and the 3dfx Voodoo Rush is perhaps one of the most recognized examples.
The Voodoo Rush chipset was a follow-up to the company’s initial product, the Voodoo1. It integrated 2D and 3D acceleration onto a single card, pairing the 3D capabilities of Voodoo Graphics with a 2D core sourced from other manufacturers. That’s right — this was a dual-chip configuration.
The GPU served up 6MB of EDO DRAM, a maximum core clock speed of around 50MHz, and support for the Glide API, Direct3D, and OpenGL, as well as a maximum resolution of 800 x 600 in 3D applications. It sounded promising on paper, but once the actual product shipped and people could test it, several problems showed through the hype.
For one, it was a large card with a tendency to run hot, but the main issue lay in the architecture, where the 2D and 3D chips had to share resources — all of which added up to performance that was often worse than the original Voodoo1 in 3D games. Compatibility problems and visual artifacts weren’t uncommon, and once those issues came to light, reviewers and users alike turned their backs on this GPU.
The poor reception of the Voodoo Rush wasn’t what ultimately sealed 3dfx’s fate, though. The company went on to produce more GPUs, including the (also controversial) Voodoo 5 6000, which came with its own external power adapter. Let’s hope Nvidia doesn’t get a similar idea for one of its next-gen behemoth flagships, because the result was pretty funny to look at.