HISTORY OF GPUS

Every PC enthusiast knows that the graphics processing unit (GPU) is one of the most important components in PC architecture today. The more savvy among us may be well informed about the role a GPU plays in gaming as well as general computing, but at XOTIC PC, we strive to keep our customers in the know about how components came to be and how they're changing as technology advances. In this article, we explore the history of graphics processing units. Our journey takes us back to a time when manufacturers were racing to deliver unrivaled GPU hardware.

HISTORY OF GRAPHICS PROCESSING UNITS

Back in 1999, NVIDIA popularized the term "GPU" as an acronym for graphics processing unit while marketing the GeForce 256, although the term had been in use for at least a decade before that. The graphics processor itself was invented years before NVIDIA launched its proprietary NV1 and, later, the video card to rule them all.

1980s: Before there was the graphics card we know today, there was little more than a video display card. IBM introduced the Monochrome Display Adapter (MDA) in 1981. The MDA card offered a single monochrome text mode that allowed high-resolution display of text and symbols at 80 x 25 characters, which was useful for drawing forms; however, it did not support graphics of any kind. One year later, Hercules Computer Technology debuted the Hercules Graphics Card (HGC), which combined IBM's text-only MDA display standard with a bitmapped graphics mode. In 1983, Intel introduced the iSBX 275 Video Graphics Controller Multimodule Board, which could display as many as eight unique colors at a resolution of 256 x 256.

Alongside the MDA, IBM also created the first color graphics card for the PC. The Color Graphics Adapter (CGA) was designed with 16 kB of video memory, two text modes, and the ability to connect to either a direct-drive CRT monitor or an NTSC-compatible television. In 1984, IBM followed with the Enhanced Graphics Adapter (EGA), which could produce 16 simultaneous colors at a screen resolution of 640 x 350 pixels. Just three years later, the EGA standard was made obsolete by IBM's Video Graphics Array (VGA), which supported all points addressable (APA) graphics modes as well as alphanumeric text modes. IBM called the VGA an "array" rather than an "adapter" because of its single-chip design. It didn't take long for clone manufacturers to start producing their own VGA-compatible cards. In 1988, ATI Technologies released the VGA Wonder as part of its Wonder series of add-in products for IBM-compatible computers.

1990s: Once IBM faded from the forefront of PC development, many companies began producing cards with higher resolutions and greater color depth. These video cards were marketed as Super VGA (SVGA) or even Ultra VGA (UVGA), but neither term corresponded to a precise standard. 3dfx Interactive introduced the Voodoo1 graphics chip in 1996, eschewing 2D graphics altogether and first gaining fame in the arcade market. This hardcore hardware helped launch the 3D revolution. In 1998, the Voodoo2 followed as one of the first video cards to support two cards working in parallel within a single PC. NVIDIA had entered the scene in 1993, but the company didn't earn its reputation until 1997, when it released the RIVA 128, the first chip to combine 3D acceleration with traditional 2D and video acceleration. The RIVA 128 did away with the quadratic texture mapping technology of the NV1 and shipped with upgraded drivers.

Then, in 1999, the term "GPU" went mainstream. NVIDIA shaped the future of modern graphics processing by debuting the GeForce 256, which it defined as a "single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second." The GeForce 256 improved on the technology of the RIVA processors, delivering a large leap in 3D gaming performance.

2000s: In 2006, NVIDIA released the GeForce 8800 GTX, with a texture fill rate of 36.8 billion texels per second. In 2009, ATI, by then a subsidiary of AMD following the 2006 acquisition, released the colossal dual-GPU Radeon HD 5970. At the dawn of consumer virtual reality, NVIDIA developed the GeForce GTX Titan, which has set the pace for graphics technology ever since. NVIDIA sees multi-chip GPU architecture as the future of graphics processing, but the possibilities are endless.