The life of a PC gaming enthusiast is a constant battle between saving some moolah and buying the latest and greatest graphics card. That card, from deep within its dark transistor-filled dungeon, always holds the promise of squeezing greater performance out of the system. Nvidia and AMD (which acquired ATI in 2006) are the two supergiants of the GPU world, and they compete for the coveted space in your system's casing. Both companies offer extremely powerful cards capable of handling cutting-edge innovations and data processing, providing us with jaw-dropping visual candy. And both have been at each other's throats for quite some time. Luckily, the years of friction have given us, the consumers, some drool-worthy products.
Why Nvidia?
Nvidia started its journey in 1993, but truly picked up steam after introducing the famous "GeForce" series. Since then, it has capitalized on its success and gobbled up many smaller companies. Securing contracts to develop the graphics hardware for Microsoft's Xbox console and, later, Sony's PlayStation 3 made it the leading independent GPU manufacturer. Now, though Nvidia is much more than a GPU producer, its core remains the same. It has also stepped into the smartphone and tablet market, providing processing power to a number of gadgets.
What about AMD?
AMD’s need to fuel its latest series, marketed as AMD Fusion and promising a one-die solution for both GPU and CPU, drove its merger with ATI, which was bought out in a huge 5.6-billion-US-dollar deal.
ATI was never a GPU manufacturer per se; it always headed the research and development side of the business, while third-party manufacturers mass-produced the GPUs. So AMD’s takeover didn’t cause a major overhaul of the company. ATI’s venture into gaming consoles resulted in it powering Nintendo’s GameCube and Wii and, most importantly, Microsoft’s Xbox 360.
The difference?
This is the section where I have to constantly look over my shoulder, because PC fanboys will tear this issue (or me) apart if they find this article leaning towards any particular side (deep breath!). Comparing AMD and Nvidia is relatively hard, since the two take completely different approaches. AMD’s flagship is the Radeon lineup, a successor to ATI’s Rage series. Nvidia’s jewel in the crown is the GeForce series. Though both companies offer mobility solutions as well, we will only talk about their discrete graphics cards. These cards vary in memory, clock speed, visual technologies, pixel pipelines, form factor (heat sinks), price and architecture.
Nvidia offers a cocktail of innovations in its cards: SLI (Scalable Link Interface), which allows parallel processing by cramming two or more cards together, PhysX support, PureVideo, GPUDirect, 3D Vision Surround and CUDA (Compute Unified Device Architecture, Nvidia’s parallel computing architecture) have all made a significant impact in tipping the scales. AMD, on the other hand, has its share of novelties, such as the Vision Engine, CrossFireX (its answer to Nvidia’s SLI capability), Dual Graphics, Eyefinity and HD3D technology. All of these remarkable advancements require in-depth analysis and stress tests. As a user, you will find that some of these technologies cater to your needs more than others.
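To give a concrete feel for what CUDA actually hands to developers, here is a minimal sketch of a vector-addition program, the "hello world" of GPU computing. The file name, kernel name and array size are illustrative assumptions for this sketch, not anything tied to a particular Nvidia card or driver release.

// vector_add.cu -- a minimal, illustrative CUDA sketch (names are hypothetical).
// Each GPU thread adds one pair of elements; the host copies data to the card,
// launches the kernel, and copies the result back.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) {
        c[i] = a[i] + b[i];                         // one addition per thread
    }
}

int main() {
    const int n = 1 << 20;                 // about a million elements
    size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) arrays.
    float *hA = (float *)malloc(bytes);
    float *hB = (float *)malloc(bytes);
    float *hC = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hA[i] = 1.0f; hB[i] = 2.0f; }

    // Allocate device (GPU) memory and copy the inputs over.
    float *dA, *dB, *dC;
    cudaMalloc((void **)&dA, bytes);
    cudaMalloc((void **)&dB, bytes);
    cudaMalloc((void **)&dC, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(dA, dB, dC, n);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check one value.
    cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", hC[0]);

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    free(hA); free(hB); free(hC);
    return 0;
}

Compiled with Nvidia's nvcc compiler, this launches roughly a million lightweight threads spread across the card's cores, and that kind of parallel grunt is exactly what both games and general-purpose applications tap into.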
The current lineup
Although it is not possible to declare either Nvidia or AMD the overall winner, I have dared to compare a few of their cards with respect to their pricing and ‘weight’.