by Paul Korzeniowski
Blazing performance and clever design have won NVIDIA a legion of loyal followers over the years. However, times and markets change, and the company now finds itself being squeezed out of the desktop graphics arena by a few of its own missteps coupled with aggressive moves by competitors. Never one to sit idly by, the vendor has been shifting its focus. Its followers will still be able to find the company’s screaming graphics capabilities in the future, but those capabilities will probably run on tablets and supercomputers rather than PCs and laptops.
The company has a storied history. “NVIDIA entered the GPU market a bit late, so it faced significant competition,” explained Kevin Krewell, senior analyst at The Linley Group. “But the company was quite bold and very aggressive and eventually found its niche.” Founded in 1993, the vendor set out to establish new standards in visual computing and achieved that goal. NVIDIA developed a supportive corporate culture for creative engineers, and it now holds more than 1,800 patents worldwide.
Unlike some technology companies, the firm also did a good job at marketing. “NVIDIA has always had good, aggressive messaging, stemming from president Jen-Hsun Huang’s strong personality,” stated Jon Peddie, president at Jon Peddie Research. For years, the semiconductor maker has been a leading supplier of GPUs and a favorite for desktop users who love gaming or rely on other highly graphic applications.
A Bit Too Aggressive
But its position was threatened a few years ago. Some of the wounds were self-inflicted. In 2008, a new release of the company’s GeForce GPU experienced problems. “NVIDIA tried to push the functionality envelope,” explained Sergis Mushell, principal research analyst at Gartner Inc. “As a result, its chip ran hot.” In addition, it used a lot of power, so notebook users found the system shutting down during epic World of Warcraft battles.
Eventually, NVIDIA fixed the problems and now offers its users a top-of-the-line graphics experience. To save power and reduce heat, its GPU recognizes what applications are running: high-resolution 3D graphics are invoked for gamers but shut off as users update their Facebook profiles.
However, as NVIDIA was stubbing its toe, competitors dramatically altered the design of their chips. “Recently, there has been a significant slowdown in the sales of discrete GPUs,” noted The Linley Group’s Krewell. After paying $5.4 billion for ATI in the summer of 2006, AMD decided to integrate its CPU and GPU. Intel followed suit with its chips. The new integrated processors began to ship in volume in 2010 and quickly gained traction.
A Risky Venture
There was an element of risk in the move. Combining the two products required technical ingenuity. The vendors had to change the design of their microprocessors so that all of the components fit on one slab of silicon. Also, funneling all of the functionality into a small form factor increased the likelihood of problems for users, such as overheating and shortened battery life.
But the switch offered users a number of potential benefits. Because the components’ footprint was smaller and all the elements were integrated, performance increased. Integration also enabled vendors to lower costs. Upgrades became simpler because enhancements to the GPU occurred simultaneously with those to the CPU.
To date, the benefits have outweighed the downsides. Jon Peddie Research found that NVIDIA’s market share fell 2.5% from one quarter to the next. The company now accounts for 20% of graphics chips sold for x86 devices, a significant drop from last year’s 28%, a change driven largely by rising sales of integrated CPU/GPUs.
Lacking the Proper License
Moving forward, NVIDIA will probably continue to have trouble at the desktop. Because it lacks the licensing agreement needed to build x86 CPUs, the company cannot deliver integrated CPU/GPUs for PCs and laptops. Faced with the recent market changes, NVIDIA has begun to shift its focus.
So, where is NVIDIA going, and what impact will the new direction have at the desktop? Recently, the use of GPUs in supercomputing has become more common. For instance, advances in medical imaging have enabled health care providers to turn around results, such as breast exam findings, faster.
NVIDIA has moved into the supercomputer space and has been working to make its GPUs more programmable. The company has made progress there: three of the world’s top five supercomputers now use NVIDIA GPUs, and Oak Ridge National Laboratory built a 20-petaflop computer using NVIDIA silicon.