by Dustin Sklavos
This is the entry to the How it Works series that I’m sure some of the more knowledgeable readers have been waiting for: graphics hardware. This is one of the most often misunderstood parts of a notebook, or even a desktop, really. It’s something that I’ve struggled with for a while, and recent models from NVIDIA and ATI (mostly or almost entirely NVIDIA at this point) have only served to confuse things further.
But before I get into things, I want to be clear about something: this is NOT going to be a technical article. Take your memory buses, pipelines, and shaders elsewhere. I’m not even going to talk about DirectX, really. Why? Because anything on the market that would require Shader Model 3 needs a dedicated graphics card anyhow to be halfway playable, so ATI’s Radeon X1250/X1270 with their halfway implementation aren’t even really relevant. And there are no games that require DirectX 10, which has largely flopped. It’s just not worth going into.
I will say that the notebook graphics market, so simple just a couple short years ago, has gotten patently stupid at this point. NVIDIA’s model vomit that we can affectionately dub the GeForce 9 series is so convoluted it has actually prevented me from updating my notebook graphics guide. Simply put, I can’t keep up with it and I can’t make heads or tails of it. What the heck is a 9650? Your guess is as good as mine. ATI, conversely, completely streamlined their model number system.
How It Works: Graphics Hardware
Fundamentally, the graphics hardware in your notebook is what puts a picture on the screen. It’s responsible for a few more things, though: handling computer games that require 3-D rendering, running Vista Aero Glass, decoding some video so the CPU doesn’t do all the work (see Part III), and driving an additional screen should you connect one to your laptop.
At its most basic, graphics hardware is comprised of two key components. First is the graphics core, or GPU, which is essentially a CPU that is highly specialized for handling 3-D rendering, video decoding, and outputting a picture. Second is the video memory, which is RAM used by the GPU to buffer images, store visual information for games, and so on.
One of the major points about graphics hardware is whether it’s dedicated or integrated, and I’ll explain what that means soon. But first, it’s important to understand the two components in greater detail.
Graphics Core or Graphics Processing Unit (GPU)
As I said before, this is basically a specialized CPU designed to handle video-related tasks. The GPU runs at its own clock speed (MHz, as described in Part I), which chiefly affects gaming performance.
So what do I mean by specialized? Well, your CPU is designed largely to be a jack of all trades. It can do just about anything, but the problem is that for some tasks, it’s just too slow to be usable. For example, the extraordinarily complex graphical effects in modern computer games will completely gum up a CPU. Yet a GPU, designed specifically to handle these effects, can perform them dramatically faster and produce a smooth, playable experience depending on the game’s settings and just how powerful the GPU itself is. Likewise, decoding high definition video is an extraordinarily hardware-intensive task that gums up all but the most powerful of modern CPUs. A GPU with proper high definition support built into it, however, can radically reduce the amount of CPU power required to play back that video.
Which brings us to an oft-ignored but increasingly vital component of the GPU: hardware support for video decoding. What this means is that the GPU transparently controls how a lot of the video on your computer is played back. Generally, no checkboxes or settings need to be changed for your GPU to handle it. In fact, a visit to the control panel of your GPU can show you just how fully-featured it really is. You can control brightness, contrast, saturation, and a wealth of other settings from there. Most GPUs include some form of de-interlacing, which can produce a cleaner, crisper video. Likewise, many of them include noise reduction, which can help smooth out grainy video. Of course, if these optimizations don’t appeal to you, they can also be disabled.
This component is becoming particularly vital with the advent of high definition video. High definition video is simply too much for most notebook CPUs to handle on their own. Even if the CPU can, it generally has to run at near full bore to play it back properly and without stutter, and that has a profoundly deleterious effect on battery life. But if the GPU is designed to handle that video, it can do most of the heavy lifting without anywhere near such a profound impact on battery life.
Of course, the most important place this aids is in watching Blu-ray. Mercifully, manufacturers will seldom sell you a notebook with a Blu-ray drive that can’t properly play back that content, and the logic here is fairly obvious: Dell and HP don’t want their customer service department getting phone calls about why "Transformers" (terrible movie that it was) is chopping and stuttering. Equipping their notebooks properly from the outset avoids this problem entirely.
I’ve spoken a whole lot about video playback, and that’s largely because gaming is the more obvious application and needs less introduction. Unfortunately, it’s not easy to just say "oh this GPU is the best for gaming." GPUs are extraordinarily complex in design – in many ways more complex than CPUs – and it’s just too difficult to tackle in a basic article like this. As with CPUs, one GPU running at 500MHz can be significantly slower or faster than another one running at the same speed. Even then, performance can depend on how much video memory the GPU has (more on this in the next section), how much bandwidth that memory has, and even the specific game. "Crysis," for example, runs pretty poorly across the board (even on top of the line desktop machines), but tends to run better on NVIDIA hardware. "Assassin’s Creed," on the other hand, can run markedly faster on ATI hardware if left unpatched (this is a rant for another time).
Video Memory

Much as your CPU requires RAM (see Part IV) to operate efficiently, the GPU requires RAM of its own. Video memory is oftentimes much more expensive than typical computer memory and is designed to run substantially faster to feed what is often a hungry graphics core. So while on a computer you’ll see DDR2 or DDR3, video hardware can have DDR2, GDDR2, GDDR3, and even GDDR4 or GDDR5, and all of these are different kinds of video memory. This is a pretty easy one, though: the vast majority of the time, "higher is better."
While your CPU may have its memory controller in the northbridge (as is the case with current Intel CPUs) or onboard (as is the case with current AMD CPUs), the GPU’s memory controller is always onboard. As mentioned in the memory article, keeping the controller onboard allows for improved performance. On graphics hardware, this can make all the difference in the world and it’s an easy way to improve performance without having to raise video memory or GPU clock speeds.
Though the GPU has a major effect on all facets of your computing experience, video memory’s impact is largely concentrated in gaming performance.
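Purely as an illustration of that "higher is better" rule of thumb – this is a toy sketch, not benchmarking code, and actual speed depends on clock rates and bus width that spec sheets don’t always advertise:

```python
# Rule-of-thumb ranking of video memory types, per the "higher is
# better" guideline above. Illustrative only: real-world bandwidth
# depends on memory clock speed and bus width, not just the type.
MEMORY_RANK = ["DDR2", "GDDR2", "GDDR3", "GDDR4", "GDDR5"]

def faster_memory(a: str, b: str) -> str:
    """Return whichever of two memory types ranks higher."""
    return max(a, b, key=MEMORY_RANK.index)

print(faster_memory("GDDR3", "DDR2"))   # GDDR3
print(faster_memory("GDDR4", "GDDR5"))  # GDDR5
```

Again, this is only the shopping heuristic from above put into code; a GDDR3 part with a wide bus can outrun a GDDR4 part with a narrow one.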
Now that we understand these two concepts, it’s time to bring them together.
Integrated and Dedicated Graphics
The key differences between integrated and dedicated graphics are where the GPU is located and whether it gets its own dedicated video memory.
An integrated graphics part builds the GPU into the northbridge, and as a result the GPU is oftentimes stripped down compared to its dedicated counterparts in order to fit alongside everything else the northbridge has to handle. Consequently, it seldom if ever has its own video memory, instead using its proximity to the computer’s main memory controller (be it in the CPU as with AMD or in the northbridge as with Intel) to "steal" some of the system memory for itself.
The major drawbacks are fairly obvious: the GPU is stripped down to begin with so at best it may have drastically reduced gaming performance and at worst it may be missing entire features (particularly having to do with video decoding). The lack of dedicated video memory forces the GPU to use system memory which is much slower than the memory typically used for graphics hardware. Worse, video traffic now also has to piggyback on the same bandwidth the rest of the system is already using.
So why would you go integrated? If you’re not planning on gaming or playing high definition content, you don’t really need dedicated graphics hardware. Because the GPU is built into the northbridge and using system memory, it adds no chips to the design of the notebook – chips that draw power on their own. While the integrated GPU will use some memory bandwidth, it typically uses such a minute amount that the performance difference in regular tasks between integrated and dedicated graphics is more or less imperceptible. It’s only when you start pushing your entire system hard (like doing video conversion and encoding) that a difference makes itself known, and even then it’s a very minimal one.
Dedicated graphics hardware, on the other hand, is separate from the rest of the system. Generally it’s either soldered into the motherboard or uses a proprietary connector (more on this later). The benefit is that the GPU’s size (and accordingly complexity) is no longer limited by the northbridge and it has its own video memory which runs at a much faster speed. As a result, gaming is typically much improved and the GPU is much less likely to be feature limited.
The flipside is that a dedicated GPU and its video memory generate their own heat and draw their own power. While ATI and NVIDIA implement measures to reduce their power consumption when they aren’t in use, it’s never going to be comparable to an integrated part.
This is also where I explain why you won’t see a high end part like a GeForce 9800M in a 12" notebook. Simply put, the GPU is too big, too complex, draws too much power, and generates too much heat. As GPUs get more powerful, they grow in size and complexity; as they grow, they draw more power and thus generate more heat.
The big question is going to be: how can I tell if it’s integrated or dedicated? The easiest check is whether the graphics hardware has the word "Intel" in the name. If it does, it’s integrated. As for the rest? If the listing specifies an exact amount of video memory (not a range), it’s dedicated. For example, a "Mobility Radeon HD 3650 512MB" is going to be dedicated.
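Those two rules of thumb are simple enough to sketch as a toy function. The product names below are just examples, and like any marketing-name heuristic this is no guarantee:

```python
import re

def looks_dedicated(gpu_name: str) -> bool:
    """Toy heuristic from the rules of thumb above: 'Intel' in the
    name means integrated; a specific memory amount (e.g. '512MB')
    suggests dedicated. Illustrative only -- product names vary."""
    if "Intel" in gpu_name:
        return False
    # A concrete memory size like "256MB" or "512MB" in the name
    # is the tell for a dedicated part.
    return bool(re.search(r"\d+\s*MB", gpu_name))

print(looks_dedicated("Mobility Radeon HD 3650 512MB"))  # True
print(looks_dedicated("Mobile Intel GMA X3100"))         # False
```

If the spec sheet lists a range of memory ("up to 358MB shared"), that’s the integrated giveaway this little check can’t see.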
Now we get to the most important part of the article…
No, You Can’t Upgrade Your Notebook Graphics Hardware
Shut up. Just stop. No, you can’t. You think you can, but you can’t. This is the single most aggravating post that keeps rematerializing in the forums here on Notebook Review: "my graphics are slow, can I upgrade them?" NO. 99.9% of the time, the answer is no, and that .1% of the time it’s yes, it’s going to be yes for someone who already knows what they’re doing and doesn’t have to ask that question.
The reason why should be obvious at this point: notebooks are designed around very specific power draw and heat tolerances. More than that, graphics hardware is often soldered into the motherboard. There just isn’t a slot for this and I honestly don’t expect there will ever be a common one. I know someone in the forums will mention MXM or AXIOM, and to them I say: no. Don’t even bother mentioning them, because you’re just opening a bigger can of worms.
MXM and AXIOM are standards designed by NVIDIA and ATI respectively to allow for at least some upgrade options for video hardware in notebooks that – and this is key – already have dedicated video hardware to begin with. These standards, however, are rare and generally appear only in notebooks from more obscure brands. They’re largely worthless, too. Since support for them is so minimal, you’re not going to find graphics hardware upgrades for your laptop at your local Best Buy the way you can with your desktop. Even specialists like NewEgg and NCIX don’t carry them. They’re almost impossible to get and invariably very expensive. This is the long way of saying "don’t waste another minute thinking about this."
What’s the bottom line here? No, you can’t upgrade your graphics hardware. You’re stuck with the hardware in the notebook when you buy it, so you’d better do your research (the forums here are great for that).
Recommendations and Conclusion
First of all, if you’re planning on playing games on your notebook, rule out integrated graphics immediately. Outside of the ATI Radeon HD 3100 and 3200 at the time of this writing, integrated graphics really aren’t adequate for gaming proper. Intel’s graphics hardware in particular has frequent compatibility problems with games, and whether it can produce playable performance in any given game is basically a crapshoot. Beyond all that, you’re just going to have to shop around and do some research, and there’s just no way around it. Notebooks in retail seldom have dedicated graphics hardware.
As for brands? Presently I’m an ATI man, but my laptop has a dedicated NVIDIA GPU in it that games all right from time to time. This is one of those situations where it’s probably healthiest to just be brand agnostic and pick whichever suits your needs best.
I am also going to openly recommend against gaming notebooks or buying a notebook specifically for gaming. In my opinion, these are a waste of money. Gaming technology tends to advance just too fast to make these machines worthwhile, and they tend to be at least twice as expensive as a comparable desktop. The dedicated PC gamer is going to want a desktop to game on, where the parts are cheaper and the options are plentiful. Notebooks are fine for the odd game or bringing to a LAN party, but buying a notebook just to game on is really silly. The most powerful notebook gaming hardware results in a machine that, frankly, is just too big to be used on your lap. Thus, it winds up being a glorified desktop anyhow.
Though I typically close these articles with a digest of points, there’s really only one I want to end with here:
- No, you can’t upgrade your notebook’s graphics hardware.
Unfortunately, my sense of responsibility requires me to distill the rest of the article proper, so here goes:
- Much like your CPU and RAM work together, a GPU and video memory are paired together.
- Your GPU can substantially improve video playback, and options exist to tweak it to your liking.
- Integrated graphics are better for battery life and heat but worse for gaming performance.
- If you plan on gaming on your laptop at all, get dedicated graphics hardware. This is identifiable by having a specific amount of video memory attached to it.
The next "How it Works" article is going to make life a lot easier for me: I’m gonna explain hard drives to you!
Thanks for reading and stay tuned!