The performance of Intel’s GMA X3100 is a question on a lot of people’s minds, at least if our forums here are any indication. And why shouldn’t people be curious? The GMA X3100 is ubiquitous in retail stores, where it’s next to impossible to find a notebook with dedicated graphics for under a grand and a half. If you’re buying Intel, odds are pretty good you’re buying a GMA X3100.
When I went to the Game Developers Conference in February of 2007, I was pleasantly surprised by what appeared to be a renewed interest in gaming from Intel. They were very excited by their GMA X3000 series, which the X3100 is a part of, and why wouldn’t they be? Here’s the first graphics core they’ve made that’s actually designed to do something other than put an image on your screen or run Aero Glass. They demonstrated a real desire to make their integrated graphics competitive with the parts from ATI and nVidia, and to raise the bar for gaming developers so that the “lowest common denominator” they had to code for would be a good one. They even took me behind closed doors and demonstrated their IGP running Half-Life 2: Episode One at medium settings at 1024×768, very playably.
So GDC has come and gone, about ten months have passed and several new drivers for the GMA X3100 have materialized. The question becomes: has Intel made good on their promise?
To test the Intel GMA X3100, I’m using the HP dv2615us outfitted with 2GB of RAM. Specifications are as follows:
- Intel Core 2 Duo T5250 (1.5 GHz, 667 MHz FSB)
- 2GB DDR2-667
- Intel GMA X3100 with latest drivers, version 15.7
- 160GB 5400rpm Hitachi SATA Hard Disk
- DVD-RAM w/ Lightscribe
- Intel Wireless 3945ABG w/ Bluetooth
- Windows Vista Home Premium 32-bit
The processor is currently one of the low men on Intel’s totem pole, but it’s nearly as common in retail as the GMA X3100 itself. It’s become a very popular baseline part, and that’s understandable: even at 1.5GHz, systemwide performance is formidable, as I demonstrated in my last article where I pitted it against a Turion 64 X2 clocked at 1.9GHz.
The GMA X3100 itself is theoretically a pretty formidable piece of technology, at least for an integrated graphics part. It is presently the only IGP to use the unified shaders specified by Microsoft for DirectX 10 support, featuring 8 scalar shading units that can process vertex, pixel, and video operations. It’s also the first part from Intel that supports hardware transform and lighting. While it doesn’t feature its own memory, it can address up to 384MB of system RAM for video memory, and it allocates that RAM dynamically to ensure maximum performance.
Presently, Intel’s most recent driver enables hardware vertex shader processing, though this can be switched to software via a roundabout and frankly tedious method. The drivers for this hardware have been constantly updated and improved over the past ten months, so we’re going to see what those improvements have yielded.
As a sidenote, I’m handling this review a little bit differently than my last two IGP reviews. Here, I’m going to focus a bit more on the hard numbers I was able to glean out of this hardware, so feedback on whether you like this approach better or worse than my previous approach is appreciated. Part of the reason I’ve chosen to do this is because, frankly, there’s a lot of conflicting information on the forums about the performance of the GMA X3100 and I wanted to be able to just deliver straight facts.
DOOM 3 v1.3
If you’ve been reading me for a while, you’re probably aware Doom 3 is one of my favorite whipping boys for graphics hardware benchmarks. Years after the fact, the very scalable (no, really, I’ve run it very playably on a Radeon 8500DV) Doom 3 can still offer a decent amount of punishment for hardware that’s not ready for it. At the same time, a lot of technology it uses has since become the norm (stencil shadows, heavy bump mapping, specular lighting).
For Doom 3, switching between software and hardware vertex shader processing yielded no difference in performance. I find that particularly interesting because, anecdotally, my theory has been that the reason nVidia’s IGPs always murder ATI’s (even the recent X1200 series) in this game is that ATI doesn’t include a hardware vertex shader like nVidia does.
All benchmarks were run with the game set to 640×480 Low Quality. As the game was completely unplayable with everything enabled, I toggled options to see how it would perform.
|Settings||Avg FPS|
|w/o Specular Lighting||9.4|
|w/o Shadows and Specular Lighting||11.9|
The game is unplayable. At certain points during the demo it ran smoothly, but the instant the environment got even the slightest bit complex, the framerate took a heinous nosedive.
Given the GMA X3100’s reputedly lousy DirectX 8 performance, it’s unlikely that even switching to one of the game’s other codepaths would produce the performance required to make the game playable.
UNREAL TOURNAMENT 2004 v3369
Unreal Tournament 2004 is my other favorite whipping boy; I employed uMark to run several benchmarks on this classic. If you can’t play Unreal Tournament 3, this is always available for dirt cheap. If you can…well, you’re probably still better off buying this.
I used uMark to benchmark the game in the Icetomb and Inferno maps with 7 bots enabled. I’ve found these to be two of the most demanding maps in the game.
Benchmarks were run with all settings at Normal, all checkboxes checked, and shadows set at “blob.” Benchmarks labeled “HS!” were run with all settings at their highest level, to the point where the game’s announcer actually says “holy s#!+” when you set the last one. (Note: this isn’t a joke, as most UT2k4 fans can attest.)
|Resolution||Icetomb FPS||Icetomb Min. FPS||Inferno FPS||Inferno Min. FPS|
|640 x 480||41||12||35||8|
|640 x 480 HS!||31||11||37||8|
|800 x 600||40||12||34||8|
|800 x 600 HS!||30||11||35||6|
|1024 x 768||35||12||29||9|
|1024 x 768 HS!||24||6||27||5|
As you can see, the GMA X3100 actually acquits itself pretty favorably in UT2004, delivering solid performance all the way up to 1024×768 at medium settings. This is a massive improvement from the GMA 950, which couldn’t scrape these numbers the last time I played with one. So bad was that chip’s performance that the game actually ran better and smoother under the software renderer.
Unfortunately, this isn’t all sunshine and roses. While the GMA X3100 offers very playable performance in UT2004, the framerate is erratic. It’s playable, but it’s not smooth. I don’t expect miracles out of integrated hardware, but after reviewing nVidia’s 7150M and the Go 6150 before it, I’ve come to know what to expect from graphics hardware. The high highs and LOW lows of the X3100 can make gameplay a pretty bumpy ride, and as you can see from the benchmarks, a 35 frame per second average doesn’t do anyone any favors when the framerate can drop as low as an unplayable 6 frames per second.
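To make that point concrete, here’s a quick sketch of how an average frame rate can hide an unplayable experience. The frame times below are made up purely for illustration (they are not pulled from the benchmark logs): two runs land at nearly the same ~35 FPS average, but one of them stalls down to roughly 6 FPS.

```python
# Illustrative only: invented frame times, not real benchmark data.

def fps_stats(frame_times_ms):
    """Average FPS (frames / total time) and minimum instantaneous FPS."""
    total_seconds = sum(frame_times_ms) / 1000.0
    avg = len(frame_times_ms) / total_seconds
    lowest = 1000.0 / max(frame_times_ms)  # slowest single frame
    return avg, lowest

smooth = [29.0] * 10           # every frame ~29 ms: a steady ~34.5 FPS
spiky = [13.2] * 9 + [166.7]   # nine fast frames, then one huge stall

avg_smooth, min_smooth = fps_stats(smooth)  # ~34.5 avg, ~34.5 min
avg_spiky, min_spiky = fps_stats(spiky)     # ~35 avg, but a ~6 FPS dip
```

Both runs report a mid-30s average, which is exactly why the minimum-FPS columns in the tables above matter more than the averages.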
FAR CRY v1.4
Far Cry is one of my favorites for benchmarking, but the flipside is that even though technically the game scales well, the visual difference is always painfully pronounced. At low settings, Far Cry can be pretty depressing to look at, and you’ll sit there wondering where all the foliage went.
For benching Far Cry, I used HardwareOC’s Far Cry benchmark v1.7 running Ubisoft’s stressful Volcano demo. The benchmark program offers image quality presets ranging from “Minimum” to “Maximum.” While clearly there isn’t a lot of granularity here, it should suffice.
Far Cry carves out another win for the GMA X3100, offering very playable performance with some minor image enhancements. All told this thing should run Far Cry at 1280×800 with medium/low details no sweat. This is probably the biggest win I’ve seen for an IGP, so at least here we’re seeing the X3100 start to live up to its potential. Far Cry fans will be very happy with the X3100.
F.E.A.R.
F.E.A.R. is another game that scales surprisingly well and is easy to test.
First, an amusing anecdote: running the game with the graphics card set at Minimum actually results in WORSE performance than running it with the graphics card set at Low. This can very likely be attributed to using DirectX 8 shaders instead of DirectX 9’s, which the GMA X3100 handles much better.
With the computer and graphics card set at “Low,” I achieved the following numbers:
|Resolution||Avg FPS||Min FPS||Max FPS|
Below 1024×768, the X3100 acquitted itself VERY well and unlike in other games, actually produced fairly smooth framerates. I’ve watched this benchmark a thousand times, and on the X3100 it always slowed down and sped up precisely as I’d expect from any AMD or nVidia part. F.E.A.R. fans are in for a treat here: the X3100 runs it very well for an IGP as long as you don’t enable Volumetric Lighting or Shadows, which historically seem to be the Achilles’ heels of IGPs everywhere.
HALF-LIFE 2: LOST COAST
And here we come to what was hands down the most aggravating, major disappointment of the entire suite of benchmarks. Half-Life 2: Lost Coast was pretty stressful when it came out, but Valve’s Source engine is in my opinion the most well-developed, scalable engine on the market. I’m not a huge fan of the games, but Source is an impressive piece of technology.
It also bears mentioning that Half-Life 2 was Intel’s bread and butter at GDC. Their big win was playable performance in Episode One, which inherited the technology pioneered in Lost Coast.
So I set everything to its lowest setting and ran the video stress test. Here’s what happened:
- At 640×480, it produced an average of 20.35 frames per second.
- At 800×600, it produced an average of 19.15 frames per second.
- At 1024×768, it produced an average of 16.39 frames per second.
This is horrendous. Lost Coast isn’t THAT much more stressful than vanilla Half-Life 2 at these settings. I’ve read people getting solid performance in Half-Life 2 on the forums, but that clearly didn’t reproduce here, where framerates were erratic at best and performance was dismal.
I’m sure by now you’ve noticed the conspicuous lack of the screenshots that usually accompany my reviews. Simply put, image quality on the X3100 is, much like the framerates often are, erratic. I’ve played very little on it where there wasn’t some kind of artifact that crept up, be it the odd polygon here, or nasty texture flicker there. About the only game I ran on it that seemed to look right was Far Cry.
And then there’s the erratic performance to discuss. While a couple of games like Far Cry and F.E.A.R. delivered as well as any competing ATI or nVidia part, the majority of my experiences with the X3100 have been disappointing. In some games, framerate slowdown appears random. In others, it goes to hell in places an ATI or nVidia part wouldn’t struggle with.
I’ve heard from multiple sources that Intel’s graphics driver team is hopelessly overworked and understaffed. We’ll just call Intel’s graphics driver team “Jim.” Jim, I know you’re trying your best, but you really need some help, and you really need to prioritize.
The forums here are quick to point out that the X3100 offers much better performance in XP. At first glance, that seems reasonable, but the more you noodle it, the more insane it sounds. Simply put, the X3100 was released AFTER Windows Vista. So logically, Vista-based notebooks with X3100s are going to radically outnumber the XP-based ones. Splitting resources between the two platforms just seems like suicide: focus should be on Vista at this point, because Vista’s what’s shipping. Does the X3100 put a picture on the screen in XP like the GMA 950 did? Fantastic! Let’s move on.
The real crime here is that, to me at least, the promise Intel made in February hasn’t been fulfilled. Performance has improved, sure, but if Intel seriously cares about competing with ATI and nVidia, they’re really going to have to kick it up a notch; they can’t just build a GPU and “hope” it works.
The forums are useless for figuring out the performance of the X3100 because it varies so wildly; it’s inconsistent and random. You can’t even throw out a general performance estimate for it, because games are constantly case-by-case. Half-Life 2, for example, should run absurdly well given the great performance Far Cry delivers, a much more stressful, much less optimized game. Yet it doesn’t.
When I buy a game in the store, I can be reasonably certain that when I run it on my desktop at home on my GeForce, it’ll work. It’ll look like it’s supposed to and it’ll run consistently. If worse comes to worst, I can usually change the driver I’m running to one that WILL work. The support structure is there. The GMA X3100 is such a crapshoot right now that it’s not even worth fighting with. I think I was actually insulted by how poor its performance in Lost Coast was.
Of course, I do have a solution for all the casual gamers out there. If you must have Intel (and at this point, that’s certainly reasonable), you’re in luck. HP and Dell both have great mainstream notebooks on the market right now at reasonable prices that you can get a dedicated graphics card in. It doesn’t have to be great, it just has to work fairly well. nVidia’s GeForce 8400M GS is all over the place right now. Before using an academic discount, you can get a dv2500t from HP with the 8400M GS for $874. From Dell, you can get it in an Inspiron 1420 for $800. And if you’re willing to go up to a 15.4″ notebook, the price for it really just tanks.
If you’re going to game even casually on your laptop, unfortunately, I don’t think you can settle for the X3100. While technically the hardware is there, it’s not worth gambling on it, hoping that eventually Intel will magically make it suck less. There just aren’t any guarantees.
Film students/filmmakers have an old joke: “fix it in post.” It’s funny because if your production starts living by this motto, by the time you get to post-production, your project is going to be borderline unsalvageable.
Apparently Intel was planning to fix it in post. Good luck with that.