By: Dustin Sklavos
With the pending release of Windows 7 comes DirectX 11, a new iteration of Microsoft’s programming interface for 3D games. DirectX 10 debuted with Windows Vista to a stunning “meh,” and has since then proven itself to be…meh. Will DirectX 11 redeem the franchise, or just cement the DirectX reputation as mere Microsoft gaming vaporware? We break it down in this preview.
WHERE DIRECTX WENT WRONG
DirectX 10 releases since Windows Vista have been largely uninspiring, the only real success story being Far Cry 2, a game with a decidedly mixed reputation. Crysis generally runs better in DirectX 9 mode with very little trade-off in image quality, and it seems like only hardware review sites actually run it under DirectX 10.
Simply put, DirectX 10 flopped because it was tied to Windows Vista, which has never escaped the pall cast by its horrible launch. Nearly every showcase for “this is why DirectX 10 is better” has been a failure, a problem only exacerbated by game releases that were artificially tied to the API. Even now, Capcom has come out and said the only reason Resident Evil 5 has a DirectX 10 codepath is Nvidia’s 3D Vision; Capcom admits there’s no difference in visual quality between DirectX 9 and DirectX 10 in the game. That’s what Microsoft has led us to expect from its “next generation” graphics engine.
WHY SHOULD YOU CARE ABOUT DIRECTX 11?
Well, surprisingly, DirectX 11 brings features to the table worth getting excited about, and it’s entering an environment that’s far less hostile than that faced by DirectX 10. While buzz leading up to the release of Windows Vista and the fallout thereafter pretty much buried DirectX 10, Windows 7 is getting great buzz. Better still, DirectX 11 is offering more tangible reasons to support it and be excited about it, and has a better slate of titles coming up for it. Unlike DirectX 10, DirectX 11 also doesn’t make a clean break with the previous generation, and will be available for Windows Vista as well.
The one good thing DirectX 10 did was push for unified shaders in graphics hardware. Now let’s see what DirectX 11 is going to do with them and the rest of the chip.
Probably the feature I’m personally most excited about is hardware tessellation. It’s built into DirectX 11-class hardware (like the recently released Radeon HD 5800 series) and allows for substantially improved image quality in a very tangible way, without a massive performance hit.
So what the heck does it do? 3D models in games are fundamentally constructed out of triangles, and tessellation subdivides those basic triangles into many smaller ones, giving the appearance of a much more complex surface without the massive modeling effort (and memory) required to author that detail by hand.
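To make the idea concrete, here is a toy CPU-side sketch of the subdivision step: each triangle is split into four smaller triangles at its edge midpoints, so the triangle count multiplies by four per pass. DirectX 11 performs this on the GPU in dedicated pipeline stages; everything below (function names included) is illustrative, not Direct3D API code.

```python
# Toy illustration of tessellation: split each triangle into four smaller
# triangles at its edge midpoints. Real DX11 hardware does this on-chip;
# this sketch only shows why triangle counts (and surface detail) grow fast.

def midpoint(a, b):
    """Midpoint of two 3D vertices."""
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def subdivide(triangle):
    """Split one triangle (a tuple of 3 vertices) into 4 smaller triangles."""
    a, b, c = triangle
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(triangles, levels):
    """Apply `levels` rounds of subdivision; count multiplies by 4 each round."""
    for _ in range(levels):
        triangles = [small for tri in triangles for small in subdivide(tri)]
    return triangles

base = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
print(len(tessellate(base, 3)))  # 1 triangle becomes 64 after 3 levels
```

A real implementation would also displace the new vertices (via a displacement map or curved-surface math) so the extra triangles add genuine shape, not just density.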
AMD released a slide with their Radeon HD 5800 series promotional material that illustrates tessellation in the upcoming (and, at least in the eyes of this writer, hotly anticipated) Aliens vs. Predator game. You’ll notice how the shape of the alien model becomes much more complex, and more true to the mythos, as a result.
AMD (ATI) has been pushing tessellation in its hardware for longer than you might remember, and by working with Microsoft has finally had it codified in DirectX 11. This stands to be a massive improvement in image quality.
Thanks to the proliferation of multi-core processors, DirectX 11 can now make true use of multi-threading. Instead of burying all of the DirectX calls in a single thread and hogging a single core, DirectX 11 can properly split rendering work across the processor’s cores. Modern games have been surprisingly CPU-dependent, so this ability to take fuller advantage of modern processors can in turn help the graphics hardware perform.
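The shape of that threading model can be sketched like this: worker threads each record their own list of rendering commands in parallel, and one thread submits the finished lists in order. (In actual Direct3D 11 this maps to deferred contexts and command lists; the Python below is purely a conceptual stand-in, with made-up names.)

```python
# Conceptual sketch of DX11-style multi-threaded rendering: several threads
# record "command lists" in parallel, and submission stays single-threaded.
# None of these names correspond to real Direct3D calls.

from concurrent.futures import ThreadPoolExecutor

def record_commands(scene_chunk):
    """Pretend to record draw calls for one slice of the scene."""
    return [f"draw({obj})" for obj in scene_chunk]

scene = [f"object_{i}" for i in range(8)]
chunks = [scene[i::4] for i in range(4)]  # split the work across 4 "cores"

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_commands, chunks))  # parallel recording

# Final submission happens on one thread, in a fixed order.
submitted = [cmd for command_list in command_lists for cmd in command_list]
print(len(submitted))  # 8
```

The win is that the expensive part, building the commands, scales with core count, while the part that must stay serialized (submission) stays small.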
Notebook users should take particular note of this improvement, as mobile quad-core processors are still very power-hungry and fairly rare. Current and upcoming performance mobile graphics parts are hitting performance ceilings with mainstream dual-core processors, so any improvement here is going to be an important one.
Also inherited from DirectX 10.1 is mandatory anti-aliasing support. DirectX 10.1 games remain very uncommon and the feature set only works on ATI’s Radeon HD 3000 series onward, but DirectX 10.1 has consistently provided substantial performance improvements with anti-aliasing enabled (see the unpatched Assassin’s Creed or Tom Clancy’s H.A.W.X.).
And, of course, the unified shader hardware required by DirectX 10 remains a major change in how graphics hardware is designed, and an improvement as well: GPUs are much more flexible and better able to utilize their existing resources. These unified shaders are also highly programmable, which leads to…
The switch to unified shaders resulted in graphics hardware that’s surprisingly programmable and flexible in general, to the point where it can actually be used to accelerate applications other than games. Nvidia’s been pushing this particularly hard with its CUDA technology, but CUDA only works on Nvidia hardware. DirectCompute is Microsoft’s answer in DirectX 11, and provides a vendor-independent means of harnessing the wealth of processing power in modern graphics hardware to handle other tasks.
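The programming model DirectCompute exposes can be pictured like this: a small “kernel” function runs independently on every element of a buffer, which is what lets the work spread across hundreds of shader units. In real DirectCompute you would compile an HLSL compute shader and dispatch it; the sketch below just mimics that data-parallel shape on the CPU, and every name in it is illustrative.

```python
# Conceptual sketch of the GPGPU model behind DirectCompute: one per-element
# kernel, applied independently across a whole buffer. A thread pool stands
# in for the GPU's shader units here; this is not the DirectCompute API.

from multiprocessing.dummy import Pool  # thread pool, not real GPU dispatch

def kernel(x):
    """Per-element work: a trivial brightness-style transform, clamped to 255."""
    return min(255, int(x * 1.5))

input_buffer = [10, 100, 200, 255]

with Pool(4) as pool:
    output_buffer = pool.map(kernel, input_buffer)  # "dispatch" over elements

print(output_buffer)  # [15, 150, 255, 255]
```

Because each element is independent, the same code scales from 4 values to 4 million, which is exactly the kind of workload (video frames, physics particles) GPUs chew through.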
Of course, this advance is probably the hardest one to see at present, but the fact of the matter is that both AMD and Nvidia are pushing general computing onto the GPU. It’s easy to understand why: Graphics chips are extremely powerful, complex pieces of silicon capable of vastly improving the speed at which certain tasks are performed.
The first big push has been video encoding, and while the road has been a bumpy one, it’s bearing fruit. The Badaboom video encoder, for example, gets massive performance improvements by running on a GeForce’s shaders. Likewise, Nvidia’s push has brought PhysX physics acceleration to those shaders as well.
With DirectCompute finally standardizing GPGPU (General Purpose computing on Graphics Processing Units) on the Windows platform (alongside OpenCL on Macs, Windows, Linux and so on), we’ll see this trend push forward and become more common. I’m hesitant to dismiss it as a dead end, because the processing power is there, waiting to be harnessed. AMD’s pushing it hard. Nvidia’s pushing it hard. And Intel’s upcoming Larrabee graphics processor is designed for exactly that level of flexibility.
An important addendum to make here is that DirectCompute may be coming with DirectX 11, but it’s compatible with DirectX 10 hardware as well. Your existing graphics hardware may yet have some additional uses in the future.
In the end, DirectX 11 looks like it’s going to start keeping the promises that DirectX 10 made. Windows 7 is already a hot contender with great buzz behind it (and having used it for months now, I can assure you the buzz is warranted), and DirectX 11 is getting far more support from developers from the get-go than DirectX 10 ever did.
If nothing else pushes adoption of DirectX 11, I personally think DirectCompute is going to wind up being the big winner. Even game developers are looking at ways to use it to speed up other tasks in the game itself. It’s very exciting to hear about how this technology is getting put to use.
I’d encourage anyone to upgrade to Windows 7 when it arrives, but even Windows Vista users will be getting in on this action. Still, if you’re on Windows XP, the release of 7 and accompanying DirectX 11 is going to make October a great time to switch.