The other major feature AMD/ATI is introducing with the 5000 series is the triumphant return of super-sampled anti-aliasing. While I’m sure someone in the forums will point out that NVIDIA’s GeForces have supported SSAA for some time now, that feature always had to be exposed by a registry hack.
Old, old video cards used to use this as their only method for anti-aliasing an image, but it was always very computationally demanding and just too unwieldy to be used on modern games. With the 5800 series being introduced and the excess of performance therein (you could argue that even last generation was a bit more powerful than we needed), super-sampled anti-aliasing has been brought back as a means to further improve image quality.
Traditional MSAA has been around for some time now, and while it cleans up jaggies on the edges of models, textures with transparency (gates, fences, etc.) remain jagged, to say nothing of some of the more detailed textures and shaders within the image.
ATI and NVIDIA met this situation halfway by introducing what ATI calls “Adaptive Anti-Aliasing” and what NVIDIA calls “Transparency Anti-Aliasing.” These detect “alpha textures,” textures with transparency, and smooth out their jagged edges. For what it’s worth, I’ve always found ATI’s approach to be inferior to NVIDIA’s, which has felt consistently smoother and better applied.
Of course, this still leaves jagged textures and shaders. Super-sampled anti-aliasing smooths out the entire image at great computational cost, but if you’ve got the horsepower to do it, why not?
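The core idea is simple enough to sketch: render the frame at a multiple of the target resolution, then average each block of samples down to one output pixel. This is a minimal illustration, not ATI's actual resolve logic; the function name and the toy 4x4 "image" are mine.

```python
# Minimal sketch of the idea behind super-sampled anti-aliasing (SSAA):
# render at a higher resolution, then average blocks of samples down to
# the target resolution. A hypothetical 4x4 "rendered" frame is
# downsampled 2x to 2x2 by box-filtering each 2x2 block.

def ssaa_downsample(image, factor):
    """Average each factor x factor block of the oversized image."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white edge at 4x4 resolution...
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
# ...becomes intermediate grey values at 2x2, softening the jaggy.
lo_res = ssaa_downsample(hi_res, 2)
# lo_res == [[0.0, 255.0], [127.5, 255.0]]
```

Because every pixel is built from multiple shaded samples, the whole image is smoothed, including texture and shader detail that MSAA leaves alone; that's also why the cost scales with the full shading workload.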
Anti-aliasing in this comparison is set to 4x. The small image makes the differences hard to see, but if you click to zoom in you’ll spot some of the differences. I chose Quake 4 because the id Tech 4 engine used in it has been particularly prone to irritating texture and shader aliasing. You’ll want to look at the edges of shadows and the hard lines in the textures on the environment and models to see the differences, but trust me when I tell you they’re much more readily visible in motion.
SSAA does have limitations, however. It can only be used in DirectX 9 titles, and at the moment the implementation appears to have some kinks that need to be worked out. While most games look fantastic with it, certain games, like Call of Duty 4 and Left 4 Dead, come out so smoothed that they look a bit blurry. It's worth mentioning that ATI knows about the issue and is looking into it, and that either way, these games remain totally playable and very fluid with it enabled. You can enable it today and treat it as a matter of taste.
I will say that SSAA does an impressive job smoothing out the hideous motion blur filter in Mass Effect, but Unreal Engine 3 games seem just a bit too demanding for it; Mass Effect was only really playable at 2x SSAA.
It might be best to just say that GPU transcoding is here…for NVIDIA. The ATI AVIVO Video Converter is clearly not ready for primetime, exhibiting the kinds of bugs and questionable quality other websites have already remarked upon. Conversions of short 1080i MPEG2 files from my most recent film were handled fairly quickly; ten seconds of very complex footage were compressed into a smaller WMV in about two seconds. Converting a 13-minute 1080i MPEG2 to 1080p H.264 took 7:22, while the same conversion through Adobe Media Encoder CS4 took more than four times as long.
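To put those timings in perspective, a little arithmetic shows AVIVO encoding faster than real time while the CPU encoder falls well below it. The CS4 figure below is just a lower bound taken from "more than four times as long," not a measured number.

```python
# Back-of-the-envelope arithmetic on the transcode timings above.
clip_seconds = 13 * 60               # 13-minute source clip = 780 s
avivo_seconds = 7 * 60 + 22          # AVIVO: 7:22 = 442 s
cs4_seconds = avivo_seconds * 4      # CS4 lower bound ("more than 4x")

avivo_ratio = clip_seconds / avivo_seconds   # ~1.76x faster than real time
cs4_ratio = clip_seconds / cs4_seconds       # ~0.44x, slower than real time
```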
That difference is very appreciable, but AVIVO's H.264 conversion was extremely lossy, with very little fine-grained control over the end quality. Of course it was quicker; it wasn't doing as much work as CS4. There's certainly an improvement in encoding speed overall, and I expect that would carry over to a less lossy conversion, but the quality control just isn't there: a single slider running from lowest to highest quality is the best AVIVO can offer.
Then there’s the matter of stability. Converting to Windows Media (a remarkably robust and high-quality format, actually), AVIVO was…not stable. The slider seemed to spit out random estimates of the final file size, and eventually the program crashed outright. If you have a good explanation for how slowly moving the slider increases the file size while rapidly moving it to high quality drops the file size by 90%, I’d love to hear it. But that’s all largely academic, because AVIVO crashed anyhow.
And finally, how about finding the thing? If you leave Catalyst Control Center set to Advanced, you will NEVER even see it; you’ll install the converter and then wonder where its executable is. To get to the AVIVO Video Converter, you have to set CCC back to Basic and step through to the wizards, where it finally shows up.
Ultimately, AVIVO seems good for a quick-and-dirty conversion, but if you care at all about getting the best quality and compression, you’ll want to stick with a more professional-grade CPU-based encoder. There are definitely substantial performance gains to be had from GPU-assisted video transcoding, but they would be best exposed by higher-quality software from a more established vendor.
Heat and Noise
Winner winner, chicken dinner. Before ordering the HD 5870 I had two major worries: that the card would run unbearably hot like its predecessors, and that it would be unbearably loud. I pride myself on a very quiet tower and hearing the video card’s fan spin up aggravates me. This is one of the major reasons I parted ways with the GeForce GTX 280, actually – the fan would spin up irritatingly loudly in games.
Surprise, in my Antec P182 case the 5870 borders on inaudible even under load. Load temperatures in actual gaming top out at around 85C, hot but not frighteningly so for a top-of-the-line video card. It bears mentioning that my case runs four Antec TriCool fans all set at low and a Xigmatek Dark Knight cooler on the processor, which is configured in BIOS to run quietly (and indeed does). The 5870’s bat-cooler has every reason to be louder than the surrounding fans, but remains blissfully silent even under load.
If there’s one bone I still have to pick with the Radeon HD 5870, it’s that ATI’s anisotropic filtering still sucks. Reviews show it’s no longer angle-dependent, but the mip-map transitions still aren’t as smooth as I’d like and produce some irritating shimmering at times. That’s the only fly in the ointment as far as image quality goes; with SSAA on top of all the anti-aliasing modes inherited from as far back as the Radeon HD 2000 series, the Radeon still potentially offers the best overall image quality on the market.
Of course, I haven’t mentioned that the 5000 series Radeons are also DirectX 11 class hardware. This you probably already knew, but it does provide a measure of future-proofing. Personally, as a big fan of Alien vs. Predator and Alien vs. Predator 2 on the PC, the pending release of Aliens vs. Predator was enough to get me to care. Couple that with the absurdly high performance of the 5870, and I suspect we have the makings of the most future-proof card since the GeForce 8800 GTX and the Radeon 9700 Pro.
The last remaining issue with the 5870 is price, and the lesser-performing but substantially less expensive 5850 provides a reasonable alternative (though I personally like having a video card with this much juice to spare). Though I didn’t pay MSRP for mine, I think it still makes a convincing case for its price tag. The 5870 would be justifiable even at the price of a GeForce GTX 295; at an MSRP of $379, it’s actually a pretty good deal if you have the setup for it. Remember that the 8800 GTX was $600 when it launched, and the Radeon 9700 Pro was $300. Those cards held their own for a long while, and I have no reason to doubt the 5870 will as well.
NVIDIA’s DirectX 11-class hardware is still months away from materializing on shelves, and probably won’t be available until the end of the first quarter of 2010. You could certainly wait for it, but given the way the 5870 performs, I’d have a hard time justifying that, and I don’t think anyone who buys a 5870 now is really going to be terribly upset if NVIDIA’s next chip outperforms it. It’s so fast already, any performance differences are just going to be academic anyhow.
- Unbelievable power
- Great idle power draw
- Buggy video transcoding