
Original Link: https://www.anandtech.com/show/422
Telling a worthy manufacturer that they cannot compete in the gaming market is much like telling a nice guy that he simply can't play basketball. While sitting in a car with three other members of the ATI team, we were having a nice discussion about the present graphics card market. When one of the ATI representatives asked for our opinion on a higher clock speed Rage Fury Pro, possibly in the TNT2 Ultra range of speeds, we were taken by surprise. Here, for the first time since the true introduction of 3D accelerated gaming on the PC, we had ATI talking about assuming a leading role in the gaming market. Although it is true that just one year ago ATI had the potential to take the gaming market with the release of their Rage 128 chip, delays in the release of the part snatched the chances of that gold medal away from them quickly. This conversation took place just under six months ago, and as shocked as we were back then when ATI was talking about taking on NVIDIA, one of the leaders in the 3D accelerated PC gaming market, we were just as shocked when they dropped the news about project Aurora.
Project Aurora started out as an ambiguous page on ATI's site and shortly thereafter turned into a skeptically received press release as the Aurora codename morphed into ATI's latest offering, the Rage Fury MAXX card. The Rage Fury MAXX revisited an idea that was first introduced to the gaming market with the advent of 3dfx's Voodoo2: the idea of putting two standalone graphics chipsets together in order to provide a desirable performance boost with minimal added engineering time.
The idea of using multiple processors to quickly achieve a performance boost, without having to wait for the underlying technology to improve, is something that is presently all around the industry. 3dfx's Scan Line Interleave (SLI) on the Voodoo2 was a quick and easy way to secure a nice performance boost simply by adding another graphics card. The SLI technology split the horizontal lines being rendered evenly between the two cards in the configuration, so one card would handle every even line while the other handled every odd line. Because both cards worked on the same scene, the textures present in the scene had to be duplicated in the frame buffers of both cards. This was a highly inefficient way of improving performance but, then again, at the time, the 8MB or 12MB of memory on a single Voodoo2 was more than enough for the games of the day.
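The even/odd line split at the heart of Voodoo2 SLI can be sketched in a few lines. This is purely an illustrative model of the scheme, not 3dfx's actual driver code:

```python
def assign_scanlines_sli(height):
    """Split a frame's scanlines between two cards, Voodoo2 SLI style:
    card 0 renders the even lines, card 1 renders the odd lines."""
    card0 = [y for y in range(height) if y % 2 == 0]
    card1 = [y for y in range(height) if y % 2 == 1]
    return card0, card1

# For a 640x480 frame, each card renders exactly half of the 480 lines.
even_lines, odd_lines = assign_scanlines_sli(480)
```

Because the split is fixed and fine-grained, both cards touch every part of the scene, which is exactly why every texture had to live in both cards' memory.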
On the other hand, this manner of improving performance was very appealing to gamers because they could absorb the cost of owning a single Voodoo2 board, enjoy the performance, and when they came across a little more cash they could make the upgrade to a Voodoo2 SLI configuration and assume an immediate performance increase. The key to the success of 3dfx's Voodoo2 SLI was the fact that you never threw away your initial investment, something very rare in the graphics accelerator market.
The success of the SLI technology led to the question of whether or not 3dfx's Voodoo3 supported SLI. Another company, Metabyte, stepped forth with a technology that was unofficially dubbed SLI yet, with a few modifications, could be used on any card. Metabyte officially called this technology their Parallel Graphics Configuration (PGC). PGC split any given frame into two parts, with each card/chip handling one part of the screen. This approach required quite a bit of elegance in the drivers themselves, as they had to account for factors like what happens if the card rendering the top half of the screen (which is generally less complex than the bottom half) finishes before the other card is done with the bottom half. At the same time, the end result would be much more efficient than 3dfx's SLI design because textures did not have to be duplicated, and the polygon throughput of the setup was effectively doubled, whereas it remained equal to that of a single card in the Voodoo2 SLI situation. Unfortunately, Metabyte's PGC never made it to market, an unfortunate reality, as the product, though expensive, could have been quite a success. Can you imagine laughing at a GeForce's 480 Mpixel/s fill rate while running dual Voodoo3 3500s (732 Mpixel/s) or dual TNT2 Ultras (600 Mpixel/s)?
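A load-balancing heuristic of the sort PGC's drivers needed might look like the following sketch. The function name, parameters, and step size are our own illustration; Metabyte never published its actual algorithm:

```python
def rebalance_split(split_y, top_time, bottom_time, height, step=8):
    """Adaptive screen split in the spirit of Metabyte's PGC: if the card
    rendering the (usually simpler) top half finished early last frame,
    hand it more lines next frame; if it ran long, hand it fewer."""
    if top_time < bottom_time:
        # Top card was idle while bottom card worked: move the split down.
        split_y = min(height - step, split_y + step)
    elif top_time > bottom_time:
        # Top card was the bottleneck: move the split up.
        split_y = max(step, split_y - step)
    return split_y
```

Run every frame, a loop like this keeps both chips busy even as the scene's complexity shifts around the screen, which is what made the split-frame approach so much more efficient than a fixed even/odd interleave.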
ATI turned project Aurora into their take on the same idea, and thus ATI's Alternate Frame Rendering (AFR) Technology was born. As the name implies, AFR divides the load between the two chips in the configuration by frames, instead of parts of frames. One chip will handle the current frame while the second chip is handling the next frame. ATI's AFR is the basis for the Rage Fury MAXX and future cards which will carry the MAXX name.
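The frame-dealing scheme AFR describes is simple round-robin scheduling; a minimal sketch of the idea (not ATI's driver code):

```python
def afr_chip_for_frame(frame_index, num_chips=2):
    """Alternate Frame Rendering: whole frames are dealt out round-robin
    across the chips (frame 0 -> chip 0, frame 1 -> chip 1, and so on)."""
    return frame_index % num_chips

# Which chip renders each of the first six frames:
schedule = [afr_chip_for_frame(f) for f in range(6)]
# -> [0, 1, 0, 1, 0, 1]
```

Since each chip renders complete frames on its own, AFR doubles both fill rate and triangle throughput, but, like SLI, each chip still needs its own copy of every texture.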
The Rage Fury MAXX was ATI's only chance at competing with what 3dfx, NVIDIA and S3 hoped to have released by the end of the 1999 holiday shopping season. ATI had no new chip that would allow them to compete with the big boys, all they had was the Rage 128 Pro that delivered performance somewhere between that of a TNT2 and a TNT2 Ultra for about the price of the latter. The Rage 128 Pro itself is a 0.25-micron chip clocked at 125MHz, resulting in a 250 Mpixel/s fill rate; put two of these together and you have a setup capable of beating NVIDIA's recently launched GeForce 256 (500 Mpixel/s versus 480 Mpixel/s). The Rage 128 Pro was featured on ATI's recently released ATI Rage Fury Pro, and the combination of two of these chips using ATI's AFR technology is a product known as the Rage Fury MAXX. With less than three weeks left in 1999, ATI will be pushing for the sale of the Rage Fury MAXX within the next 10 days, pitting it head to head with NVIDIA's GeForce that has been dominating the store shelves. Not only is ATI attempting to compete with NVIDIA on a performance level, but on the issue of price as well, as they have vowed to match the price of the GeForce with the Rage Fury MAXX. Bold claims from a company that isn't known to be a present competitor in the gaming community.
The Specs
The specs of the Rage Fury MAXX are quite impressive as the card features a whopping 64MB of on-board SDRAM clocked at 143MHz. The 64MB of SDRAM is split into 32MB per Rage 128 Pro chip on the board. As in a Voodoo2 SLI setup, textures must be duplicated for each chip, but framebuffer memory can be shared between the two memory banks. Nevertheless, this provides for a tremendous amount of available memory bandwidth. Taking the 128-bit (16-byte) memory pipeline and multiplying it by the 143MHz memory bus frequency of the Rage Fury MAXX results in an available memory bandwidth of about 2.3GB/s. But we're not done yet! In order to finish the memory bandwidth calculations, you must double that number because of the two dedicated memory buses, one per chip, present on the Rage Fury MAXX. This increases the available memory bandwidth to an incredible 4.6GB/s, which outweighs the 2.7GB/s of NVIDIA's GeForce 256 and even the 3.2GB/s of the Matrox G400MAX. The only card the Rage Fury MAXX isn't able to beat on paper in terms of available memory bandwidth is a GeForce 256 equipped with DDR SDRAM, which has approximately 5.3GB/s of available memory bandwidth.
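The arithmetic behind those figures is straightforward; a quick sketch, using decimal gigabytes as the article's numbers imply:

```python
def memory_bandwidth_gbps(bus_bits, clock_mhz, num_buses=1):
    """Peak memory bandwidth: bus width in bytes x bus clock x number of
    independent buses, expressed in decimal GB/s (1 GB = 10^9 bytes)."""
    return (bus_bits / 8) * (clock_mhz * 1e6) * num_buses / 1e9

per_chip = memory_bandwidth_gbps(128, 143)             # ~2.3 GB/s per Rage 128 Pro
maxx = memory_bandwidth_gbps(128, 143, num_buses=2)    # ~4.6 GB/s for the MAXX
```

These are peak figures; real-world throughput is lower once refresh cycles, page misses, and bus turnaround eat into the theoretical number.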
Why does offering a greater amount of memory bandwidth matter? As you increase the resolution and color depth that you're playing your game at, you begin to require much more memory per frame of data that you are displaying. The greater the memory bandwidth your graphics subsystem or video card allows for, the smaller the performance hit you'll be forced to take as you increase the resolution and color depth that you are playing at. So, while the performance advantage of the Rage Fury MAXX's 4.6GB/s peak memory bandwidth over the GeForce SDR's 2.7GB/s won't be seen running at 640x480 at 16-bit color, the MAXX will begin to pull away from its NVIDIA-born counterpart at 1024x768 at 32-bit and at higher resolutions.
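To put numbers on that: the color buffer alone grows with resolution and bit depth, and every frame rendered has to move at least that much data (plus Z-buffer and texture traffic). A back-of-the-envelope sketch:

```python
def color_buffer_mb(width, height, bits_per_pixel):
    """Size of a single color buffer for a given display mode, in MB."""
    return width * height * (bits_per_pixel / 8) / (1024 * 1024)

low = color_buffer_mb(640, 480, 16)     # ~0.59 MB per frame
high = color_buffer_mb(1024, 768, 32)   # 3.0 MB per frame, roughly 5x more
```

Multiply that per-frame cost by 60+ frames per second and several memory touches per pixel, and it becomes clear why the bandwidth-rich MAXX holds up better as the mode climbs.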
As we mentioned before, the dual Rage 128 Pro chips not only double the available memory bandwidth but also the peak fill rate of the card. This transforms the unsurprising 250 Mpixel/s fill rate of a single Rage 128 Pro into a monstrous 500 Mpixel/s fill rate for the Rage Fury MAXX. The fill rate of the Rage Fury MAXX can be considered its raw horsepower, for all you car fanatics out there (don't worry, the first requirement to work at AnandTech is that you must be a car guy or gal), and it is what helps to drive the high frame rates in Quake 3 or Unreal Tournament. Currently, the Rage Fury MAXX remains unmatched in terms of raw fill rate among released products; the only other products we are aware of that will be able to beat out the Rage Fury MAXX won't be seen until sometime in the first half of next year.
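The fill-rate math is simply core clock times pixel pipelines times chips. A sketch follows; note that the Rage 128 Pro's two pixel pipelines are inferred from the 125MHz clock and 250 Mpixel/s figure quoted above, not stated explicitly by ATI here:

```python
def fill_rate_mpixels(core_mhz, pixel_pipes, num_chips=1):
    """Peak fill rate in Mpixels/s: clock x pipelines x chips."""
    return core_mhz * pixel_pipes * num_chips

single = fill_rate_mpixels(125, 2)                 # Rage 128 Pro: 250
maxx = fill_rate_mpixels(125, 2, num_chips=2)      # Rage Fury MAXX: 500
```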
In addition to offering good performance on paper (we'll get to the actual benchmarks in a bit to see if they follow up their claims), ATI brings to the table their usual set of DVD enhancements and features that help to separate them from the crowd. The hardware assisted DVD playback of the Rage Fury MAXX is obviously identical to that of the Rage Fury Pro since they are both based on the same chip.
The Test
| Windows 98 SE Test System | |
|---|---|
| **Hardware** | |
| CPU(s) | Intel Pentium III 700, Intel Celeron 500, Intel Celeron 366, AMD Athlon 700 |
| Motherboard(s) | ABIT BX6 R2; ABIT BX6 R2 + "Slotket" adapter; Gigabyte GA-7IX |
| Memory | 128MB PC133 Corsair SDRAM |
| Hard Drive | IBM Deskstar 22GXP 22GB Ultra ATA/66 HDD |
| CDROM | Philips 48X |
| Video Card(s) | 3dfx Voodoo3 2000 16MB (default clock - 143/143) |
| Ethernet | Linksys LNE100TX 100Mbit PCI Ethernet Adapter |
| **Software** | |
| Operating System | Windows 98 SE |
| Video Drivers | |
| **Benchmarking Applications** | |
| Gaming | idSoftware Quake III Arena demo001.dm3 |
Performance
Benchmarking at 640 x 480 is useful for figuring out two things: 1) CPU performance and 2) driver performance. The latter is what we're investigating as we look at the first set of benchmarks taken at this resolution. While playing Quake III Arena at 640 x 480 is a reality for many, for most who would purchase the Rage Fury MAXX it's a reality they are trying to escape. The Rage Fury MAXX's still-immature drivers are what is holding it back here, keeping its level of performance down to that of the single Rage 128 Pro found on the Rage Fury Pro. Better drivers will put the Rage Fury MAXX at the top of this chart with the Savage 2000, Voodoo3 3500 and the SDR GeForce.
Moving from 640 x 480 to 1024 x 768 is a jump into the realm of real-world benchmarking. Right now, 1024 x 768 is the performance sweet spot for these cards, and at the head of the pack is the Rage Fury MAXX, although the performance advantage it offers over the cheaper Savage 2000 and the equivalently priced GeForce is negligible.
In 32-bit color, the Rage Fury MAXX manages to outpace the SDR GeForce by over 10 fps while it falls less than 4 fps short of the Savage 2000. The reason it falls short of S3's card is that the Savage 2000 fails to supply the 32-bit Z-buffer that Quake III Arena requests; turning off 32-bit Z on the Rage Fury MAXX allows it to regain the lead over even the Savage 2000 in 32-bit color.
At 1600 x 1200, the increased memory bandwidth the Rage Fury MAXX offers over the competition propels it into first place, edging out the SDR GeForce by no more than 2 fps in 16-bit color, but by a fairly noticeable 5 fps in 32-bit color. The MAXX is once again beaten by the Savage 2000 in 32-bit color because of the same Z-buffer issue mentioned above.
Back at 640 x 480, we get another glance at the driver maturity of all of the compared cards; the MAXX still has a decent distance to grow in terms of driver maturity.
At 1024 x 768, the Rage Fury MAXX once again comes in ahead of the competition, once again by a narrow margin in the 16-bit tests but to a much larger and more noticeable degree in the 32-bit tests when compared to its natural rival, the GeForce. The Savage 2000 performs quite respectably here, falling in between the two competitors.
Here we get a virtual repeat of the performance at 1600 x 1200 on the Pentium III 700, just on a slightly slower CPU. No real surprises here; the top three contenders are all within a few percentage points of one another. The main things to keep in mind are that two of the three cards (the Rage Fury MAXX being the odd man out) have integrated hardware T&L, and that price should be weighed as a balancing factor as well.
On the "slower" Pentium III 450, the Rage Fury MAXX drops down a few notches; it seems the Rage Fury MAXX isn't too fond of CPUs that aren't bleeding edge.
...But when the smoke clears at 1600 x 1200, the Rage Fury MAXX still comes out on top courtesy of its memory bandwidth advantage over the competition.
As we learned from our Athlon Buyer's Guide, the current crop of Athlon CPUs doesn't allow for much variance in the performance of a graphics card, thus only a single Athlon 700 is present for the tests. The Athlon is a tricky CPU to benchmark, since video cards often behave differently on it than you would expect.
Once again we start off with a quick look at driver performance.
The GeForce pulls slightly ahead of the Rage Fury MAXX in 16-bit color performance, but is schooled in 32-bit color performance. This is the underlying trend you have probably already noticed with the GeForce and the Rage Fury MAXX: their performance is virtually identical on faster CPUs in 16-bit color, but the MAXX pulls away as you increase the resolution and color depth.

The Celeron performance of the Rage Fury MAXX wasn't promising at all as you're about to see from the 1024 x 768 scores.
The Rage Fury MAXX is definitely a high-end contender only; while even the Savage 2000 and GeForce compete nicely on a Celeron 500, the Rage Fury MAXX falls short of a Voodoo3 3000. It is not a good price/performance choice for Celeron owners.

We have a similar story with the Celeron 366 as we did with the 500, only this time the Rage Fury MAXX is right on the heels of the Voodoo3 2000.

Unreal Tournament makes for a very interesting benchmark; using the UTbench.dem file, we went to the test beds with our video cards and came back with these results. Surprisingly enough, the Rage Fury MAXX comes out on top in the 640 x 480 tests under UT. Does the card have superior drivers, or is it just pure coincidence? Your call. One thing worth mentioning is that the GeForce has a very difficult time performing well here due to a lack of optimizations in its 3.53 drivers; there is still quite a bit of tweaking possible to squeeze even more performance out of the GeForce.
Here we have another repeat of the Quake III Arena tests: the Rage Fury MAXX is out on top in both the 16-bit and 32-bit rendering tests. The only difference this time around is that 3dfx has a very strong presence here, primarily because the 3dfx cards were tested in Glide mode since they produced erratic results under Direct3D. So don't look to Unreal Tournament as the definitive illustration of Direct3D performance.
Once again ATI is at the top, but not with its 32-bit rendering performance. For some unexplained reason, the Rage Fury MAXX performs quite poorly at 1600 x 1200 x 32 under Unreal Tournament, barely outperforming the single Rage 128 Pro used on the Rage Fury Pro. This is most likely a driver-related problem, but it is present throughout the UT tests at 1600 x 1200 x 32.
An encore performance by the Rage Fury MAXX, this time on the "slower" Pentium III 600E.
Once again, while the MAXX dominates at 1600 x 1200 x 16, it chokes once the move is made to 32-bit color. This is most likely a driver-related issue, although we were using the final retail drivers in our tests.

Even with a Pentium III 450, the Rage Fury MAXX manages to stay ever so slightly ahead of the competition. The GeForce could definitely benefit from some driver optimizations here.


The Rage Fury MAXX performs respectably, but not amazingly, with the Athlon; it seems the Athlon isn't the best platform for the MAXX, although it's not that bad.


The tides take a turn for the better with the Celeron 500, as the MAXX gently pulls ahead of the competition. It is worth noting, though, that the range of 16-bit frame rates varies by around 4 fps.




Conclusion
The Rage Fury MAXX plays an interesting role in the video card market right now. Overall, especially in high-resolution/32-bit color scenarios, it has a nice time beating the SDR GeForce at an equal price. So for those of you that have a fast enough CPU (Pentium III 450 or above), the Rage Fury MAXX makes for an excellent GeForce alternative while adding excellent support for DVD playback.
Those of you with slower CPUs may want to opt to stay away from the MAXX/GeForce debate entirely and just go after one of the recently price-reduced Voodoo3s or TNT2 Ultras which are still very serious performers.
Athlon owners won't find the highest performing solution in the Rage Fury MAXX, but then again they won't be disappointed by the card. The Athlon's match still seems to be the TNT2 Ultra, as it consistently performs noticeably better on the Athlon than it does on any Intel platform.
The true question boils down to how long you will be able to go without upgrading. If you are going to upgrade in another 6 - 8 months, then going with the Rage Fury MAXX over the competition won't be too big of a problem, simply because once games begin to take advantage of hardware T&L you will be ready to upgrade to the next generation of video cards with more advanced hardware T&L support. On the other hand, if you are determined to keep your next video card for much longer than that 6 - 8 month period, you may want to consider the GeForce or the Savage 2000; having hardware T&L support on your card will increase the longevity of your investment. While the Savage 2000 currently doesn't have an enabled hardware T&L engine, well before the end of Q1 2000 we will see hardware T&L support for the Savage 2000 under OpenGL, making it a viable option in the longevity category.
In the end it comes down to this: if you want performance now, especially in 32-bit color, the Rage Fury MAXX is obviously the way to go. ATI's raw-power solution delivers exactly what they promise in a very attractive package. If you want a long-lasting investment (as if one existed in this ever-changing market), then you're better off with a card that features hardware T&L, like the GeForce or the Savage 2000.
Kudos to ATI on a job well done with the Rage Fury MAXX, but weigh your purchasing decision carefully based on what we just discussed.
If ATI could bring the price of the Rage Fury MAXX down to the level of the Savage 2000 or even below it, the Rage Fury MAXX would easily become the card to get; unfortunately, that is nothing more than a holiday wish from us at AnandTech.