TrogdorJW - Tuesday, July 13, 2004 - link
Regarding the logarithmic scale, again, I don't think that it was a really bad idea. It tries to equalize scores that involve a lot of variables. What would be the best way of measuring bang-for-the-buck? Well, here's what I think would actually have to come into consideration.

First, you would have to take into account the desired average frame rate. This of course varies from game to game - somewhere between 60 and 100 FPS would probably be ideal - since a game like Unreal Tournament 2004 "needs" higher frame rates more than something like Flight Simulator 2004 (or whatever we're on now).
Why 100, when you can't really "see" things that fast? Well, ideally, you would want V-sync enabled for maximum image quality. A card averaging 100 FPS without V-sync would probably get pretty close to your refresh rate - e.g. 85 Hz - with V-sync. Meanwhile, 60 FPS with an 85 Hz refresh rate might end up scoring more like 42.5 FPS with V-sync enabled, because without triple buffering the frame rate falls to an even divisor of the refresh rate. Even worse, 59 FPS with a 60 Hz refresh rate might score as low as 30 FPS.
Anyway, the scale would be weighted, so scores lower than this threshold would be punished increasingly more, while scores above this threshold give diminishing returns. In other words, if we're shooting for 60 FPS, a score of 55 is pretty close, so it gets maybe 89% on a relative scale (instead of the mathematical 91.7%), while a score of 50 might only get 75% (instead of 83.3%) and a score of 45 might get 55% (instead of 75%). Meanwhile, 70 FPS might score 110% (instead of 116.7%) and 80 might score 115% (instead of 133.3%). That's just a rough example to illustrate what I'm talking about - no actual formula was used there.
You might also want to consider *minimum* frame rates (although the weight given to the average frame rate does accomplish *some* of this). So maybe we want a minimum frame rate of 40 FPS, and again it's weighted so that dropping below this hurts more, while exceeding this value benefits less. And maybe there should be some factor taking into account the percentage of time that a card drops below this threshold?

Price is also a factor. Some people have limits, so you would have an inversely weighted price scale based on your target. Say you want to spend $200. Cards costing less than that would be more desirable, even with lower performance, while cards costing more would be less desirable.
You can go on with all sorts of ideas. In the end, though, it ends up being far too complex for anyone but a mathematics professor. So, the log scale rating that was used isn't terrible... it's just a curious choice, since a straight scale is more easily calculated, more people are familiar with it, and some would say it's more "accurate" (although that last point is more opinion than anything).
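Just to make that concrete, here's the kind of thing I mean - a toy scoring function in Python with completely made-up weights (60 FPS target, $200 budget, arbitrary exponents), not anything AnandTech actually computes:

def weighted_value(avg_fps, price, target_fps=60.0, target_price=200.0):
    # Frame rate term: falling short of the target hurts more than the raw
    # ratio suggests; exceeding it gives diminishing returns.
    ratio = avg_fps / target_fps
    fps_term = ratio ** 2.0 if ratio < 1.0 else ratio ** 0.5
    # Price term: cheaper than the budget helps, pricier than it hurts.
    price_term = (target_price / price) ** 0.5
    return 100.0 * fps_term * price_term

# e.g. a 55 FPS / $300 card vs. an 80 FPS / $500 card
print(round(weighted_value(55, 300), 1), round(weighted_value(80, 500), 1))

You could bolt minimum-frame-rate and time-below-threshold terms onto the same skeleton, which is exactly where it starts turning into a job for that mathematics professor.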
jiulemoigt - Monday, July 12, 2004 - link
Hmmm, also Doom 3 is confirmed for QuakeCon, so the next game to kick your PC's (whatever) is due out next month. Should be interesting to see if any of the value cards can handle it.
DerekWilson - Sunday, July 11, 2004 - link
Our graphing engine doesn't display log scale. I went to all this trouble to get around that :-)

Also, we will get color coding in our graphing engine sometime soon, but we don't have it yet ...
Normalizing everything to avg fps and avg cost wouldn't be as useful if done on a per-graph basis (which is how it would have to be done)... Unless you picked an arbitrary normalization point, like a desirable fps across all games and a desirable cost for that fps .... but that's too sticky and specific ... You have a good idea, but I don't think it would be straightforward to implement --- we'll see what we can do with this though, thanks.
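For the curious, per-graph normalization to the average fps-per-dollar would look roughly like this (hypothetical card numbers, and not what our value graphs actually compute):

cards = {"Card A": (93.0, 300.0), "Card B": (66.0, 400.0), "Card C": (60.0, 600.0)}  # (fps, price)

ratios = {name: fps / price for name, (fps, price) in cards.items()}
average = sum(ratios.values()) / len(ratios)

# 100 means exactly average value for this one graph; higher is better
for name, r in ratios.items():
    print(name, round(100 * r / average, 1))

The catch, as noted above, is that "100" would mean something different on every graph.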
Pumpkinierre - Sunday, July 11, 2004 - link
Normalise all the results to the average fps/cost of all the cards. Any card close to the normal average (1, or 100 if x100) will be a reasonable to good buy, and the outer extremes will be accentuated while still being contained (unlike a logarithmic representation, which distorts the comparison).
trexpesto - Saturday, July 10, 2004 - link
Hmm, can your graph generator show a log scale? Then maybe you could use the raw fps/$ numbers, and the scaling would be done at the display level.
trexpesto - Saturday, July 10, 2004 - link
Not sure I like the log scaling.

Looking at http://anandtech.com/video/showdoc.aspx?i=2113&...
in the first graph with no AA/AF the scores for 9800XT vs 6800NP are 12.2 and 14.9 respectively.
So:
9800XT 66fps/$400 = .165 AT-score = 12.2
6800NP 93fps/$300 = .31 AT-score = 14.9
So the fps/$ is almost double, but the AT-score only increases by about one-fifth.
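For what it's worth, those two scores look consistent with something like 10 * log10(100 * fps / price) - just a guess reverse-engineered from the two data points above, since the article doesn't spell out the exact formula:

import math

def at_score(fps, price):
    # Guessed reconstruction: a dB-style log of the scaled fps-per-dollar
    return 10 * math.log10(100 * fps / price)

print(round(at_score(66, 400), 1))  # 12.2, matches the 9800XT value
print(round(at_score(93, 300), 1))  # 14.9, matches the 6800NP value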
Not that anyone will ever pay $400 for a 9800XT again - vendors, are you listening?
AtaStrumf - Saturday, July 10, 2004 - link
You might wanna improve the presentation of your results (at least on the Final Words page) by putting them into as few tables as possible, instead of hundreds of graphs spread over tens of pages. Add (or only use) relative results (%) beside the actual FPS - mandatory for the GPU that represents 100%, or 1.00 - to better convey the differences in performance, which are now represented by graph bar lengths and not that easy to read accurately. As it stands, getting a good overview and doing quick comparisons is very hard. You could also use some creative color coding (green = good, red = bad, etc.) to further increase the readability of these tables. You have done a bit of this in some of your more recent reviews, and I'd like to see you do it even more and even step it up a notch or two.
DerekWilson - Saturday, July 10, 2004 - link
As for the log scale... if we had said:
Value = fps / price
we would have had tiny values which look really bad on our graphs :-) So our first option was to scale the graphs up by multiplying by 100:
Value = 100 * fps / price
The problem with this is that the visual impact of the scaling seemed too dramatic. Now, we don't mean with the high-priced ultrasuperextremegold cards. The problem was the difference between the X800 Pro and the 6800GT ... The GT certainly comes out on top, but we are talking about current market prices and a difference of $10. We didn't want to give this difference (the cost of a couple value meals at the local fast food place) more weight than people might ascribe to it.
Of course, there are infinitely many other ways to look at this. Is frame rate *REALLY* important to you? Then it's perfectly valid to look at this graph:
Value = fps^2 / price
or is cost the absolute driving factor to you?
Value = fps / price^2
It's all relative, really, and very hard to determine. We just wanted to show which cards will give you more frames per dollar, but we didn't want to ascribe a dollar value to every frame ... So the log scale seemed to be our best bet.
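To show how much the choice of weighting matters, here's a tiny sketch with two made-up cards - note that the fps^2/price metric ranks them in the opposite order from the other two:

cards = {"90 fps, $400 card": (90.0, 400.0), "60 fps, $200 card": (60.0, 200.0)}

for name, (fps, price) in cards.items():
    print(name,
          round(100 * fps / price, 2),        # plain frames per dollar (x100)
          round(fps ** 2 / price, 2),         # frame rate weighted heavily
          round(1000 * fps / price ** 2, 2))  # price weighted heavily (x1000)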
But we are definitely open to feedback on how we approach value from a quantitative standpoint. Please let us know what you think, or if there is a better way to approach the issue.
Thanks!
DerekWilson - Saturday, July 10, 2004 - link
Warcraft III is an interesting game. It is a DX8 game, but OpenGL rendering can be enabled with a command-line switch. In doing some testing, we noticed that OGL mode gets MUCH higher frame rates on both cards (if you talk to some hardcore WC3 freaks out there, they might already enable OGL mode to keep framerate up).

We spoke with Blizzard about the differences between the DX8 and OGL rendering paths. Specifically, we wanted to know if there was a quality difference that would make the OGL path run faster than the DX8 path. Blizzard informed us that the two paths had converged on the same quality level. They would not say that the DX8 and OGL paths rendered the exact /same/ image, but they maintained that the only reason they didn't was that DX8 and OGL can't render the exact same pixels in certain situations.
We may tackle some OGL benches in WC3 in the future if people are interested. But since we're trying to show real-world performance in these benches, and DX8 is what most people will use, we haven't felt that including these numbers would be useful for now.
Blizzard didn't say anything about capping framerates in DX8, but, then, we didn't ask that question specifically.
Marsumane - Friday, July 9, 2004 - link
Actually WCIII is DX8 based. Also, WC3 is quite odd with graphics cards. Even with a high end CPU, on Tom's Hardware, the cards don't exactly scale evenly as they do in other benchmarks. It's like there's a cap placed on them in the engine, or ATI's drivers have some sort of cap on them. I believe it is in the engine, since all cards seem to have similar frame rates despite the card used (unless AA or AF is turned on). The ATI cards have a slightly lower cap, it seems. I mean, the difference is like 3 fps between the 9700P and the 6800U. It makes no sense why they all stop here.
TrogdorJW - Friday, July 9, 2004 - link
My final comment (for now):

On the Warcraft III page, you had this to say: "Even at 16x12, this benchmark is very CPU limited, and yes, vsync was disabled. Oddly, when AA/AF is enabled, the FX 5950U actually outperforms the X800 XT PE. This is an atypical situation, and we will try to look into the matter further."
My thought on this is that the likely reason has to do with optimizations. In most benchmarks, the 6800 series of cards outperforms its X800 equivalents when running at standard settings. Enabling 4xAA and 8xAF often diminishes the gap or shifts the benchmark into ATI's favor. However, you don't really do a full suite of benchmarks, so it's difficult to say why the shift takes place. Having looked at other sites, the shift seems to be related almost entirely to the anisotropic filtering. Turning AA on or off seems to have very little impact on the placing of the cards when you're not CPU limited, while turning AF on or off can put a much larger burden on the Nvidia cards, especially cards of the FX era.
So what does this have to do with Warcraft III? Well, I won't bother arguing whether Nvidia's or ATI's AF method is actually better; they seem to be roughly equivalent. However, ATI seems to get more AF performance out of their hardware. Basically, the ATI algorithm simply appears to be superior in performance.
So again, what does this have to do with Warcraft III and the Geforce FX? One word: perspective. WCIII uses an overhead perspective, so much of the screen is filled with polygons (the ground) that are perpendicular to the camera angle. If I recall correctly from my graphics programming classes, there is less that can be done to optimize the AF algorithms in this scenario. I believe that perpendicular polygons are already almost "perfectly optimized". (Or maybe it's just that Nvidia has better optimizations on the FX architecture in this instance?) The end result is that the GPU doesn't have to do a whole lot of extra work, so in this particular instance, the FX architecture does not suffer nearly as much when enabling AF. Not that any of us would actually go out and buy an FX5950 these days....
Honestly, though, the benchmarking methodology for WCIII (playback of a demo at 8X speed) seems pretty much worthless - i.e. on the level of 3DMark usefulness. It's a DX7 game that will run well even on old Pentium 3 systems with GeForce 2 cards, and anything more recent than a GeForce 4 Ti with a 2 GHz CPU will have no difficulty whatsoever with the game. Running a demo playback at 8X might not work well, but that's not actually playing the game. I'm sure there are plenty of WCIII fans that think this is a meaningful performance measurement, but there are probably people out there that still play the original Quake and think that it gives meaningful results. :)
TrogdorJW - Friday, July 9, 2004 - link
A few other comments from the article:

"The 9700 Pro may be a good value for many games, but it just won't deliver the frame rates in current and future titles, at the resolutions to which people are going to want to push their systems."
I really have to disagree with that opinion. These tests were done exclusively at 1280x1024 and 1600x1200, as well as with 4xAA and 8xAF. Only the extreme fringe of gamers actually have a desire to push their systems that far. Well, I suppose we would all *want* to, but most of us simply cannot afford to. First, you would need a much better monitor than the typical PC is equipped with - 19" CRT or 17" LCD would be the minimum. You would also need to run at 4xAA and 8xAF at the maximum resolution your display supports in several of the games. Finally, you would need to max out all the graphics in each game. While some people certainly feel this is "necessary", I'm pretty sure they're in the minority.
My opinion? The difference between 800x600 and 800x600+2xAA is rather noticeable; the difference between 800x600+2xAA and 800x600+4xAA is much less so. I also think that 800x600+4xAA is roughly equivalent to 1024x768+2xAA or 1280x1024 without any AA. Personally, I would prefer higher resolutions up to a point (beyond 1280x1024, it's not nearly as important). For graphical quality, there's a pretty major improvement from bilinear to trilinear filtering, but you don't notice the bump to anisotropic filtering nearly as much. There is also a very drastic change in quality when going from low detail to medium detail, and generally a noticeable change when going from medium to high detail. Beyond that (going to very high or ultra high - assuming the game permits), there is usually very little qualitative difference, while the performance generally suffers a lot.
But hey - it's just one man's opinion against another's. I point this out not as a rebuke of your opinion, but as disagreement with presenting that opinion as something more. Often, writers don't like wishy-washy conclusions, but a more moderate stance is probably warranted at many of the hardware sites. The fastest hardware comes with a major price increase that most people are simply unwilling to pay. The use of a logarithmic scale is also part of this problem, as most people would be more than happy to pay half as much for 75% of the performance.
TrogdorJW - Friday, July 9, 2004 - link
#24 - I'm amazed that you're the only other person that even wondered about that. Basically, using the log of the performance/price makes everything a lot closer. There is a reason for this, of course: if you take the straight performance/price (multiplied by 10 or 100 if you want to get the numbers into a more reasonable range), it makes all the expensive cards look really, really bad.

However, the reality is that while an X800 Pro or 6800 GT might cost over twice as much as the 9700 Pro, there is a real incentive to purchase the faster cards. Minimum frame rates on a 9700 Pro would often be completely unacceptable at these resolutions. The use of a logarithmic chart makes large differences in price and/or performance less of a deal killer.
For example, let's look at Warcraft III 1600x1200 without AA/AF. The cards range from 58.2 to 61.1 FPS, but the price range is from $300 to $600. In this particular instance, the $300 6800 would be almost twice as "desirable" as the 6800UE or X800XTPE. Apply their log-based calculation to it, though, and the 6800 is now only 30% more desirable than the $600 cards.
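Using round numbers (call it 60 FPS for both the $300 6800 and the $600 cards) and the same guessed formula of 10 * log10(100 * fps / price) that seems to match the published scores, the math works out like this:

import math

def straight(fps, price):
    return 100 * fps / price

def logged(fps, price):
    return 10 * math.log10(straight(fps, price))

print(straight(60, 300) / straight(60, 600))  # 2.0  - twice as "desirable"
print(logged(60, 300) / logged(60, 600))      # ~1.3 - only ~30% more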
What it amounts to, though, is their statement at the beginning: every person has a different set of criteria for rating overall "value". In Anandtech's case, they like performance and are willing to pay a lot of extra money for it. (Which of course flies in the face of their comments about the $10 difference in price between the 6800GT and X800 Pro, but that's a different story. As someone already pointed out, if the GT leads in performance, costs a little less, and also has more features, what numbskull wouldn't choose it over the X800 Pro?!? Of course, there are instances where the X800 Pro still wins, so if you value those specific situations more, then you might want the X800 instead of the GT.)
Leuf - Friday, July 9, 2004 - link
How can you leave out the 9800 Pro when talking value, especially when the video card guide right under this article says the 9800 Pro is the best price/performance now?

One thing you don't take into account is that someone buying a lower end card probably doesn't have the same CPU as someone buying a top end card. While it wouldn't make sense to test each card with a different CPU for this article, it's worth mentioning. I'd actually like to see a performance plot of a couple value cards tested across the gamut of CPUs. Looking at video card value and CPU value completely separately from each other isn't necessarily going to lead to the best choices.
Neekotin - Friday, July 9, 2004 - link
araczynski, I used an Asetek WaterChill v2 dedicated only to the GPU, and a custom coolant, my recipe... I haven't tried it with a CPU; my 3400 is barely overclockable.
Marsumane - Friday, July 9, 2004 - link
Yes, that is a good deal. But it doesn't represent the actual price of the card without promotions. The GT was $300 from Best Buy. That's not the overall price though, just a pricing mistake. You can't count those.
snikrep - Friday, July 9, 2004 - link
Did you guys notice the Far Cry specs? That's some pretty huge numbers... the X800XT is beating the 6800 Ultra by about 15FPS from what I could tell.

That's huge!!
And those "actual" pricing numbers seem way off... I picked up a retail X800 Pro from Best Buy 2 weeks ago for $399, so I don't see why we'd include price gouging vendors.
And the X800XT Platinum Edition is below $499 at most places; I personally have it on order from Gateway for $390, which makes it the best deal by FAR (of course I'll get it sometime in August with my luck, but who cares, it's cheap).
nserra - Friday, July 9, 2004 - link
So many rich people here, discussing the price of $600 cards and worrying about their little price differences ($10), funny.

After watching this review, I would go for a 9700 Pro or 9800 Pro, or even better a softmod 9500/9800SE.

I always play (if the game permits) at 1024x768 with 2xAA and 4xAF. It's more than enough.

The review doesn't take into account that most (CRT) monitors only do 60 Hz at 1280x1024 and 1600x1200.
araczynski - Friday, July 9, 2004 - link
Neekotin: What hardware are you using for your liquid cooling setup? I've been thinking about possibly incorporating it into my next build.
Drayvn - Friday, July 9, 2004 - link
Sorry to post again, but the cheapest 6800 Ultra (we don't even have the Extreme yet) is.... $621
So the difference is still about $100, so in my opinion I would still buy the XT-PE, but prices could go down when the UE comes out, dunno...
Drayvn - Friday, July 9, 2004 - link
Actually, I just found it for $530 over here in the UK.
Drayvn - Friday, July 9, 2004 - link
In England the price of the XT-PE is about $565, and you could probably find it lower, at around $550 to $540...
Noli - Friday, July 9, 2004 - link
Guys, if you don't like the value information 'overkill', er... just don't hover your mouse over the graphs?

Actually my beef is slightly different, which is: why does AnandTech log the fps/$? There may be a good reason, but I'm not sure what it is...
Marsumane - Friday, July 9, 2004 - link
Something that I don't think is quite right is that they are doing these benchmarks to determine the value of a card. If you use SM2.0 for both the 6800 series and the X800 series, you will not be seeing the entire value of purchasing a 6800-based graphics card. SM3.0 IS A FACTOR IN VALUE!
DarkKnight - Friday, July 9, 2004 - link
I completely agree with #18, just too much value information for me. At the end of an article, just give a graph of the overall value, something like they do at THG.
DarkKnight - Friday, July 9, 2004 - link
ZobarStyl - Friday, July 9, 2004 - link
I love my LeadTek 4200, and the 6800nu is right up my alley... not like I need 256MB for anything I do anyway. Great article; now I'm sure that once the gouging stops, if I can find one for 250 it's mine. And ATi fanboys, please stop posting their prices like they are wrong - everyone is overcharging right now... and the XTPE does not equal the XT, #17... if the PE costs the same as the XT, who the hell will buy the XT when the PE is clocked higher stock and performs better?

Hell, at my local Best Buy the Pro (yes, the Pro) is proudly sold for 499.99; so much for the MSRP...
rjm55 - Friday, July 9, 2004 - link
#18 - I infer that you meant "I am not implying . . ." in your comment.

Derek - The "value" thing is a good idea, but using it in every graph is really more information than any of us need - which makes it more confusing than it needs to be. Not many are interested in comparing bucks per frame in Eve at 1600x1200 to bucks per frame in Halo, for example. What's in the article about value is geeky overkill, when what I want to know is true overall value, or bang for the buck.

Maybe you can settle on a bench or two that best illustrate value, instead of making it so complicated you have to run a computer analysis to figure it out.
binger - Friday, July 9, 2004 - link
Nice article, but too bad you didn't touch the issues of heat and noise. For me, those factors are far more decisive than, say, a $10 price premium or a performance difference of a couple of fps.
deathwalker - Friday, July 9, 2004 - link
An afterthought to my original post on this review... there seems to be a great deal of emphasis put on 1600x1200 performance in these reviews. I know there are still a great number of gamers out there using CRT monitors, but with the growing popularity of LCD monitors, the 1600x1200 performance range is unobtainable for most LCD owners, as most 17" and 19" LCD monitors operate with 1280x1024 as the optimal native setting. I am not inferring, though, that 1600x1200 is not relevant in this testing process... it's just an observation.
gordon151 - Friday, July 9, 2004 - link
[q]ATI Radeon X800 XT: $540ATI Radeon X800 XT Platinum Edition: $?[/q]
The X800 XT == X800 XT PE with respect to your prices.
deathwalker - Friday, July 9, 2004 - link
The use of ATI Catalyst 4.x drivers has proven to be a real thorn in the side of ATI card owners (in my opinion). I have a 9700 Pro, and the use of any Catalyst driver above 3.9 results in a 3-5% performance hit regardless of the benchmark I use for measurement. I wonder if the compatibility issues resolved in the 4.x series are worth the tradeoff in performance. ATI needs to remain focused on performance issues in their driver releases as well. I have been a long-time ATI fan (since the release of the 8500), but this article clearly indicates that Nvidia has put, or is in the process of putting, a trump card on the table with the 6800 series of VGA cards that ATI needs to respond to.
RyanVM - Friday, July 9, 2004 - link
How did the X800 XT come out ahead of the XT-PE by 10fps in the Homeworld 2 16x12 4xAA/8xAF test?
Warder45 - Friday, July 9, 2004 - link
Derek, do you really think a fall refresh line is coming this year? The spring line has barely shown up, and the low end ATI cards haven't really been introduced yet. It seems like it would be a big waste of money for ATI and Nvidia to release refreshes before the current line has market penetration.
Slikkster - Friday, July 9, 2004 - link
All I can say is that it's about time, Nvidia, that you've closed the gap on ATI. I'm looking forward to 2 6800 UE's SLI'd on a pci express board. heheh
DerekWilson - Friday, July 9, 2004 - link
Sorry, to answer the resolution and connection questions:

Viewsonic P95f+ using analog. When necessary (with the 6800 U and 6800 UE) we used a DVI to analog adapter.
DerekWilson - Friday, July 9, 2004 - link
Things would have looked different if we had gone with MSRP ... things will also look different in a couple months when the fall refresh line comes out.

We didn't feel it was fair to use MSRP because people can't find these cards for MSRP right now. In a month or two, that may or may not be different. And in a month or two, we may take another look at performance and value.
But, as we indicated in the article, the value graphs should really be more of a guide than a definition. If that makes any sense :-)
at80eighty - Friday, July 9, 2004 - link
Gotta agree with Shinei, this bunch DEFINITELY makes me wanna shift from my trusty Ti4200 *sniff* : p ... and umm no, I'm no fanboy; I was seriously considering the X800XT, but the tests make me think the 6800GT is worth my money...

btw Derek... what monitor + connection were you using for these tests?
/also - the value graphs: - great idea!...something i've come to expect from you guys now : P Keep up the good work!
trexpesto - Friday, July 9, 2004 - link
Whoops, thought I was just logging in - um - nice article.

I've seen Newegg testing the waters with the 9800 Pro 256MB at $223.00. Once, I could believe it was a mistake; twice, though, seems like real sampling to see when they should be lowering the price. There are just so many models out there; given a $10 spacing, it's like prices are shored up from the bottom. Anyways, picked one up, but I let them sweat for a half hour first. ;)
Pumpkinierre - Friday, July 9, 2004 - link
In the Homeworld 2 test, the X800XT at 1600x1200 goes up in speed when you switch on 4xAA and 8xAF and beats the PE!
Shinei - Friday, July 9, 2004 - link
Definitely interesting to see the 6800NU's performance numbers... I almost want to say that the 6800 will be the DX9-generation's Ti4200. Not that that'll stop me from getting a 6800GT or a 6800U to replace my 4200, mind you. :p

Anyway, I like the inclusion of the "value" graphs as part of the benchmark results, it makes it easier to see what card gives you the best bang (in the case of these latest cards, a boom?) for your buck. And as always, AT puts out the quality benchmarks that I recommend so highly to everyone.
Mithan - Friday, July 9, 2004 - link
I think the best "value" card out right now is the 6800 GT.
sparkz - Friday, July 9, 2004 - link
What kind of monitors were used and also what type of connection was used for testing at 16x12? (DVI or VGA)
ViRGE - Friday, July 9, 2004 - link
SilentRunning, they aren't using the MSRP, they're using the actual market price.
Neekotin - Friday, July 9, 2004 - link
Well, I got myself an X800 XT already (pulled a lot of strings...). Nice article anyway... wished you had included overclocked numbers; hardcore types do it anyway. I had mine liquid cooled to 620/600... blazing! hehehe
SilentRunning - Friday, July 9, 2004 - link
[q]ATI Radeon 9700 Pro: $180
ATI Radeon 9800 XT: $400
ATI Radeon X800 Pro: $420
ATI Radeon X800 XT: $540
ATI Radeon X800 XT Platinum Edition: $600[/q]
Obviously you need to re-evaluate how you obtain prices, since ATI's SRPs (suggested retail prices) are lower. Those prices are from dealers who are price gouging due to demand.
ATI Radeon X800 Pro: $399 SRP
ATI Radeon X800 XT Platinum Edition: $499 SRP
http://www.ati.com/buy/pricespcusa.html