
  • Ritalinkid - Monday, June 28, 2004 - link

    After reading almost all of the video card reviews posted on AnandTech, I start to get the feeling that AnandTech has a grudge against NVIDIA. The reviews seem to put NVIDIA down no matter what area it excels in. With leading OpenGL support, PS3.0 support, and the 6850 shadowing the X800 in DirectX, it seems like NVIDIA should not be counted out as the "best card."
    I would love to see a review that tested all the features that both cards offer, especially one that showed the games that would benefit the most from each card's features (if they are available). Maybe then I could decide which is better, or which could benefit me more.
  • BlackShrike - Saturday, May 8, 2004 - link

    Hey, if anyone is gonna be buying one of these new cards, would anyone want to sell their 9700 Pro or 9800 Pro/XT for like 100-150 bucks? If you do, contact me at POT989@hotmail.com. Thanks.
  • DonB - Saturday, May 8, 2004 - link

    No TV tuner on this card either? Will there be an "All-In-Wonder" version soon that will include it?
  • xin - Friday, May 7, 2004 - link

    (my bad, I didn't notice that I was on the first page of the posts, and replied to a message there heh)

    Well, since everyone else is throwing their preferences out there... I guess I will too. My last 3 cards have been ATI cards (a 9700 Pro, a 9500 Pro, and an 8500 "Pro"), and I have not been let down. Right at this moment I lean towards the X800 XT.

    However, I am not concerned about power since I am running a TruePower 550, and I will be interested in seeing what happens with all of this over the next 4-6 weeks, when these cards actually come to market... and I will make my decision then on which card to buy.
  • xin - Friday, May 7, 2004 - link


    Besides that, even if it were true (which it isn't), there is a world of difference between having *some* level of support and requiring it. (*some* meaning the initial application of PS3.0 technology to games, which will likely be as sloppy as your first time in the back of a car with your first girlfriend).

    Game makers will not require PS3.0 support for a long long long time... because it would alienate the vast majority of the people out there, or at least for the time being any person who doesn't have a NV40 card.

    Some games may implement it and look slightly better, or even look the same but run faster.... but I would put money down that by the time PS3.0 usage in games comes anywhere close to mainstream, both manufacturers will have their new, latest-and-greatest cards out, probably two generations or more past these cards.
  • xin - Friday, May 7, 2004 - link


    first of all... "a lot of the upcoming top games will support PS3.0!" ??? They will? Which ones, exactly?
  • Z80 - Friday, May 7, 2004 - link

    Good review. Pretty much tells me that I can select either NVIDIA or ATI with confidence that I'm getting a lot of "bang for my buck". However, my buck bang for video cards rarely exceeds $150, so I'm waiting for the new low- to mid-range cards before making a purchase.
  • xin - Friday, May 7, 2004 - link


    I love how a handful of stores out there feel the need to rip people off by charging $500+ for the X800 Pro cards, since the XT isn't available yet.

    Anyway, something interesting I noticed today:

    http://www.compusa.com/products/product_info.asp?p...

    http://www.compusa.com/products/product_info.asp?p...

    Notice the "expected ship date"... at least they have their pricing right.
  • a2y - Friday, May 7, 2004 - link

    Trog, I also agree. The thing is, it's true I do not have complete knowledge of the deep details of video cards. You see, my current video card is now 1 year old (a GeForce4 MX440), which is terrible for gaming (50fps and less), and some games actually do not support it (like Deus Ex 2). I wanted a card that would be future-proof; every consumer goes thinking this way. I do not spend everything I earn, but to me and some others $400-$500 is OK if it means it's going to last a bit longer.
    I especially worry about the technology used more than the other specs of the cards; more technologies mean future games are going to support it. I DO NOT know what I've just said actually means, but I have felt it during the past few years and am affected by it right now (like the Deus Ex 2 problem!). It just doesn't support it, and my card performs TERRIBLY in all games.

    Now, my system is relatively slow for hardcore gaming:
    P4 2.4GHz - 512MB RDRAM PC800 - 533MHz FSB - 512KB L2 cache - 128MB GeForce4 MX440 card.

    I wanted a big jump in performance, especially in gaming, so that's why I wanted the best card currently available.
  • NullSubroutine - Thursday, May 6, 2004 - link

    Trog, I agree with you for the most part, but there are some people who can use upgrades. I myself have bought expensive video cards in the past. I got the GeForce3 right when it came out (in a top-of-the-line Alienware system for 1400 bucks), and it lasted me 2-3 years. Now, if someone spends 400-500 bucks on a video card that lasts them that long (2-3 years), it's no different than someone buying a 200-buck video card every year. I am one of those people who likes to buy new components when computing speed doubles, and if I have the money I'll get whatever will last me the longest. If I can't afford top of the line, I'll get something that will get me by (a 9500 Pro was the last card I bought, for $170 over a year ago).

    However, I do agree with you that upgrading to the best card every generation is silly.
  • TrogdorJW - Thursday, May 6, 2004 - link

    I'm sorry, but I simply have to laugh at anyone going on and on about how they're going to run out and buy the latest graphics cards from ATI or Nvidia right now. $400 to $500 for a graphics card is simply too much (and it's too much for a CPU as well). Besides, unless you have some dementia that requires you to run all games at 1600x1200 with 4xAA and 8xAF, there's very little need for either the 6800 Ultra or the X800 XT right now. Relax, take a deep breath, save some money, and forget about the pissing contest.

    So, is it just me, or is there an inverse relationship between what a person spends on computer hardware and their actual knowledge of computers? I have a coworker who is always spending money on upgrading his PC, and he really has no idea what he's doing. He went from an Athlon XP 2800+ (OC'ed to 2.4GHz) to a P4 2.8 OC'ed to 3.7GHz. He also went from a 9800 Pro 256 to a 9800 XT. In the past, he also had a GeForce FX 5900 Ultra. He tries to overclock all of his systems, they sound like a jet engine, and none of them are actually fully stable. In the last year, he has spent roughly $5000 on computer parts (although he has sold off some of the "old" parts, like the 5900 Ultra). Performance of his system has probably improved by about 25% over the course of the year.

    Sorry for the rant, but behavior like that from *anybody* is just plain stupid. He's gone from 120 FPS in some games up to 150 FPS. Anyone here actually think he can tell the difference? I suppose it goes without saying that he's constantly crowing about his 3DMark scores. Now he's all hot to go out and buy the X800 XT cards, and he's been asking me when they'll be in stores. Like I care. They're nice cards, I'm sure, but why buy them before you actually have a game that needs the added performance?

    His current games du jour? Battlefield 1942 and Battlefield Vietnam. Yeah... those really need a high-performance DX9 card. The 80+ FPS of the 9800 XT he has just isn't cutting it.

    So, if you read my description of this guy and think I'm way off base, go get your head examined. Save your money, because some day down the road you will be glad that you didn't spend everything you earned on computer parts. Enjoy life, sure, but having a faster car, faster computer, bigger house, etc. than someone else is worth pretty much jack and shit when it all comes down to it.

    /Rant. :D
  • a2y - Thursday, May 6, 2004 - link

    If a new card is going to come out every few weeks, then how do you guys choose which to buy?

    ATI has the trade-up section for old cards; is that any good?
  • gxshockwav - Thursday, May 6, 2004 - link

    Um... what happened to the posting of new GeForce 6850 benchmark numbers?
  • NullSubroutine - Thursday, May 6, 2004 - link

    Trog, it's good to hear you were being nice, but I wasn't bashing THG. I love that site (besides this one) and I get a lot of my tech info from there.

    What I normally do, though, is take benchmarks from different sites, put them in Excel, make a little graph, and see the percentage-point differences between the tests. If you plan on buying a new vid card, it's important to find out whether the NVIDIA or ATI card is faster on your type of system.

    And what I found is that the AMD system from Atech performed better with NVIDIA, and the Intel system from THG performed better with ATI (Far Cry and Unreal 2004 were the only somewhat comparable tests).
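
    As a rough illustration of that spreadsheet math - with purely made-up numbers, not actual review scores - the same percentage comparison can be done in a few lines of Python:

        # Toy example of comparing two cards' FPS across two review sites.
        # Every number here is an invented placeholder, not a real benchmark.
        scores = {
            "SiteA": {"6800U": 80.0, "X800XT": 85.0},
            "SiteB": {"6800U": 95.0, "X800XT": 90.0},
        }

        for site, fps in scores.items():
            # Percent difference of the ATI card relative to the NVIDIA card.
            pct = (fps["X800XT"] - fps["6800U"]) / fps["6800U"] * 100
            print(f"{site}: X800XT is {pct:+.1f}% vs the 6800U")

    If the sign flips from one site to the next, like it did between the AMD and Intel testbeds, the platform itself is probably part of the story.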

    #61 How much money did ATI spend when developing the R3xx line? I would venture to say a decent amount... sometimes companies invest more money in a design, then refine it several times (at less cost) before starting from scratch again. ATI and NVIDIA have done this for quite a while. Also, from what I've heard, the R3xx had the possibility of 16 pipes to begin with... is this true, anyone?

    Texture memory above 256MB doesn't really matter now because of the insane bandwidth 8x AGP has to offer; however, 512MB may come in handy after Doom 3 comes out, since it uses shitloads of high-res textures instead of high polygons for a lot of detail. I don't see 512MB coming out for a little while, especially with RAM prices.
  • deathwalker - Thursday, May 6, 2004 - link

    Well... once again, someone is lying through their teeth. What happened to the $399 entry price of the Pro model? The cheapest price on Pricewatch is $478. Someone trying to cash in on the new-buyer hysteria? I am impressed, though, with ATI's ability to step up to the plate and steal NVIDIA's thunder.
  • a2y - Thursday, May 6, 2004 - link

    OMG OMG!! I was about to go buy and build a new system with the latest specs and graphics card, and was going for the NVIDIA 6800 Ultra, until just now I decided to check for news from ATI and discovered their new card!

    Man, if ATI and NVIDIA are going to bring out a card every 2-3 weeks, then I'll never be able to build this system!!!

    Being a (pre-)fan of Half-Life 2, I guess I'm going to wait until it's released to buy a graphics card (meaning when we all die and go to hell).
  • remy - Wednesday, May 5, 2004 - link

    For the OpenGL vs. D3D performance argument, don't forget to take a look at Homeworld 2, as it is an OpenGL game. ATI's hardware certainly seems to have come a long way since the 9700 Pro in that game!
  • TrogdorJW - Wednesday, May 5, 2004 - link

    NullSubroutine - It was meant as nice sarcasm, more or less. No offense intended. (I was also trying to head off this thread becoming a "THG sucks blah blah blah" tangent, as many in the past have done when someone mentions their reviews.)

    My basic point (without doing a ton of research) is that pretty much every hardware site has its own demos that it uses for benchmarking. Given that the performance difference between the ATI and NVIDIA cards was relatively constant (I think), it's generally safe to assume that the levels, setup, bots, etc. are not the same when you see differing scores. Now, if you see two places using the same demo and the same system setup, and there's a big difference, then you can worry. I usually don't bother comparing benchmark numbers from two different sites, since they are almost never the same configuration.
  • Pumpkinierre - Wednesday, May 5, 2004 - link

    Sorry, scrub that last one. I couldn't help it. I will reform.
  • Pumpkinierre - Wednesday, May 5, 2004 - link

    So, which is better: an A64 at 2GHz or a P4 at 3.2GHz?
  • jibbo - Wednesday, May 5, 2004 - link

    "Zobar is right; contra Jibbo, the increased flexibility of PS3 means that for many 2.0 shader programs a PS3 version can achieve equivalent results with a lesser performance hit."

    I think you're both still missing my point. There is nothing that says PS3.0 is faster than PS2.0. You are both correct that it has the potential to be faster, though you both assume that a first-generation PS3.0 architecture will perform at the same level as a refined PS2.0 architecture.

    PS3.0 is one of the big reasons that nVidia's die size and transistor count are bigger than ATI's. The additional power drain (and consequently heat dissipation) of those 40M transistors also helps to limit the clock speeds of the 6800. When you're talking about ALU ops per second (which dominate math-intensive shaders), these clock speeds become very important. A lot of the 6800's speed for PS3.0 will have to be found in the driver optimizations that will compile these shaders for PS3.0. Left to itself, ATI's raw shader performance still slaughters nVidia's.

    They both made trade-offs, and it seems that ATI is banking that PS3.0 won't be a dealbreaker in 2004. Only time will tell....
  • Phiro - Wednesday, May 5, 2004 - link

    K, I found the $400M that the CEO claimed. He claimed $400M for the NV3x core as well. It seemed more like a boast than anything, not particularly scientific or exact.

    In any case, ATI supposedly spent $165-180M last year (2003) on R&D, with an estimated increase of 100% for this year. How long has the 4xx core been in development?

    Regardless, ultimately we the consumers are the winners. Whether or not the R&D spend pans out will become clear over the next couple of years, as supposedly the NV4x core has a 24-month lifespan.

  • 413xram - Wednesday, May 5, 2004 - link

    If you watch NVIDIA's launch video on their site, they mention the R&D costs for their new card.
  • RyanVM - Wednesday, May 5, 2004 - link

    Whatever happened to using ePSXe as a video card benchmark?
  • Phiro - Wednesday, May 5, 2004 - link

    Well, Nvidia may have spent $400M on this (I've never seen that number before but we'll go with it I guess) but they paid themselves for the most part.

    ATI's cost can't be trivialized too much - didn't they drop a product design or two in favor of getting this out the door instead? And any alteration in the architecture of something doesn't really qualify as a hardware "refresh" in my book - a hardware refresh for an OEM consists of maybe one speed-notch increase in the RAM, a new BIOS, a larger default HD, stuff like that. MLK is what Dell used to call it - Mid Life Kick.
  • retrospooty - Wednesday, May 5, 2004 - link

    "Precisely. By the time 512mb is useful, the card will be too slow for it to matter, and you'd need a new card any way."

    True...

    Both cards perform great, both have wins and losses depending on the game. The deciding factor will be price and power requirements.

    Since prices will adjust downward at a fairly equal rate, that leaves power. With power requirements being so incredibly high on the NV40, that leans me toward ATI.

    413xram also has a good point above. For NVIDIA, this is a 400-million-dollar new chip design. For ATI, this was a refresh of an old design to add 16 pipes and a few other features. After the losses NVIDIA took with the heavily flawed NV30 and NV35, they need a financial boon, and this isn't it.

  • mattsaccount - Wednesday, May 5, 2004 - link

    There are no games available today that use 256MB of video RAM, let alone 512MB. Even upper-high-end cards routinely come with 128MB (e.g., GeForce FX 5900, Radeon 9600 XT). It would not make financial sense for a game developer to release a game that only a small fraction of the community could run acceptably.

    >> I have learned from the past that future possibilties of technology in hardware does nothing for me today.

    Precisely. By the time 512MB is useful, the card will be too slow for it to matter, and you'd need a new card anyway.
  • 413xram - Wednesday, May 5, 2004 - link

    #64 Can you explain "gimmick"?
  • 413xram - Wednesday, May 5, 2004 - link

    They announced they were going to in their release anyway, later on this summer. Why not now?
  • jensend - Wednesday, May 5, 2004 - link

    #61 - nuts. 512MB of RAM will pull loads more power, put out a lot more heat, cost a great deal more (especially now, since RAM prices are sky-high), and give negligible if any performance gains. Heck, even 256MB is still primarily a marketing gimmick.
  • 413xram - Wednesday, May 5, 2004 - link

    They (ATI) are using the same technology that their previous cards used. They pretty much just added more transistors to perform more functions at a higher speed. I am willing to bet my paycheck that they spent nowhere close to 400 million dollars to run neck and neck with NVIDIA in performance. I guess "virtually nothing" was an overstatement. My apologies.
  • Phiro - Wednesday, May 5, 2004 - link

    Where do you get your info that ATI spent "virtually nothing"?
  • 413xram - Wednesday, May 5, 2004 - link

    Both cards perform brilliantly. They are truly a huge step in graphics processing. One problem I foresee, though, is that NVIDIA put 400 million dollars into development of their new NV40 technology, while ATI spent virtually nothing to achieve the same performance gains. Economically, that is a hard pill for NVIDIA to swallow.

    It is true that NVIDIA's card has 3.0 pixel shading; unfortunately, though, they are banking on hardware that is not supported upon release of the card. In dealing with video cards from a consumer's standpoint, that is a hard sell. I have learned from the past that the future possibilities of technology in hardware do nothing for me today. Not to mention the power supply issue, which doesn't help either.

    NVIDIA must find a way to get better performance out of their new card (I can't believe I'm saying that after seeing the specs it already performs at), or it may be a long, HOT, and expensive summer for them.

    P.S. NVIDIA, a little advice: speed up the release of your 512MB card. That would definitely sell me. Overclocking your 6800 is something that 90% of us in this forum would do anyway.
  • theIrish1 - Wednesday, May 5, 2004 - link


    heh, whatever.. whatever, and whatever. I love the fanboyisms....

    I admit I am a fan of ATI cards. I bought a 9700 Pro and a 9500 Pro (in my secondary gaming rig) when they first came out, and an 8500 "Pro" before that... but now I want to upgrade again. I am keeping an open mind. After looking at benchmarks, it is clear that both cards have their wins and losses depending on the test. I don't think there is a clear-cut winner. NVIDIA got there by new innovation/technology. ATI got there by optimizing "older" technology.

    At this point, with pricing being the same... I think I still have to lean toward the ATI cards, the main reasons being heat and power consumption. If the 6800U were $75 or $100 cheaper, I would probably go with that. It will be interesting to see where the 6850 falls benchmark-wise, and also in pricing. If the 6850 takes the $500 price point, where will that leave the 6800U? $450? Or will the 6850 be $550?

    Something else about the X800 Pro (which, by the way, a lot of the readers/posters seem to be confusing with the XT model). Anyway, there are a few online stores out there still taking pre-orders for the X800 Pro... for $500+. I thought the Pro was going to go for $400 and the XT for $500...?!?
  • Pumpkinierre - Wednesday, May 5, 2004 - link

    On the fabrication of the two GPUs - The Tech Report:

    "Regardless, transistor counts are less important, in reality, than die size, and we can measure that. ATI's chips are manufactured by TSMC on a 0.13-micron, low-k "Black Diamond" process. The use of a low-capacitance dielectric can reduce crosstalk and allow a chip to run at higher speeds with less power consumption. NVIDIA's NV40, meanwhile, is manufactured by IBM on its 0.13-micron fab process, though without the benefit of a low-k dielectric."

    The extra transistors of the 6800U might be taken up by the cinematic encoding/rendering embedded chip. Although ATI claims encoding in their X800 Pro/XT blurb, I haven't seen much yet to distinguish it from the 9800 Pro in this field. The Tech Report checked power consumption at the wall for their test systems, and the 6800s ramp up the power a lot quicker with GPU speed, so I'm not too hopeful about the overclock to 520MHz or about 6800U Extreme GPU yields. Still, maybe a new stepping or a 90nm SOI shrink might help (I noticed both manufacturers shied away from 90nm).

    Anyway, brilliant video cards from North America. Congratulations, ATI and NVIDIA!

  • NullSubroutine - Wednesday, May 5, 2004 - link

    If it was nice sarcasm, I can laugh; if it was nasty sarcasm, you can back off. I can see it would be simple for me to overlook the map used; however, there was no indication of what Atech used. One could assume, or someone could ask for the real answer, and if they are really lucky they will get a smart-ass remark.

    After checking through 10 different reviews, I found results similar to Atech's when they had 25 bots; THG had none.

    Next time save us both the hassle and just say THG didn't use bots, and Atech probably did.
  • TrogdorJW - Tuesday, May 4, 2004 - link

    #54 - Think about things for a minute. Gee... I wonder why THG and AT got such different scores on UT2K4.... Might it be something like the selection of map and the demo used? Nah, that would be too simple. /sarcasm

    From THG: "For our tests in UT2004 we used our own timedemo on the map Assault-Torlan (no bots). All quality options are set to maximum."

    No clear indication of what was used for the map or demo on AT, but I'm pretty sure that it was also a home-brewed demo, and likely on a different map and perhaps with a different number of players. Clearly, though, it was not the same demo as THG used... unless THG is in the habit of giving their benchmarking demos out? Didn't think so.

    I see questions like this all the time. Unless two sites use the exact same settings, it's almost impossible to directly compare their scores. There is no conspiracy, though. Both sites pretty much say the same thing: close match, with the edge going to ATI right now, especially in DX9, while NV still reigns supreme in OGL.
  • TrogdorJW - Tuesday, May 4, 2004 - link

    Nice matchup we've got here! Just what we were all hoping for. Unfortunately, there are some disappointing trends I see developing....

    In ShaderMark 2.0, we see many instances where the R420 is about 25% faster than the NV40. Let's see... 520MHz vs 400MHz. 'Nuf said, I think. Too bad for NVIDIA that they have 222 million transistors, so they're not likely to be able to reach 500MHz any time soon. (Or if they can, then ATI can likely reach 600+ MHz.)

    How about the more moderately priced card matchup? The X800 Pro isn't looking that attractive at $400. 25% more price gets you 33% more pipelines, which will probably help out in games that process a lot of pixels. And the Pro also has 4 vertex pipelines compared to 6? The optimizations make it better than a 9800 XT, but not by a huge margin. The X800 SE with 8 pipelines is likely going to be about 20% faster than a 9800 XT. Hopefully, it will come in at a $200 price point, but I'm not counting on that for at least six months. (Which is why I recently purchased a $200 9800 Pro 128.)
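
    Back-of-the-envelope, treating the launch specs above as given (520MHz vs 400MHz clocks, a $400 12-pipe Pro vs a $500 16-pipe XT), the ratios work out like this in Python:

        # Quick sanity check of the ratios quoted above; the figures are
        # the rumored launch specs, so treat them as assumptions.
        xt_clock, nv40_clock = 520, 400   # MHz
        print(f"XT clock advantage: {(xt_clock - nv40_clock) / nv40_clock:.0%}")  # 30%

        pro_price, xt_price = 400, 500    # USD launch prices
        pro_pipes, xt_pipes = 12, 16
        print(f"XT price premium:   {(xt_price - pro_price) / pro_price:.0%}")    # 25%
        print(f"XT extra pipelines: {(xt_pipes - pro_pipes) / pro_pipes:.0%}")    # 33%

    Assuming shader throughput scales roughly with clock, the ~25% ShaderMark gap tracks that 30% clock gap fairly closely.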

    The Nvidia lineup is currently looking a little nicer. The 6800 Ultra, Ultra Special, and GT all come with 16 pipelines, and there's talk of a lower clocked card for the future. If we can get a 16 pipeline card (with 6 vertex pipelines) for under $250, that would be pretty awesome. That would be a lot like the 5900 XT cards. Anyone else notice how fast the 9800 Pro prices dropped when Nvidia released the 5900 XT/SE? Hopefully, we'll see more of that in the future.

    Bottom line has to be that for most people, ATI is still the choice. (OpenGL gamers, Linux users, and professional 3D types would still be better off with NVIDIA, of course.) After all, the primary benefit of NV40 over R420 - Shader Model 3.0 - won't likely come into play for at least six months to a year. Not in any meaningful way, at least. By then, the fall refresh and/or next spring will be here, and ATI could be looking at SM3.0 support. Of course, adding SM3 might just knock the transistor counts of ATI's chips up into the 220 million range, which would kill their clock speed advantage.

    All told, it's a nice matchup. I figure my new 9800 Pro will easily last me until the next generation cards come out, though. By then I can look at getting an X800 Pro/XT for under $200. :)
  • NullSubroutine - Tuesday, May 4, 2004 - link

    I forgot to ask if anyone else noticed a huge difference (almost double) between AnandTech's Unreal Tournament 2003 scores and those of Tom's Hardware?

    (It's not the CPU difference, because the A64 3200+ had a baseline score of ~278 and the 3.2GHz P4 had ~247 in a previous section.)

    So what gives?
  • NullSubroutine - Tuesday, May 4, 2004 - link

    To the guy talking about the 400MHz and the 550MHz, I have this to say.

    I agree with the other guy about the transistor count.

    Don't forget that ATI's cards used to be more powerful per clock compared to NVIDIA's a generation or two ago. So don't be babbling fanboy stuff.

    I would agree with that one guy (the # escapes me) about the fanboy stuff, but I said it first! On this thread, anyway.
  • wassup4u2 - Tuesday, May 4, 2004 - link

    #30 & 38, I believe that while the ATI line is fabbed at TSMC, NVidia is using IBM for their NV40. I've heard also that yields at IBM aren't so good... which might not bode well for NVidia.
  • quanta - Tuesday, May 4, 2004 - link

    > #14, I think it has more to do with the fact that those OpenGL benchmarks are based on a single engine that was never fast on ATI hardware to begin with.

    Not true. ATI's FireGL X2 and the Quadro FX 1100 were evenly matched in workstation OpenGL tests[1], which do not use Quake engines. Considering the FireGL X2 is based on the Radeon 9800XT and the Quadro FX 1100 is based on the GeForce FX 5700 Ultra, such a result is unacceptable. If I were an ATI boss, I would have made sure the OpenGL driver team did not make such a blunder, especially when the R420 still sucks in most OpenGL games compared to GeForce 6800 Ultra cards.

    [1] http://www.tomshardware.com/graphic/20040323/index...
  • AlexWade - Tuesday, May 4, 2004 - link

    From my standpoint the message is clear: NVIDIA is no longer THE standard in graphics cards. Why do I say that? It's half the size, it requires less power, it has fewer transistors, and the performance is about the same. Even if the performance were slightly less, ATI would still be the winner. Anyway, whatever, it's not like these benchmarks will deter the hardcore gotta-have-it-now fanboys.

    It's not like I'm going to buy either. Maybe this will lower the prices of all the other video cards. $Dreams$
  • rsaville - Tuesday, May 4, 2004 - link

    If any 6800 users are wondering how to make their 6800 run the same shadows as the 5950 in the benchmark, see this post:
    http://forums.relicnews.com/showthread.php?p=39462...

    Also, if you want to make your GeForce FX run the same shadows as the rest of the PS2.0-capable cards, find the file called driverConfig.lua in the homeworld2\bin directory and remove line 101, which disables fragment programs.
  • raskren - Tuesday, May 4, 2004 - link

    I wonder if this last line of AGP cards will ever completely saturate the AGP 8X bus. It would be interesting to see a true PCI-Express card compared to its AGP 8X counterpart.

    Remember when Nvidia introduced the MX440 (or was it 460?) with an 8X AGP connector...what a joke.
  • sisq0kidd - Tuesday, May 4, 2004 - link

    That was the cheesiest line, #46, but very true...
  • sandorski - Tuesday, May 4, 2004 - link

    There is only 1 clear winner here, the Consumer!

    ATI and NVIDIA are running neck and neck.
  • rms - Tuesday, May 4, 2004 - link

    "the near-to-be-released goodlooking PS 3.0 Far Cry update "

    When is that patch scheduled for? I recall seeing some rumour it was due in September...

    rms
  • Fr0zeN - Tuesday, May 4, 2004 - link

    Yeah, I agree, the GT looks like it's gonna give the X800 Pro a run for its money. On a side note, the differences between the Pro and XT versions seem to be greater than the R9800's were, hmm.

    In the end it's the most overclockable $200 card that'll end up in my comp. There's no way I'm paying $500 for something that I can compensate for by turning the rez down to 10x7... Raw benchmarks mean nothing if it doesn't oc well!
  • Doop - Tuesday, May 4, 2004 - link

    The cards seem very close. I tend to favor NVIDIA now, since they have superior multi-monitor and professional 3D drivers, and I regret buying my Fire GL X1.

    It's strange that ATI didn't announce a 16-pipeline card originally; it will be interesting to see in a month or two who actually ends up delivering cards.

    I mean, if they're being made in significant quantities, they'll be at your local store with a reduced "street" price, but if it's just a paper launch they'll only be at Alienware or Dell (with a new PC only), or $500 if you can find one.
  • jensend - Tuesday, May 4, 2004 - link

    #17, the Serious Engine has nothing to do with the Q3 engine; Nvidia's superior OpenGL performance is not dependent on any handful of engines' particular quirks.

    Zobar is right; contra Jibbo, the increased flexibility of PS3 means that for many 2.0 shader programs a PS3 version can achieve equivalent results with a lesser performance hit.

    As far as power goes, I'm surprised NV made such a big deal out of PSU requirements, as its new cards (except the 6800U Extremely Short Production Run Edition/6850U/Whatever they end up calling that part) compare favorably wattage-wise to the 5950U and don't pull all that much more power than the 9800XT. Both companies have made a big performance per watt leap, and it'll be interesting to see how the mid-range and value cards compare in this respect.
  • blitz - Tuesday, May 4, 2004 - link

    "Of course, we will have to wait and see what happens in that area, but depending on what the test results for our 6850 Ultra end up looking like, we may end up recommending that NVIDIA push their prices down slightly (or shift around a few specs) in order to keep the market balanced."

    It sounds as if you are giving NVIDIA advice on their pricing strategy; somehow I don't think they would listen to or be influenced by your opinion. It would be better phrased as advising consumers to wait for prices to drop or look elsewhere for a better price/performance ratio.
  • Cygni - Tuesday, May 4, 2004 - link

    Hmmmm, interesting. I really don't see where anyone can draw the conclusion that the X800 Pro is CLEARLY the winner. The 6800 GT and X800 Pro traded game wins back and forth. There doesn't seem to be any clear-cut winner to me. Wolf, JediA, X2, F1C, and AQ3 all went clearly to the GT... this isn't open and shut. A lot of the other tests were split depending on resolution/AA. On the other hand, I don't think you can say that the GT is clearly better than the X800 Pro either.

    Personally, I will buy whichever one hits a reasonable price point first. $150-200. Both seem to be pretty equal, and to me, price matters far more.
  • kherman - Tuesday, May 4, 2004 - link

    BRING ON DOOM 3!!!!!!

    We all know inside that this is what ID was waiting for!
  • Diesel - Tuesday, May 4, 2004 - link

    ------------------
    I think it is strange that the tested X800XT is clocked at 520MHz, while the 6800U, which is manufactured by the same Taiwanese company and also has 16 pipelines, is set at 400MHz.
    ------------------

    This could be because NV40 has 222M transistors vs. R420 at 160M transistors. I think the amount of power required and heat generated is proportional to transistor count and clock speed.
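
    As a crude first-order sketch of that proportionality - dynamic switching power scaling with transistor count times clock, ignoring voltage, activity factor, and process differences like low-k, so treat it purely as illustration:

        # Rough proxy: switching activity ~ transistors (millions) x clock (MHz).
        # This deliberately ignores voltage, process, and activity factor.
        nv40_proxy = 222 * 400   # 88,800
        r420_proxy = 160 * 520   # 83,200
        print(f"NV40/R420 proxy ratio: {nv40_proxy / r420_proxy:.2f}")   # ~1.07

    By that naive measure the two chips land surprisingly close, which fits the idea that NV40's extra transistors roughly cancel out its clock deficit.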
  • edub82 - Tuesday, May 4, 2004 - link

    I know this is an ATI article, but that 6800 GT is looking very attractive. It beats the X800 Pro on a fairly regular basis, is a single-slot card with one Molex connector, and starts at $400; hopefully it will go down a few dollars ;) in 6 months when I want to upgrade.
  • Slaanesh - Tuesday, May 4, 2004 - link

    "Clearly a developer can have much nicer quality and exotic effects if he/she exploits these, but how many gamers will have a PS3.0 card that will run these extremely complex shaders at high resolutions and AA/AF without crawling to single-digit fps? It's my guess that it will be *at least* a year until games show serious quality differentiation between PS2.0 and PS3.0. But I have been wrong in the past..."
    --------

    I dunno... When Morrowind was released, only the few GF3 cards on the market were able to show the cool pixel-shader water effects, and they did it well; at the time I was really pissed that I had gone for the cheaper GeForce2 Ultra, even though it had somewhat better benchmarks at a much lower price. I don't think I want to make that mistake again and pay the same amount of money for a card that doesn't support the latest technology...
  • ZobarStyl - Tuesday, May 4, 2004 - link

    Jibbo, I thought that the dynamic branching capability in PS3.0 could make rendering a scene faster because it skips rendering unnecessary pixels, and thus could offer an increase in performance, albeit a small one. In an interview, one of the developers of Far Cry said that there weren't many things PS3.0 could do that 2.0 can't, but that 3.0 can do things in a single pass that a 2.0 shader would have to do in multiple passes. The way he described it, the really pretty effects can come later, but a streamlined (read: slightly faster) shader could very well improve NV40 scores as is. This seems kind of analogous to the whole 64-bit processor ordeal going on: Intel says you don't need it, but then most articles show higher scores from A64 chips when they are running a 64-bit OS, so basically if you streamline it you can run a little faster than in less efficient 32-bit.
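
    To make the branching point concrete, here's a toy Python model. The per-pixel costs and the fraction of pixels needing the expensive path are invented, and real GPUs branch on whole groups of pixels at once, so this only illustrates the idea:

        import random
        random.seed(0)

        PIXELS = 100_000
        CHEAP, EXPENSIVE = 2, 20   # arbitrary per-pixel instruction costs
        needs_full = [random.random() < 0.3 for _ in range(PIXELS)]

        # PS2.0-style: no dynamic branch, so every pixel pays the full cost.
        flat = PIXELS * (CHEAP + EXPENSIVE)

        # PS3.0-style: branch out early when the expensive path isn't needed.
        branched = sum(CHEAP + (EXPENSIVE if f else 0) for f in needs_full)

        print(f"Work saved by early-out: {1 - branched / flat:.0%}")   # ~64% with this seed

    The single-pass vs multi-pass point is the same effect from the other side: folding several 2.0 passes into one 3.0 shader saves the per-pass overhead even when the math is identical.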

    In the end, it'll still be bitter fanboys fighting it out and buying whatever product their respective corporation feeds them, despite features or speeds or price or whatever. Personally, like I said before, I'll wait and see who really ends up earning my dollar.

    Anyway, thanks for keeping me on my toes though, jib...I can't get lazy now... =)
  • Barkuti - Tuesday, May 4, 2004 - link

    From my point of view, the 6800U is superior high-end hardware. Folks, you don't need to be that intelligent to understand that if ATI needs 520MHz to "beat" NVIDIA's 400MHz chip, it will need to overclock proportionally to keep the same level of performance, which means it will need a good bunch of extra MHz to stay at least on par on the overclocking front.

    I think the final revision of the 6800U will manage 500MHz overclocks or thereabouts (probably more if they deliberately set the initial clock low while waiting for ATI), so ATI's hardware may need around 650MHz, which I doubt it'll manage. As for power requirements, sure, ATI is the winner, but NVIDIA's card can be fed by more standard PSUs than they claim; I just think they played it safe.
    Oh, sure, power may be a limiting factor when OC'ing the 6800U, but the reality is that people who buy this kind of hardware already have top-end computer components (including the PSU), so no worries here either.

    And finally, I think PS3.0 will make some additional difference. With the possibility of somewhat enhanced shader performance and the superior displacement mapping effect, it may give NVIDIA the edge in at least a handful of games. We'll see.

    "Just my 2 cents"
    Cheers
  • Staples - Tuesday, May 4, 2004 - link

    Everyone be sure to check out Tom's review. It looks like the X800 did better here than it did there against the 6800. I have seen other reviews, and the X800 doesn't really seem as fast in comparison as it does here.

    Anyway, it is a lot faster than I thought. The 6800 was impressive, but it seems the reason it does really well in some games and not so great in others is that some games have NVIDIA-specific code that the 6800 takes advantage of very well.
  • UlricT - Tuesday, May 4, 2004 - link

    wtf? the GT is outperforming the Ultra in F1 Challenge?
  • jibbo - Tuesday, May 4, 2004 - link

    Agree with you all the way on the fanboys, ZobarStyl.

    Just wanted to point out that PS3.0 is not "faster" - it's simply an API. It allows longer and more complex shaders, so, if anything, it's likely to be "slower." I'm guessing that designers who use PS3.0 heavily will see serious fill-rate problems on the 6800. These shaders will have potentially 65k+ instructions with dynamic branching, a minimum of 4 render targets, a 32-bit FP minimum color format, etc. - I seriously doubt any hardcore 3.0 shader programs will run faster than existing 2.0 shaders.

    Clearly a developer can have much nicer quality and exotic effects if he/she exploits these, but how many gamers will have a PS3.0 card that will run these extremely complex shaders at high resolutions and AA/AF without crawling to single-digit fps? It's my guess that it will be *at least* a year until games show serious quality differentiation between PS2.0 and PS3.0. But I have been wrong in the past...
  • T8000 - Tuesday, May 4, 2004 - link

    I think it is strange that the tested X800XT is clocked at 520MHz, while the 6800U, which is manufactured by the same Taiwanese company and also has 16 pipelines, is set at 400MHz.

    This suggests a lot of headroom on the 6800U or a large overclock on the X800XT.

    Also note that the 6800U scored much better on tomshardware.com (Halo: 65FPS@1600x1200), but that can also be caused by their use of a 3.2GHz P4 instead of a 2.2GHz A64.
  • ZobarStyl - Tuesday, May 4, 2004 - link

    I love seeing these fanboys announce each product as the best thing ever (the same thing happened with the Prescott: Intel fanboys called it the end of AMD, and the AMD guys laughed and called it a flamethrower) without actually reading the benches. NV won some, ATI won some. Most of the time it was tiny margins either way. Fanboys aside, this is gonna be a driver war, nothing more. The biggest margin was in Far Cry, and I'm personally waiting on the faster PS3.0 path to see what that bench really is. This is a great card, but price drops and driver updates will eventually show us the real victor.
  • jibbo - Tuesday, May 4, 2004 - link

    If I had to guess, DX10 and Longhorn will coincide with the release of new hardware from everyone.
  • Akaz1976 - Tuesday, May 4, 2004 - link

    Just thought of something. If I am reading the AT review right, ATI has now milked the original Radeon 9700 architecture for nearly 2 years (which sure says a lot of good things about the ArtX design team).

    Anyone know when the true next-gen chip can be expected?

    Akaz
  • Ilmater - Tuesday, May 4, 2004 - link

    ---------------------------------------
    Hearing about the 6850 and the other Emergency-Extreme-Whatever 6800 variants that are floating about irritates me greatly. Nvidia, you are losing your way!

    Instead of spending all that time, effort and $$ just to try to take the "speed champ" title, make your shit that much cheaper instead! If your 6800 Ultra was $425 instead of $500, that would give you a hell of a lot more market share and $$ than a stupid Emergency Edition of your top-end cards... We laugh at Intel for doing it, and now you're doing it too, come fricking on...
    --------------------------------------------
    This is ridiculous!! What do you think the XT Platinum Edition from ATI is? The only difference is that nVidia released first, so it's more obvious when they do it than when ATI does. I'm not really a fanboy of either, but you shouldn't dog nVidia for something that everyone does.

    Plus, if nVidia dropped their prices, ATI would do the same thing. Then nVidia would be right back where it was before, but they wouldn't be making any money on the cards.
  • adntaylor - Tuesday, May 4, 2004 - link

    I wish they'd also tested with an nForce3 motherboard. NVIDIA has managed some very interesting performance enhancements in the AGP-to-HT tunnel that only work with NVIDIA graphics cards. That might have pushed the 6800 in front - who knows!
  • UlricT - Tuesday, May 4, 2004 - link

    Hey... Though the review rocks, you guys desperately need an editor for spelling and grammar!
  • Jeff7181 - Tuesday, May 4, 2004 - link

    This pretty much settles it. With the excellent comparison between architectures, and the benchmark scores to prove the advantages and disadvantages of each architecture... my next card will be made by ATI.
    NV40 sure has a lot of potential; one might say it's ahead of its time, supporting SM3.0 and being so programmable. However, with a product cycle of 6 months to a year, being ahead of its time is more of a disadvantage in this case. People don't care what it COULD do... people care what it DOES do... and the R420 seems to do it better. I just hope my venture into the world of ATI doesn't turn into driver hell.
  • NullSubroutine - Tuesday, May 4, 2004 - link

    I'm a fanboy of neither company, and objectively I can say the cards are equal. In some games the ATI cards are faster; in other games the NVIDIA cards are faster. So which one is better depends on the game you play and the price of the card you are looking for. (Hmm, maybe motherboard companies could make 2 AGP slots...)

    About the PS2.0/3.0 argument...

    2.0 cards will be able to play games that use 3.0; they may not have full functionality, or they may run them slower. This remains to be seen until games begin to use 3.0. However...

    The one bad thing for NVIDIA in my eyes is the pixel shader quality that can be seen in Far Cry; whether this is a game or driver glitch is still unknown.

    I forgot to add that I like that the ATI cards use less power; I don't want to have to pay for another PSU on top of the already high prices of video cards. I would also like to see another review a month from now, when newer drivers come out, to see how much things have changed.
  • l3ored - Tuesday, May 4, 2004 - link

    Pschhh, did you see the Unreal 3 demo? In the video I saw, it looked like it ran at about 5fps. Imagine running Halo on an FX 5200. However, you could run it if you were to turn off Halo's PS2.0 effects. I think that's how it's going to be with Unreal 3.
  • Slaanesh - Tuesday, May 4, 2004 - link

    Since PS3.0 is not supported by the X800 hardware, does this mean that those extremely impressive graphical features shown in the Unreal 3 tech demo (NV40 launch) and the soon-to-be-released good-looking PS3.0 Far Cry update are both NOT playable on the X800?? This would be a huge disadvantage for ATI, since a lot of the upcoming top games will support PS3.0!
  • l3ored - Tuesday, May 4, 2004 - link

    I agree, Phiro. Personally I think I'm gonna get the one that hits $200 first (may be a while).
  • Phiro - Tuesday, May 4, 2004 - link

    Hearing about the 6850 and the other Emergency-Extreme-Whatever 6800 variants that are floating about irritates me greatly. Nvidia, you are losing your way!

    Instead of spending all that time, effort and $$ just to try to take the "speed champ" title, make your shit that much cheaper instead! If your 6800 Ultra was $425 instead of $500, that would give you a hell of a lot more market share and $$ than a stupid Emergency Edition of your top-end cards... We laugh at Intel for doing it, and now you're doing it too, come fricking on...
  • gordon151 - Tuesday, May 4, 2004 - link

    #14, I think it has more to do with the fact that those OpenGL benchmarks are based on a single engine that was never fast on ATI hardware to begin with.
  • araczynski - Tuesday, May 4, 2004 - link

    12: Personally I think the TNT line was better than the Voodoo line. I think they bought them out only to get rid of the competition, which was rather stupid, because I think 3dfx would have died out sooner or later anyway; NVIDIA was just better. I would guess that perhaps they bought them out because that gave them the patent rights, and they wouldn't have to worry about being sued for probably copying some of the technology :)
  • l3ored - Tuesday, May 4, 2004 - link

    Only the X800 XT was winning; the Pro usually came in after the 6800s.
  • Keeksy - Tuesday, May 4, 2004 - link

    Yeah, it is funny how ATI excels in DirectX yet loses in the OpenGL benchmarks. Looks like I'm going to have both an NVIDIA and an ATI card. The first to play Doom 3, the other to play HL2.
  • peroni - Tuesday, May 4, 2004 - link

    I wish there was some testing done with overclocking.

    There are quite a few spelling errors in there, Derek.

    Did I miss something, or was there no mention of prices for these 2 cards?
  • Glitchny - Tuesday, May 4, 2004 - link

    #11, that's what everyone thought when NVIDIA bought all the people from 3dfx, and look what happened with that.
  • araczynski - Tuesday, May 4, 2004 - link

    I agree with 5 and 10: still the same old stalemate as before; one is good at one thing, the other is good at another. I guess I'll let price dictate my next purchase.

    But ATI sure did take the wind out of NVIDIA's sails with these numbers.

    I wish one of the two would buy the other out and combine the technologies; one would think they would have a nice product in the end.
  • eBauer - Tuesday, May 4, 2004 - link

    #8 - OpenGL still kicks butt on the NVIDIA boards. Think of all the Doom 3 fans that will buy the 6800s...

    As for myself, I will wait and see how the prices pan out. For now I'm leaning toward the X800.
  • ViRGE - Tuesday, May 4, 2004 - link

    ...On the virge of ATI's R420 GPU launch...

    Derek, I'm so touched that you thought of me. ;)
  • Tallon - Tuesday, May 4, 2004 - link

    OK, so let's review. With the X800 XT having better image quality, better framerates, only taking up one slot for cooling and STILL running cooler, and only needing one Molex connector (it actually uses less power than the 9800 XT), who in their right mind would choose a 6800U over the X800 XT? I mean, seriously, NVIDIA is scrambling to release a 6850U now, which is exactly identical to a 6800U, just overclocked (which means more power and higher temperatures). This is ridiculous. ATI is king.
  • noxipoo - Tuesday, May 4, 2004 - link

    ATi wins again.
  • Akaz1976 - Tuesday, May 4, 2004 - link

    Dang! On one hand, I am saddened by the review. My recently purchased (last month) Radeon 9800 Pro would be at the bottom of the chart in most of the tests carried out in this review :(

    On the other hand, this sure bodes well for my next vid card upgrade. Even if it is a few months off! :)

    Akaz
  • saechaka - Tuesday, May 4, 2004 - link

    Wow, very impressive. I was set on getting an NVIDIA card but just don't know anymore.
  • Phiro - Tuesday, May 4, 2004 - link

    The message is clear; Matrox has failed!
  • Brickster - Tuesday, May 4, 2004 - link

    Show me the money!
  • MemberSince97 - Tuesday, May 4, 2004 - link

    Advanced features are worthless if your driver team keeps breaking them..
  • f11 - Tuesday, May 4, 2004 - link

    Kinda starting to feel sorry for NVIDIA now. Each generation they stuff their cards with more features than they need to, and end up with a slower but more feature-laden card. If anyone's expecting SM3.0 to be a big hit, remember that the original FX line had lots of extensions to DX9, and they never really made much of a difference.
