62 Comments

  • stealthc - Monday, January 3, 2005 - link

    I have a GeForce4 MX 440 with a 1.7GHz Intel CPU, and the frame rate is unacceptable, jittery all to hell. Stupid HL2 autodetects it as a DX7.0 card; I forcefully correct that, but it's still bad. How the hell are you pulling off frame rates like that with that particular card? I barely get a stable 20fps at 800x600. I see lots of red, and usually yellow, when I use cl_showfps 1 in the console.
    There's something completely messed up with this game; it shouldn't run like this.
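
    If the autodetect keeps getting it wrong, the usual Source-engine fix (as far as I know; treat the exact switch names as assumptions) is to force the renderer from the launch options rather than in-game:

        // Steam/shortcut launch options: force a DX level at startup
        hl2.exe -dxlevel 81      // 70, 80, 81 and 90 should be the valid levels

        // then, from the in-game console:
        cl_showfps 1             // the fps overlay mentioned above
        mat_dxlevel              // typing a cvar alone should print the level in use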

    Counter-Strike: Source is unplayable; I can't get over 10fps in it. Usually it hangs at around 3 to 6 fps.

    I'm pissed off that I spent $60 on a freaking game that does SH*T for me. I already beat it, but it's so jittery and far from smooth that I'm surprised I could even hit the side of a barn.
  • Barneyk - Monday, December 6, 2004 - link

    "The next step is to find out how powerful of a CPU you will need, and that will be the subject of our third installment in our Half Life 2 performance guides. Stay tuned..."

    Yeah, I'm waiting like hell. I want to see how HL2 performs on old CPUs; I have a TB1200 and a GF4 Ti4800SE. Graphics performance was OK, but gameplay was really sluggish, though still very playable.
    And I want to see some comparison graphs of how different CPUs perform. I've been waiting; when do we get to see the HL2 CPU roundup? :)
  • charlytrw - Thursday, December 2, 2004 - link

    Take a look at this link; it has some interesting info about how GeForce FX users can improve performance by about 50% in DX9 mode. Try it yourself. The link to the article is:
    http://www.hardforum.com/showthread.php?t=838630&a...
    I hope this can help some GeForce FX users... :)
  • clstrfbc - Wednesday, December 1, 2004 - link

    Did anyone else notice that the piers and pylons don't have reflections in the screenshots on page 2, Image Quality? In the second picture, the one of the water and dam/lock: when you roll over, you see the non-reflection version in DX8; roll back and something looks funny. The reflections look good, but there are lots of things missing, most obviously the piers, and also some bollards (the things you tie ropes to).
    Hmm, maybe the pier is a witch, or is it a vampire, so it has no reflection...

    Other than that, the game is awesome. I'm well into City 17, and only took a break because the wife was becoming more dangerous than the Combine.

    Running on an Athlon 1.7 with 512MB and a Sapphire Radeon 9000 128MB, it plays fine except for the hiccups at saves and the naptimes at loads.
  • MrCoyote - Monday, November 29, 2004 - link

    This whole ATI/Nvidia DirectX/OpenGL optimization is driving me insane. Developers need to stop optimizing for specific cards and just code for the "universal" video card.

    One reason DirectX/OpenGL was created was to make it easier for developers to target different video cards. It is a "middle man", and the developer shouldn't have to know what card someone has in their system.

    But now all these developers are using stupid tricks to check whether you have an ATI or Nvidia card and optimize the paths accordingly. This is just plain stupid and takes more of the developer's time.

    Why don't they just code for a "universal" video card, since that's what DirectX/OpenGL was made for?
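
    That is exactly what the capability bits are for. A minimal sketch of vendor-agnostic detection under D3D9 (illustrative only; the fallback tiers are an example and error handling is omitted):

        #include <d3d9.h>
        #pragma comment(lib, "d3d9.lib")

        int main() {
            // Ask the runtime what the card can do, not who made it.
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return 1;
            D3DCAPS9 caps;
            d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
                // full DX9 shader path
            } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) {
                // DX8.1 fallback
            } else {
                // DX8/DX7 fallback
            }
            d3d->Release();
            return 0;
        }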
  • Alphafox78 - Wednesday, November 24, 2004 - link

    'Glide' all the way!!!
    I used to run Diablo on a Voodoo2, and you could switch from Glide to DX; Glide looked so much better! Even after upgrading to a GeForce card, Glide looked better... not that it's relevant here. Heh.
  • nserra - Wednesday, November 24, 2004 - link

    #55 OK, I agree. No, I was not trying to say that the 6xxx is bad, or that it will be bad. Nor do I think it superior to ATI's X line, or vice versa. But we already have the "bad" example of the Nvidia FX line (for games); how will both scale? I didn't like the 3DMark bench, but was 3DMark03 that bad? It was already showing something at the time...

    One day I said here that it was important to bench an ATI 8500/9000 vs. a GeForce3/4 using current drivers (and maybe current platforms too), with new and older games.
    This is very important, since it would show whether support for older hardware is up to date, and how these cards perform today.
    I am not saying that because the 8500 lost to the GeForce4 in the past it will win today; I just want to know how things are now! Is the 8500's PS1.4 giving it anything now, when back then it didn't? I don't know!

    But everyone said the test was pointless and that there was no need to bench "older" hardware, since no one was planning to buy it...
  • Lord Banshee - Tuesday, November 23, 2004 - link

    Well, it kind of matters what you do with your computer. Personally, I like my 5900XT over the 9800 Pro I have because it is faster in 3D modeling software, but for gaming the 9800 Pro is far better.

    As for saying the NV3x wasn't bad then but is now: I think it was bad then too; there was just nothing to prove it. Now that there are a lot of DX9 games out, the poor decisions of Nvidia's engineers show. And they know they messed up; that's why the 6800 uses a 16x1 pipeline config. I believe the 59xx series was 4x2 and the 9800 was 8x1. The 6600GT also shows Nvidia's mistake: being 8x1, it beats the pants off the 4x2 5900 series.
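
    A back-of-the-envelope check (the clocks here are from memory, so treat them as assumptions: roughly 380MHz for the 9800 Pro and 450MHz for the 5900 Ultra):

        #include <cstdio>

        int main() {
            // theoretical pixel fillrate = pixel pipelines * core clock (MHz)
            printf("9800 Pro, 8x1 @ 380MHz: %d Mpixels/s\n", 8 * 380); // 3040
            printf("5900,     4x2 @ 450MHz: %d Mpixels/s\n", 4 * 450); // 1800
            // the second texture unit per pipe only helps multitexturing;
            // for shader-heavy DX9 work, pixels per clock is what counts
            return 0;
        }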

    I was unsure whether you were trying to say in your last post that Nvidia fudged it again and that this will show in due time with the 6800 series. If that is what you meant, I don't think so; I believe Nvidia truly has a very competitive card that meets the demands of all users.
  • nserra - Tuesday, November 23, 2004 - link

    #52 You are right. I was talking about the NV3x core. But don't forget that this chip was not that bad at the time; it's really bad TODAY!

    Example: who is better served now, someone who bought a 5600/5700, or someone who bought a 9600 card?
  • pio!pio! - Monday, November 22, 2004 - link

    #48 Agreed. It may be that DX8 water looks that crappy, while DX8.1 water looks like what you and I are seeing.
  • Lord Banshee - Monday, November 22, 2004 - link

    Sorry about the post above.

    #50, I hope you are only talking about the NV3x core and below? The NV4x core is almost as good as the newest Radeon at rendering DX9 games.

    On a side note, does anybody care why Doom 3's models and textures aren't as good as Half-Life 2's? One reason is the amount of GPU processing power its lighting system and special effects take. I am sure that if everybody had a 6800 Ultra, id would have made the textures in Doom 3 better and used more high-polygon models.

    But we all don't, so instead they used a lot of normal mapping (the future in gaming) and a brand-new lighting system never seen in games before.

    Again, you must see that the Doom 3 engine is capable of huge textures and models, but it is game dependent. Not every game that uses this engine will have the same lighting effects; some might want to show off their texture skills instead. It is the game company's choice.

    What Doom 3 fails at is outdoor environments; this is where the Source engine has it beat (so they say; I have yet to play Half-Life 2).

    But it looks like the Unreal 3 engine will be the best of both worlds, though that's most likely another two years out.
  • Lord Banshee - Monday, November 22, 2004 - link

  • nserra - Monday, November 22, 2004 - link

    #40 T8000???

    So why do the 6600 and 6800 perform very well and the 6200 so badly? Aren't they all the same card? Your post is pointless.

    "Luckily Valve was hacked"? Are you kidding? How many people, myself included, bought a piece of crap like the 5600, which performs badly not only in this game but in many others? TOO BAD IT WAS HACKED!!!

    Sure, any card plays it today like one year ago, but not the right way!!!!

    I don't know, but I bet that when more DX9.0 games come out, the difference between ATI and Nvidia will get bigger, unless there is an option to enable the fast FP16 mode with lower image quality, like in Far Cry.
  • nserra - Monday, November 22, 2004 - link

    We all know that those who bought the ATI 9xxx did better than the ones who bought an FX5xxx series card.

    Now what about an 8500 vs. GF3/4?
    And some 9000 cards too?

    DX8.1 is different from DX8.0; I would like to know whether the 8500/9000 has turned out to be the better buy today over the GeForce3/4.

    It's really important, since the GeForce FX sucks today but didn't two years ago; who knows what will happen two years from now with the 9xxx and 6xxx.

    Why does the 6200 perform so badly, and the 6600 and 6800 so well?
  • dderidex - Monday, November 22, 2004 - link

    FYI, the comparison image on [L=this page]http://www.anandtech.com/video/showdoc.aspx?i=2281...[/L] for the water is all wrong. I don't know what they were using for the 'DX8' sample of the water reflection, but that's not what it looks like at all on a GeForce FX card. It looks virtually indistinguishable from the DX9 sample, only with noticeably less smooth transitions at the coastal terrain (not shown in that shot).

    Unless AT intentionally disabled world reflections when switching to DX8 mode? But I have a hard time believing they would be so biased.
  • blckgrffn - Sunday, November 21, 2004 - link

    8500/9100, 9000/9200, FX5200/5700, Radeon 7000/7500, and GF3/GF2 benches, please! There are a lot of these cards out there, and I am curious!
  • TheRealSkywolf - Sunday, November 21, 2004 - link

    #45, ATI contributed a big cut of the budget for Half-Life 2; that's why it got delayed a year.
    So it is blatantly obvious that Valve was told not to make DX9.0 work well on the Nvidia FX.
  • moletus - Sunday, November 21, 2004 - link

    #40, you are so wrong, wrong, and wrong again. What kind of idiot game developer wouldn't code as well as possible, regardless of who gave them development money? There are plenty of Nvidia cards out there, and I'm quite sure their owners want to play HL2 too.

    It's all about making $$$, so...
  • Cybercat - Sunday, November 21, 2004 - link

    #40, not necessarily. The 6200 typically performs close to the X300; only occasionally does it meet the X600 Pro's standard.

    http://anandtech.com/video/showdoc.aspx?i=2238&...
  • abakshi - Sunday, November 21, 2004 - link

    *other (not over lol)
  • abakshi - Sunday, November 21, 2004 - link

    Just a note: the last graph on page 7 seems to be a bit messed up. The GeForce 6200 is shown at 82.3 FPS, higher than all of the other cards, while the data chart and line graph show it at 53.9 FPS.
  • KrikU - Sunday, November 21, 2004 - link

    Why can't we see benchmarks with AA & AF enabled on mainstream graphics cards? HL2 uses an engine that is largely CPU limited, so AA & AF tests would be really welcome!
    I'm playing on a Ti4400 (OC'd to Ti4600 speeds) with 2x AA & 2x AF! This is the first new game like this where I can use these image quality enhancements with my card!
  • T8000 - Sunday, November 21, 2004 - link

    Half-Life 2 seems to be designed around the Radeon 9700.

    Because Valve seems to have made certain promises to ATI, they were not allowed to optimize any GeForce for DX9.

    This also shows with the GF6200, which should be close to the R9600 but is not, due to the optimized Radeon 9700 codepath.

    Luckily, Valve was hacked, preventing this game from messing up the marketplace. Now almost any card can play it, and Nvidia may even be tempted to release a patch in their driver to undo Valve's DX9 R9700 cheats and make the game do DX9 the right way for FX owners, without sacrificing any image quality. Just to prove Valve wrong.
  • draazeejs - Sunday, November 21, 2004 - link

    Well, I like HL2 a lot, much more so than the pitch-black, fuzzy-textured D3. But honestly, to me it looks exactly like Far Cry, engine-wise. Is there any difference?
    Respect to the level designers of HL2; no other game nowadays comes even close to that sort of detail and scenery. I also think the physics, the people and faces, and the AI are by far superior. And Ravenholm is much scarier than the whole of D3 :)))
  • kmmatney - Sunday, November 21, 2004 - link

    [sarcasm] Oh, and have fun running those DX games on other platforms without emulation. [/sarcasm]

    Obviously, this game isn't meant for other platforms, and that's fine by me. I think the original Half-Life had an OpenGL option, but it sucked (at least on my old Radeon card). In general, OpenGL has always been a pain, dating back to the old miniGL driver days. In my experience, when playing games that offered either a DX or an OpenGL option, the DX option has usually been more reliable. It could be because I usually have ATI-based cards...
  • kmmatney - Sunday, November 21, 2004 - link

    I didn't mean that DX literally "looks" better than OpenGL; I meant that it seems to be more versatile. Here's a game that can be played comfortably across several generations of video cards, whereas you have to buy a new card to play D3 at a decent resolution. The HL2 engine seems to have room to spare in its use of DX9 features, so it can be further enhanced in the future. I would think this game engine would be preferred over the Doom 3 engine.
  • oneils - Sunday, November 21, 2004 - link

    #15, Steam's site (under "updates") indicates that the stuttering is due to a sound problem, and that they are working on a fix. Hopefully this will help you.

  • vladik007 - Saturday, November 20, 2004 - link

    " I'm missing words to how pathetic that is. "

    1st my post was no.2 NOT no.3.
    2nd unlike many people i dont have time to work on my personal computers all the time. IF i dont upgrade this holliday season , i'll possibly have to wait until summer vacation. And you dont see nforce4 out now , do you ?
    3rd No it's not pathetic to follow something that's never failed me. Ever heard of satisfied customer ? Well Abit has always treated me very well , RMA proccess , crossshiping , bios updates , good support on official forums ... etc Why on earth should i change ?
    4th got it ?
  • moletus - Saturday, November 20, 2004 - link

    I really would like to see some ATI 8500-9200 results too..
  • Pannenkoek - Saturday, November 20, 2004 - link

    #18: How a game looks depends on which features of the video cards are used, and on the art. It's not Direct3D vs. OpenGL; the video cards are the limiting factor. Doom III is just too dark, and that's because of an optimization used in the shadowing. ;-)

    #26: Surely you mean "#2". I'm all for AMD. Not that my current computer isn't pathetic compared with what's around nowadays...
  • meatless - Saturday, November 20, 2004 - link

    I agree with #31, mostly; after playing both, I don't think HL2 is any better than Doom 3, just different in how they look f'ing awesome.

    And saying that DX looks better than OpenGL "just because" is about the stupidest f'ing thing I've ever heard.

    [sarcasm] Oh, and have fun running those DX games on other platforms without emulation. [/sarcasm]
  • TheRealSkywolf - Saturday, November 20, 2004 - link

    HL2 can be easier on the eyes due to its art, and the animations are also very cool. But I think Doom 3 is more intense technologically; Doom 3 just pushes harder in many ways, and in the long run the Doom 3 engine will power the best games. HL2 looks amazing, but Doom 3 is a better estimate of how future games will run on your card.
  • Filibuster - Saturday, November 20, 2004 - link

    This article was a fun read.
    I particularly liked the part about the fallbacks that are in place for older cards and the screenshot comparisons.
    Thanks.
  • Filibuster - Saturday, November 20, 2004 - link

    >I can't believe how much better DirectX looks compared to OpenGL. Seems like Id made the wrong choice...

    What a ridiculous generalization.

    I do think that Half-Life 2 looks far better than Doom 3, but the API has nothing to do with how things look. (I imagine HL2 will be much more fun too, but I'm replaying HL1 with Source to get back into it.)

    Carmack will never use Direct3D. He said so years ago, and I doubt he will change his mind (even if it is just to make a point); he is sort of the champion of OpenGL for games. Besides, all of the features of the video cards can be exposed in OpenGL just like in Direct3D (perhaps more so, through the use of extensions). Carmack just targeted a different set of features with Doom 3 (mostly it was designed around the GeForce3/4 feature set, and the 6 series was designed for Doom, not the other way around as so many people like to claim).
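
    For instance, an engine probes for those extensions at startup with something roughly like this (a sketch; it assumes a live GL context, and GL_ARB_fragment_program is just one example of a Doom 3-era extension):

        #include <GL/gl.h>
        #include <cstring>

        // true if the driver advertises the named extension
        // (naive substring match; a real engine tokenizes the string)
        bool hasExtension(const char* name) {
            const char* all = (const char*)glGetString(GL_EXTENSIONS);
            return all && std::strstr(all, name) != nullptr;
        }

        // e.g. hasExtension("GL_ARB_fragment_program") gates the
        // programmable pixel-shader path, much like a D3D caps check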
  • GonzoDaGr8 - Saturday, November 20, 2004 - link

    Thanx, Kevin and ksherman.
  • Jeff7181 - Saturday, November 20, 2004 - link

    I agree with #1... I'm well into City 17 and I still have all my stuff. Because the first review said I didn't have a flashlight, I was expecting to be thumped on the head again, have all my stuff taken away, and end up in a prison cell or something.
  • MrGarrison - Saturday, November 20, 2004 - link

    #3
    That's pathetic.
    nForce4 is around the corner, and there are lots of good alternatives, like the MSI K8N Neo4 Platinum.

    I have "pals" at home who are the same way. Only Intel and only Abit... I'm missing words for how pathetic that is.
  • unclesam - Saturday, November 20, 2004 - link

    What is the difference between DX 8.0 and 8.1? I am playing the game on a 1.6 GHz Pentium M ThinkPad T41 with a DX 8.1 ATI Mobility Radeon 9000, 32 MB. I too have everything turned up to high, including 1400x1050 resolution, and I have experienced no serious hiccups. I had to reduce reflections to the minimum setting, but I just went back to that scene with "reflect everything", and the water looks exactly like the DX 9.0 output. The only time the game stutters is just after loading a level. The performance limiter does not seem to be the CPU/GPU, but rather the limited throughput of my FSB.

    I assume that your CPU test will use "equivalent new platforms" and then compare the fastest "gaming" CPUs. Since you have gone to the trouble of benchmarking older graphics cards, I think you should also benchmark the older platforms and CPUs that go with them, or rather the other way around. Please compare platform performance rather than just CPUs.

    By the way, I am extremely envious of anyone with a halfway decent desktop setup (P4 HT, 800 FSB, >ATI 9600). For a small section I turned on "reflect all" with 6x AA and 16x AF. I got 0.25 fps, but damn, it's like you are there.

    Happy computing.
  • Saist - Saturday, November 20, 2004 - link

    Same setup, Revrnd.

    The benches I want to see, though, are a GeForce4 MX on a 1.2GHz P4 or an Athlon XP 1500. Ya know, something that AVERAGE people have.
  • GoodRevrnd - Saturday, November 20, 2004 - link

    Am I blind, or did Anand not post what system these benches were run on? Or was it the same setup from the first article?
  • klah - Saturday, November 20, 2004 - link

    "cant wait for CPU benches"

    Check out these:

    http://www.firingsquad.com/hardware/half_life_2_cp...
  • KevinCQU - Saturday, November 20, 2004 - link

    #17, I'm running the game on a regular GF3, an AXP 2500+ @ 2.2GHz, and 512MB of RAM. It runs smoothly at DirectX 8 settings. I turned down the water quality, though; I haven't tried turning it up yet, I've been busy playing the game ;) I'm impressed, though... it's definitely playable at 1024 and it looks pretty nice too.
    -Kevin
  • ksherman - Friday, November 19, 2004 - link

    #17, I'm probably gonna try to run the game (GF2 Ti200, but OC'd to Ti400 speeds ;), so I'll let you know if it'll work...
  • ksherman - Friday, November 19, 2004 - link

    Dumb-ish question: if I wanna play in DX7 or DX8 mode, do I need to install different drivers, or do I just use the in-game settings? I don't actually own the game yet, so that's why I ask...
  • kmmatney - Friday, November 19, 2004 - link

    I can't believe how much better DirectX looks compared to OpenGL. Seems like Id made the wrong choice...
  • GonzoDaGr8 - Friday, November 19, 2004 - link

    Aaargh... has anyone run this game on a GeForce3-based card (regular/Ti200/Ti500) yet? I'm curious, as I have a Ti200 and could run it in DX8 mode.
  • skiboysteve - Friday, November 19, 2004 - link

    #13 is exactly right.

    It's not all OpenGL vs. DX, or Nvidia optimization vs. ATI optimization.

    Look back at Anand's article about the graphics pipeline of each of these cards. Doom 3 was extremely texture intensive, doing a lot of lookups into tables instead of doing the math.

    The NV30 and NV40 are very good at doing texture lookups, and only the NV40 is good at the math; the NV30 had a very math-unfriendly pipeline.

    The R300-R400 were better at math.

    It's all in the articles on this very website.
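
    In CPU terms, the lookup-table trick looks like this (an illustrative sketch, not actual shader code):

        #include <cmath>

        static float specLUT[256];

        // precompute pow() once into a table...
        void buildSpecLUT(float exponent) {
            for (int i = 0; i < 256; ++i)
                specLUT[i] = std::pow(i / 255.0f, exponent);
        }

        // ...then each "pixel" does one indexed read instead of the math
        float specular(float nDotH) {
            int idx = (int)(nDotH * 255.0f);
            if (idx < 0) idx = 0; else if (idx > 255) idx = 255;
            return specLUT[idx];
        }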
  • bupkus - Friday, November 19, 2004 - link

    I have a Ti4200 with 64MB of RAM, and I changed from 1024x768 to 800x600 to fix some occasional stutter problems; it didn't help.
    Which resolution should I be able to run at? I have a 2500+ Barton OC'd to 2.2GHz with 2x256MB of OCZ PC3500 EL.
    Just for fun, I switched to 1280x1024 on my LCD to see how it would look (without movement), and it was very nice.
  • skunkbuster - Friday, November 19, 2004 - link

    Also, Nvidia is known to be better at OpenGL games and weaker in games that use DX.

    Likewise, ATI is known to be better at DX games and weaker in OpenGL games.

    Doom 3 = OpenGL.
  • Falloutboy - Friday, November 19, 2004 - link

    #10, it's because Doom 3 was very texture intensive, which the Nvidia 5xxx series excelled at, while HL2 is a very shader-intensive engine with less emphasis on textures, and as we all know by now, the 5xxx series sucked at DX9 shaders.
  • ukDave - Friday, November 19, 2004 - link

    Not that I'm saying that's the reason it performs so badly; that is due to its poor implementation of DX9.0. I think the whole NV 5xxx line needs to be swept under the carpet, because I simply can't say anything nice about it :)
  • ukDave - Friday, November 19, 2004 - link

    Doom 3 was optimized for nVidia, much like HL2 is for ATi.
  • mattsaccount - Friday, November 19, 2004 - link

    How can a 5900 be so poor at DX9-style effects in HL2, yet excel at an (arguably) more graphically intense game like Doom 3? The difference can't be due only to the API (DX vs. OGL), can it?
  • ZobarStyl - Friday, November 19, 2004 - link

    D'oh, login post. FYI: the bar graphs on page six are both from the DX8 pathway.
  • ZobarStyl - Friday, November 19, 2004 - link

  • Cybercat - Friday, November 19, 2004 - link

    Good article. I'm a little disappointed in the 6200's performance though.
  • thebluesgnr - Friday, November 19, 2004 - link

    Hi!

    I haven't read the article yet, but I'd like to ask one thing:

    Does the Radeon 9550 you tested have a 64-bit or a 128-bit memory interface? From your numbers I'm sure it's 128-bit, but some people might order the cheapest version (= 64-bit) after reading the article, so it would be nice to see it mentioned.

    Along the same lines, I would like to see AnandTech mention the GPU and memory clocks for all the video cards benchmarked.

    BTW, the X300SE was tested on a platform with the same processor as the other AGP cards, right?

    Thank you.
  • shabby - Friday, November 19, 2004 - link

    Holy crap, my Ti4600 can muster 60fps in HL2, ahahaha.
  • skunkbuster - Friday, November 19, 2004 - link

    Yikes! I feel sorry for those people using video cards that only support DX7.
  • Pannenkoek - Friday, November 19, 2004 - link

    I wonder whether "playability" is based merely on the average frame rates of demos, or whether somebody actually tried to play the game with an old card. Counter-Strike became barely playable below 40 fps later in its life, even though the average frame rate could look "good enough", and it used to run smoothly at that same frame rate in older versions.
  • vladik007 - Friday, November 19, 2004 - link

    Can't wait for the CPU benches. Again, great article; something I've come to expect here.

    BTW, when might you be releasing the CPU benches? I'm about to order my CPU/mobo upgrade this weekend so I can play with it over the Thanksgiving holiday. I'm thinking an A64 3200+ and an Abit 939 board. (I've always owned Abit motherboards, ever since my first computer, so I'm not going to change the ritual; too bad they don't make one with nForce3.)
  • SMT - Friday, November 19, 2004 - link

    Um, I'll say it again...

    I'm pretty sure Gordon has a flashlight throughout Nova Prospekt.
