Next-Generation DirectX 9.0 Game Graphics Performance Preview

I'm as surprised as you are - benchmarking and writing an article using a STOLEN game is disgusting. I've lost what little respect I had left for X-Bit Labs - what are they thinking?!?!
Isn't X-bit Russian or something? That would explain how they got away with it. Even so, as an article it tells us nothing. OK, the FX is getting a kicking in DX9, but we knew that already. This is also a beta, so it's not representative of the finished game, and they didn't test the mixed path.
...you just KNEW someone was gonna do it. ;) (BTW - I'm pretty disgusted my ownself, but I was expecting someone to be low enough to do it for the hits... and I'm looking at the results. :( )
Why did they do it? Well..... "Since we at X-bit labs always dare to test hardware before the release, why not test the software, even if it is not yet ready to be released? There is a very highly-anticipated 3D games using DirectX 9.0 capabilities with graphics engine on the level where it will not experience any tangible changes and optimizations – why not run some benchmark using the title?" Why did they not test the mixed-mode renderer? Well.... "Unfortunately, we did not succeed in using a special rendering mode for NVIDIA GeForce FX graphics cards, but I am sure we will publish those results in future when we finally learn more about that preset." Bit of an iffy thing to do for a legit site, but well, so what. Most people out there probably have something illegal on their PC (cracks, etc.), and let's face it, I know I wanted to see what the performance was like.
I have a Ti 4200, and when I play I get like 100+ fps. It says it's running in DX9 mode. It could be wrong, but it's on highest quality too. Just for those of you who still have older DX8 cards: the game will run FINE!
You have a DX8 card, so it can't be running in DX9 mode. HL2 scales well and will work even on a DX7 card.
At the risk of getting flamed, I should just mention that of course I did try both HL2 and 52.13... Why? Because I bought an FX 5900, a card everybody laughs at and claims is truly bad, just because of one single damn game that isn't even released yet (and yeah, plus two damn console ports that look like they were made 3-4 years ago). What I can say is that 52.13 not only doubles the speed, it also makes the game perfectly playable in 1600x1200 even with heavy graphic effects on screen, and that is everything I needed to know. Now I'm DEFINITELY waiting for the final game.
Have you compared the IQ to the reference?
The only thing I can say about IQ is that the game looks absolutely marvelous, but as I do not own a 9800, I cannot compare.
He means compare the IQ from the previous driver to the one that "doubles performance".
John Carmack rules! From what I've seen of the stolen Doom III and HL2 versions, the D3 engine is far superior in rendering technology. So I'll buy a faster video card when D3 hits the stores. Go on, John!
Nice... 1024x768x32 + 6xAA and 16x aniso works fine on my 9700 Pro and Athlon XP 1900+... and what can I say, the image quality is obviously splendid.
And against the reference rasterizer. If you're happy then cool, that's all that matters; I just don't think I could live with the card, because I'd never be sure I was getting the best out of my games.
The earlier driver was unusable... I could barely do anything with it... so I didn't have the willpower required to actually get into a map to check out the IQ.