RE: Nvidia Driver Cheats? Truth or Fiction?

These 27.51s are dated the 7th! Surely there are newer ones than these?

This topic was started by ,




0 Posts
Location -
Joined -
http://www.hardwareanalysis.com/content/article/1709/
I'm sure you've read reports around the web about Nvidia allegedly cheating with their yet-to-be-released 60.72 build drivers. These drivers were sent out to journalists evaluating the new GeForce 6800 Ultra and the NV4x architecture. Just two short weeks after the first reviews were posted, one website published a lengthy article discussing image quality and how some frames rendered by the new GeForce showed differences from the same frames rendered on ATi hardware. So is this the proverbial smoking gun? Is Nvidia indeed taking shortcuts to boost the performance of its new architecture, or is something else going on here?

I think the jury is still out on what is the proper way to render frames; both companies have a different approach to calculating textures, doing anti-aliasing and all the other 3D calculations. Thus it comes as no surprise that Nvidia renders scenes differently than ATi does, but where do you draw the line between a cheat, an optimization, a shortcut or an architectural difference? Both companies have used shortcuts before that some consider cheats and others a good way to boost performance. If a shortcut is only visible after enlarging portions of a screenshot to four times their size, who really cares? It is not visible during normal gameplay, and it does improve performance.

It's a matter of what shortcuts you use where, and both ATi and Nvidia have plenty of shortcuts built into their architecture. In essence, for both companies to render a frame exactly the same, they would eventually need the same architecture. However, what makes graphics accelerators interesting, and gives us something to write about, is the way in which each company uses its own technology to deviate from this path and increase performance whilst producing the same result. Both ATi and Nvidia use proprietary technology to increase performance and efficiency, which basically amounts to shortcuts that all produce the same end result: the rendered frame.

Using an architecture-specific feature to raise performance is a shortcut, and so is rendering only the pixels that are visible in the frame. If these shortcuts don't decrease image quality, or only by a margin that is imperceptible, they surely are legitimate, since they produce the same end result. Following that train of thought, I always like to compare these discussions about image quality to MP3 compression. Millions of people use MP3s and don't hear the difference between the MP3 and the original CD. From a pure sound quality point of view there is a big difference, but the MP3 simply leaves out details you don't really hear at all. I take a similar approach to rendering quality: why render every pixel fully if using shortcuts to render just the ones that matter gives you better performance and identical image quality, unless you scrutinize the screenshots with a microscope?
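
To make that concrete, here is a minimal sketch of the "render only the visible pixels" shortcut. It is a toy software depth test, not code from either vendor's drivers; both loops produce an identical framebuffer, but the second one depth-tests before shading and so skips the expensive per-pixel work for fragments that end up hidden anyway.

```cpp
#include <array>
#include <cstdio>
#include <limits>

constexpr int W = 4, H = 4;

struct Fragment { int x, y; float depth; int color; };

// Stand-in for an expensive pixel shader.
int shade(int color) { return color * 2 + 1; }

int main() {
    // Two overlapping surfaces per pixel: a near one drawn first, a far one after.
    std::array<Fragment, W * H * 2> frags{};
    int n = 0;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            frags[n++] = {x, y, 0.2f, 20};  // near surface, ends up visible
            frags[n++] = {x, y, 0.8f, 10};  // far surface, ends up hidden
        }

    std::array<int, W * H> naive{}, fast{};
    std::array<float, W * H> zNaive{}, zFast{};
    zNaive.fill(std::numeric_limits<float>::max());
    zFast.fill(std::numeric_limits<float>::max());
    int shadedNaive = 0, shadedFast = 0;

    // Brute force: shade every fragment, then keep the nearest result.
    for (int i = 0; i < n; ++i) {
        const Fragment& f = frags[i];
        int c = shade(f.color);                 // always pay the shading cost
        ++shadedNaive;
        int idx = f.y * W + f.x;
        if (f.depth < zNaive[idx]) { zNaive[idx] = f.depth; naive[idx] = c; }
    }

    // Shortcut: depth-test first, shade only fragments that are actually visible.
    for (int i = 0; i < n; ++i) {
        const Fragment& f = frags[i];
        int idx = f.y * W + f.x;
        if (f.depth >= zFast[idx]) continue;    // hidden, skip the shader entirely
        zFast[idx] = f.depth;
        fast[idx] = shade(f.color);
        ++shadedFast;
    }

    std::printf("images identical: %s, shader invocations: %d vs %d\n",
                naive == fast ? "yes" : "no", shadedNaive, shadedFast);
}
```

Half the shading work disappears, and the frame that comes out is byte-for-byte the same; that is the kind of shortcut nobody should reasonably call a cheat.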

Sander Sassen





Responses to this topic




0 Posts
Location -
Joined -
OP
This is by far the most sensible article I've read in a long time on all this cheating stuff ...
damn, I don't care about RED or GREEN boxes put into a 3D rendered scene if the screenshot without them looks 99% the same ...

I don't understand all the fuss about cheating ... what's so bad about optimising for specific applications/games if you are able to?
I mean ...
let's take HT for example: who blames Intel for cheating when some apps use HT technology?! ... no one, because it's just good to get some extra frames without doing anything for it ... same with graphics ... damn, it's free ... so why not?



202 Posts
Location -
Joined 2003-10-05
FULLY AGREE!

Couldn't have said it better myself!



162 Posts
Location -
Joined 2002-12-16
Nice one Mertsch :)



614 Posts
Location -
Joined 2003-03-21
I wish drivers had something like an "enable optimisations" checkbox somewhere in the tabs. Not for 3DMark as such, since the optimisations there were limited, but I wish I had the option to enable that sort of thing for certain games.



1352 Posts
Location -
Joined 2004-02-01
that's a good idea KK

why not give this idea to Nvidia/ATI? *smile*



614 Posts
Location -
Joined 2003-03-21
ENABLE OPTIMISATIONS OR ELSE! :)



0 Posts
Location -
Joined -
OP
ENABLE OPTIMISATIONS OR ELSE!

... OR you will never buy an Ati OR nVidia card again and jump back to VooDoo



614 Posts
Location -
Joined 2003-03-21
Or just forwards to pen and paper :)



91 Posts
Location -
Joined 2003-05-21
I think we're all getting tired of these driver optimization banners that keep getting waved every time a video card manufacturer comes out with a new set of drivers or a new graphics card. Like with most things, if someone keeps crying "wolf", eventually we all start to ignore those cries and sooner or later, nobody gives a rat's ass WHAT they're saying.

The only thing that matters is PERCEIVED visual quality (what we can actually SEE with our own eyes WHILE we're playing a game or viewing a movie or whatever it is we're running). If all the pixels line up into recognizable patterns at an acceptable performance level, the graphics card is doing its job. Period.

But, apparently that's not good enough. We keep seeing screen captures of images (at high magnification, no less) from games and benchmarks that IF you take the time to fully analyze them, yeah .... you'll see a few pixels out of place here and there or a particular corner of a texture might look slightly diffused or different from one version of a driver to the next. SO WHAT?!

This frame-by-frame analysis is pointless because, remember, we typically see this stuff going by so fast that we'd rarely notice a MAJOR difference, let alone a few pixels out of place. Besides, if you're looking at texture quality or individual pixels, you're NOT playing the game. You're running a benchmark or analyzing the results of one just for the sake of running it ... and that's NOT what a computer is intended to be used for.

I didn't build my computer to run benchmarks or stare at pixels to see if anything is different from one frame to the next or one graphics card or driver iteration to the next. If a game, image or movie looks good to me, I couldn't care less whether the drivers I'm running are optimized or not. And frankly, if they ARE optimized ... I'd call that GOOD programming and not a cheat. If a programmer is smart enough to make a game run faster and NOT affect perceived image quality or performance in a negative way, I'd say they're doing the job they're being paid to do. Wouldn't you?

Having the ability to turn things (optimizations) on and off is an excellent idea, but I doubt we'll ever see it. Why? Because for the most part, they are HARDWARE-specific optimizations, and if you turn them off you'll probably break something (meaning the card will stop doing what it's supposed to do, which is to render D3D or OpenGL scenes).

The drivers take the instructions in the game that make calls to D3D or OpenGL functions and translate them to hardware functions. How those functions are then implemented by the hardware (the translation of a software instruction to a hardware function) IS a function of the hardware itself, with the drivers as the "arbiter" (controller).

The drivers make that translation directly, and turning off either specific or general optimizations would probably break something in that pathway. Minor tweaking of specific (adjustable) functions could probably be turned off, but I wouldn't think that would be a desirable feature to implement anyway (why would you want an image or sequence of frames in a game to look WORSE?!). Remember too that game developers put in optimizations and patches for specific hardware platforms as well (both for ATi and NVidia cards), and turning off ANYTHING that's hard coded (meaning not adjustable by the end-user) would probably break the game as well. A rough sketch of that pathway follows below.
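
To picture that pathway in code form, here is a purely hypothetical sketch (none of these types, opcodes or class names exist in any real driver): the game issues one generic API call, and each vendor's driver layer translates it into a different hardware-specific command stream while the rendered result stays the same. Any optimization lives inside that translation step, which is why it isn't something you can simply unplug.

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// Generic API call the game issues (stand-in for a D3D/OpenGL draw call).
struct DrawCall { int triangles; };

// Hardware-specific command the GPU actually consumes (entirely made up here).
struct GpuCommand { int opcode; int payload; };

// The driver layer: translates API calls into hardware commands.
class Driver {
public:
    virtual ~Driver() = default;
    virtual std::vector<GpuCommand> translate(const DrawCall& dc) = 0;
};

// One vendor's driver might merge the whole draw into a single command.
class VendorADriver : public Driver {
public:
    std::vector<GpuCommand> translate(const DrawCall& dc) override {
        // Hypothetical "optimization": batch everything into one command.
        return { {0x10, dc.triangles} };
    }
};

// Another vendor emits a different, equally valid command stream.
class VendorBDriver : public Driver {
public:
    std::vector<GpuCommand> translate(const DrawCall& dc) override {
        // Hypothetical path: split into the fixed-size chunks its hardware prefers.
        std::vector<GpuCommand> cmds;
        for (int left = dc.triangles; left > 0; left -= 256)
            cmds.push_back({0x20, left > 256 ? 256 : left});
        return cmds;
    }
};

int main() {
    DrawCall dc{1000};
    std::unique_ptr<Driver> a = std::make_unique<VendorADriver>();
    std::unique_ptr<Driver> b = std::make_unique<VendorBDriver>();
    // Same API call, same rendered result, different command streams underneath.
    std::printf("vendor A emits %zu command(s), vendor B emits %zu command(s)\n",
                a->translate(dc).size(), b->translate(dc).size());
}
```

The application never sees which of the two streams was generated; ripping the "optimized" translation out would mean writing a different translation, not flipping a switch.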

The point of optimizations is to IMPROVE either performance (speed) or image quality (or both). At some point you have to make a tradeoff between speed and image quality by applying or turning off anti-aliasing, anisotropic filtering, etc. We already have access to those functions (and others) and they do directly affect both what we see (image quality) as well as the speed at which we see it (performance).

More appropriately, a single button or checkbox to instantly turn off all ADJUSTABLE parameters like Anti-Aliasing, Anisotropic filtering, Mipmap detail levels, etc., would be extremely handy (as compared with having to move a bunch of sliders or uncheck a bunch of checkboxes). If you want to think of it that way, a RAW mode. Give me ONE BUTTON to turn EVERYTHING OFF, please! That's all I want!
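
For what it's worth, here is a tiny hypothetical sketch of that RAW mode. No real control panel exposes a struct like this, and the names and values are made up, but it shows how little the software side of a "one button off" switch would actually involve.

```cpp
#include <cstdio>

struct DisplaySettings {
    int  antiAliasingSamples  = 4;     // MSAA samples
    int  anisotropicFiltering = 8;     // AF level
    int  mipmapLodBias        = 0;     // 0 = highest detail
    bool vsync                = true;

    // The RAW mode button: every adjustable knob to its off/fastest value at once.
    void rawMode() {
        antiAliasingSamples  = 0;
        anisotropicFiltering = 0;
        mipmapLodBias        = 3;      // lower-detail mipmaps sooner (hypothetical scale)
        vsync                = false;
    }
};

int main() {
    DisplaySettings s;
    s.rawMode();   // one call instead of moving a pile of sliders and checkboxes
    std::printf("AA=%dx AF=%dx mip LOD bias=%d vsync=%d\n",
                s.antiAliasingSamples, s.anisotropicFiltering,
                s.mipmapLodBias, s.vsync ? 1 : 0);
}
```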

BOTH ATi and NVidia optimize their drivers for their HARDWARE and for the applications that will inevitably be run on them. And that's a GOOD thing, because they actually DO improve performance and visual quality through those optimizations. Forget about benchmarks like 3DMark (which nobody trusts anymore anyway) and concentrate on actual game performance ... how the games look and how they perform on YOUR system (not somebody else's).

It's what we actually perceive in front of our eyes at full frame-rate SPEED that's important, NOT a 100x magnified array of pixels or the corner of a texture (especially when it's flying by at 150 frames per second). Believe what your eyes see and NOT what the reviewers want you to see.

Spend your money wisely and ask other REAL people (who use the same hardware for more than just benchmarks) what they're seeing. If you trust their judgment and your own eyes, then you'll be happy with the results no matter what hardware you buy. Then HOPE that whatever optimizations a developer comes up with to IMPROVE image quality and performance are only a short time away. The sooner you're able to start enjoying that hardware (and what those optimizations provide), the happier you'll be with your decision ... and THAT is what it's all about (product satisfaction).

By the way, Dark Biene ... sorry about unloading on you the other day. That was uncalled for on my part and I apologize if I offended you (which I'm sure I did). I forget sometimes, being the Old Fart that I am, that not everyone sees things the way I do and that everyone does NOT have the same command of the English language I have. It won't happen again.

Later.



614 Posts
Location -
Joined 2003-03-21
If there's one thing I love about this forum, it's when Old Fart comes along, because every single time he does I realise something I've been a stupid idiot not to have seen before. Plus every time he answered one of my questions, I always left knowing exactly what the deal was. Damn, he rocks.

he does you know, that wasn't a joke