The Dark Mod Forums

AA Rendering and other things


Lux


This thread was moved from http://forums.thedarkmod.com/topic/16628-opengl-perf-on-amdati-gpus-wip-fix/ in case anyone is wondering how the text below relates.

 

====================================================================================================

 

I personally cannot comprehend why so many of the posters are running 8xAA, especially at 1920x1080. The difference between 4x, 6x, and 8x at that resolution is hardly discernible, particularly when you're actually moving around playing and not just staring at the screen looking for visual nuances.

 

As motosep also said, it "murders" performance. The memory requirement doubles going from 4xAA to 8xAA, and the number of calculations the GPU has to do also increases substantially. If you exceed the physical memory limit of your card, how can it keep up?
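
Rough numbers make the point (a back-of-the-envelope sketch, assuming a plain MSAA framebuffer with 4 bytes of colour plus 4 bytes of depth/stencil per sample, and ignoring any driver-side compression):

    1920 x 1080 = 2,073,600 pixels
    4xAA: 2,073,600 pixels x 4 samples x 8 bytes ≈  66 MB
    8xAA: 2,073,600 pixels x 8 samples x 8 bytes ≈ 133 MB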

 

Playing at 1920x1200 I've never seen a substantial visual gain going above 4x, and I just don't understand why so many people run 8xAA. The only noticeable effect is lower FPS, and that's not a benefit.

 

So why do so many people run 8xAA? I guess you can just set everything to High/Max and live with whatever FPS you get, but why would you do that when you could be running >60 FPS and have a much smoother experience?

 

Also, if we're actually performing a study of FPS vs. GPU/CPU requirements, shouldn't we have a controlled case whereby everyone with dissimilar hardware runs exactly the same settings to get a baseline of performance data? (e.g. 0xAA, 8xAF, Shadows on MED/HIGH, Textures on MED/HIGH, etc.) Then maybe a second case, again with identical settings but everything turned up, to help determine whether some other bottleneck or constraint is affecting various cards/systems.
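
Something like the following could be the agreed baseline, dropped into everyone's autoexec.cfg (a sketch using stock idTech 4 cvar names; TDM's menu may map some of these to different cvars):

    seta r_customWidth "1920"    // common test resolution
    seta r_customHeight "1080"
    seta r_mode "-1"             // -1 = use the custom resolution
    seta r_multiSamples "0"      // 0xAA
    seta image_anisotropy "8"    // 8xAF
    seta com_showFPS "1"         // so everyone reports the same counter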

Edited by Lux

So why do so many people run 8xAA? I guess you can just set everything to High/Max and live with whatever FPS you get, but why would you do that when you could be running >60 FPS and have a much smoother experience?

 

I suspect people just automatically set all graphics settings to max if they have a reasonably modern card, and don't bother conducting tests to see what effect each setting has on performance.

 

Also, if we're actually performing a study of FPS vs. GPU/CPU requirements, shouldn't we have a controlled case whereby everyone with dissimilar hardware runs exactly the same settings to get a baseline of performance data?

 

Agreed. There's not much that can be learned from people's results using totally different graphics settings and screen resolutions, even if they all have AMD graphics cards.


Agreed. There's not much that can be learned from people's results using totally different graphics settings and screen resolutions, even if they all have AMD graphics cards.

Ah, well spotted. I hadn't even noticed the AA settings in people's posts. @SteveL, Kyyrma, Skina, Ralle321 and Hmart: can you set your AA/AS to 2x/8x and give your FPS again, please?


Today, with my original settings AA8 and AS16, I got 47fps instead of 43.

 

With AA2 and AS8 I get the same.

 

I tried restarting TDM with the new settings in case switching them live doesn't do anything, but I still get 47fps.
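
For reference, in stock idTech 4 the AA cvar only takes effect once the render context is recreated, so restarting is the right instinct. Assuming TDM keeps the stock cvar name, the console equivalent is:

    r_multiSamples 8
    vid_restart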

 

Changing AA doesn't seem to do anything to my fps, so the bottleneck must be elsewhere (on my system at least).

 

I suspect people just automatically set all graphics settings to max if they have a reasonably modern card, and don't bother conducting tests to see what effect each setting has on performance.

This. Until I read that rendering book last month, I didn't even know what AA was for, let alone how it worked.


I never use AA in games. Ever since AA began appearing in games' options, I've avoided it due to the massive performance loss in any game. That's not the case with modern AA techniques, but idTech 4 has that old kind of AA. I'll give it a spin without AA after work.

 

Also, why not simply run VerySleepy on a RelWithDebInfo binary and see how much CPU time the engine eats up?
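
For anyone wanting to try this, the point is an optimized binary that still carries debug symbols so the profiler can attribute samples. A sketch, assuming a CMake-based build (with Visual Studio the equivalent is compiling with /O2 /Zi and linking with /DEBUG):

    cmake -DCMAKE_BUILD_TYPE=RelWithDebInfo ..
    cmake --build .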

 

@SteveL: one thing worth porting sooner rather than later from BFG is the com_showFPS timers. That way you can clearly see whether the GPU is choking or the CPU.
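
The idea behind those timers is easy to sketch with OpenGL timer queries (a minimal illustration assuming a GL 3.3+ context; this is not TDM's or BFG's actual code):

    // Wrap the frame's GL work in a GL_TIME_ELAPSED query, then compare the
    // GPU time against the wall-clock frame time measured on the CPU.
    GLuint query;
    glGenQueries(1, &query);

    // each frame:
    glBeginQuery(GL_TIME_ELAPSED, query);
    // ... issue all draw calls for the frame ...
    glEndQuery(GL_TIME_ELAPSED);

    GLuint64 gpuNs = 0;
    // Reading the result right away stalls; real code double-buffers queries.
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &gpuNs);

    // If gpuNs is far below the CPU-measured frame time, the GPU is sitting
    // idle and the bottleneck is on the CPU side, and vice versa.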


No change in FPS from 2xAA to 8xAA suggests that either the setting isn't doing anything or the game is extremely easy to run, and with the lighting and shadows we know the latter isn't accurate. Something is not right.

 

Modern cards have no trouble running high levels of AF, but AA in its normal form still has a large performance impact. AA done in shaders is much cheaper, but conventional AA is a performance hog and always has been.

Edited by Lux

Unless something else earlier in the pipeline is causing a bottleneck, leaving the shading stages with time to spare? That might also explain why I could put 60 dust fog billboards on screen and toggle them between hardware depth testing and my programmed soft particle shader without seeing a change in fps.


Ran with 2x/8x as suggested; this time I got 39 FPS (compared to 43 before with higher settings). Bumping AA and AS back up, I get the same result. Curious.

 

Testing this a bit, I notice that there is quite a bit of FPS fluctuation when even slightly changing position in front of the table.

 

Would it help to have a common save file that guarantees everyone has the exact same position?
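
A save would work, but stock idTech 4 also has console commands for pinning the camera to an exact spot, which might be even easier to share (example numbers only; check getviewpos's output for the exact argument order):

    getviewpos                     // prints the current position and angles
    setviewpos 634 -120 80 0 45    // paste the printed values back in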

Edited by kyyrma

Erm... what, if anything, does "turn off rendering" mean?

 

A cvar that turns off rendering. If nothing goes to the GPU and fps is still not great, then the GPU has nothing to do with the performance loss.
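
Stock idTech 4 already ships with skip cvars along these lines (names from the original Doom 3 source; worth verifying they survive in TDM):

    r_skipRender 1     // skip 3D rendering, still draw 2D
    r_skipBackEnd 1    // front end runs, but nothing is submitted to the GPU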

 

The issue is that AMD's latest most powerful card gives significantly lower fps in TDM than older nvidia cards, in the same machine. AMD have agreed there's a problem and have promised to try to fix it in the next driver release. Biker is gathering evidence to help that process and give feedback.

 

Did you see my post? I have a decent Nvidia GPU and my fps is lower than Bikerdude's.


A cvar that turns off rendering. If nothing goes to the GPU and fps is still not great, then the GPU has nothing to do with the performance loss.

 

Does this refer to a cvar that turns off hardware rendering? So in other words, using only the CPU to software render the game?

 

I'm not sure how a GPU rendering a game, and its performance issues while doing so, can in any way be related to any current CPU software rendering the same game.


Does this refer to a cvar that turns off hardware rendering? So in other words, using only the CPU to software render the game?

 

I'm not sure how a GPU rendering a game, and its performance issues while doing so, can in any way be related to any current CPU software rendering the same game.

 

Well, Doom 3 already does much of the computation for lights and shadows on the CPU (if not all of it). Doom 3 BFG has cvars that skip rendering of prelit, static and dynamic shadows. When you turn those off, the GPU isn't used to render them, yet the shadows are still calculated on the CPU, so there is no noticeable performance gain. In BFG you can also turn off GPU skinning, and you get a massive performance drop when you do. That's why I don't get how the GPU can be blamed in Doom 3 if the heaviest calculations are done on the CPU.
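
For reference, the cvars in question are along these lines (going from memory of the released BFG source, so treat the exact names as unverified):

    r_skipPrelitShadows 1     // skip the precomputed prelit shadow volumes
    r_skipStaticShadows 1     // skip static shadow volumes
    r_skipDynamicShadows 1    // skip dynamically generated shadow volumes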


Ok, I see what you mean. But we've already got past that point in testing. Whatever your particular card does, we're seeing drops from 60fps (older nvidia) to 35 fps (newest AMD) in the same spot and on the same machine. AMD have installed TDM on their test machines and reproduced the result. So it's not a CPU problem.

 

So I am showing you test results on my Nvidia, and you're rebuking them, saying "oh, it's different"? I am saying that there is nothing in the Doom 3 engine that can bog down a GPU. If you have some custom ARB shaders written without consideration for modern hardware, or use obsolete OpenGL functions, then it's a different story. I recall there was a cvar to not use ARB shaders during rendering. Why don't we all try running with and without shaders and compare results?
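
If memory serves, the cvar in stock Doom 3 was r_renderer, which selects the back-end path (a sketch; TDM may have stripped the legacy paths, so this may no longer apply):

    r_renderer arb     // legacy path without fragment programs
    r_renderer arb2    // the standard fragment-program path
    vid_restart        // the change takes effect after a restart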


Why don't we all try running with and without shaders and compare results?

Because that wouldn't give any results relevant to this discussion, which is about one very specific result: a new AMD card gives worse results than (some) older nvidia cards, everything else being equal, including the use of shaders. That's something we'd like to see fixed, irrespective of what other performance optimizations might be available, and thanks to Biker's inexhaustible patience in pressing his trouble ticket, AMD would like to see it fixed too.


Guys, the new GPUs are heavily optimized for DX11-level features: don't be surprised if an old GPU can beat today's AMD and NVidia GPUs with an old (10 years!) D3D8 / D3D9 / OpenGL 1.x (-> KotOR!) / OpenGL 2.x engine!

And don't be surprised by compatibility/performance problems in the Catalyst OpenGL driver (atioglxx.dll): it's an old story of a bad marriage :P

Edited by lowenz



Ah, well spotted. I hadn't even noticed the AA settings in people's posts. @SteveL, Kyyrma, Skina, Ralle321 and Hmart: can you set your AA/AS to 2x/8x and give your FPS again, please?

 

I don't use AA at all in any game, so my FPS isn't affected by it. But even so, I enabled it just to see the difference. Here it is: 32 to 34 FPS, so some impact but nothing of note.


On any card (ATi or Nv) the difference between 2xAA and 16xAA should be staggering

 

Not necessarily. If the rendering pipeline isn't limited by fillrate then adding/removing extra pixels to be rendered won't make much difference to performance. This is also why lowering your screen resolution won't necessarily speed up the rendering of a badly-performing game.
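
A quick way to test for a fill-rate limit is to cut the pixel count and watch what happens (a sketch using stock idTech 4 cvars):

    seta r_customWidth "960"     // a quarter of the 1920x1080 pixel count
    seta r_customHeight "540"
    seta r_mode "-1"
    vid_restart

If FPS barely moves with a quarter of the pixels, the scene isn't fill-rate bound.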

 

So I am showing you test results on my Nvidia, and you're rebuking them, saying "oh, it's different"? I am saying that there is nothing in the Doom 3 engine that can bog down a GPU

 

What part of "AMD confirmed an issue with their drivers" do you not understand? Or do you think AMD engineers are so stupid that they can't diagnose problems in their own drivers and need the benefit of your theorycrafting and random guesswork about what the Doom 3 engine "couldn't possibly" be doing?

