The Dark Mod Forums

Possible Performance Gain


Springheel


Why guess how it works if we have the code? :huh:

 

We don't have all the code. Anything relating to rendering, scenegraph traversal and the like takes place inside closed-source portions of the Doom 3 executable, which we have no access to.



Definite improvements of 10 to 25 fps with lg set at 30, and no flickering or any other side effects on a GTX 280. Not yet tried on an FM with a dynamic skybox though. But I'll be leaving my lg set at 30 for the time being to see how the slight response lag affects gameplay. If it's a problem, I'll look at results for 10, 15, and 20 to see what works best. It may be that 10 is almost as good as 30.

 

Without alerts I'm seeing 25-30 fps at worst anywhere in my FM, and mostly 50-60 fps. :) And I'm on fairly high settings, so this is hopeful for low-end systems with lower settings, assuming they don't get flicker or whatever.


Also, the interleaving depends on FPS. So interleave 10 at 60 FPS is 166 ms of lag, while at 10 FPS it becomes 1 s of lag. Again: don't set it to high values and you won't have trouble.
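To make that concrete, here's a minimal C++ sketch of the calculation (the function name is just illustrative, not anything from the TDM source):

// lag between lightgem updates, in seconds:
// one lightgem rendershot every `interleave` frames, at `fps` frames per second
float LightgemLagSeconds( int interleave, float fps ) {
    return interleave / fps;    // 10 / 60 ~= 0.17 s at 60 FPS; 10 / 10 = 1.0 s at 10 FPS
}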

 

That doesn't fit with my results. When I set the interleave to 2, I notice definite flickering in most lights. When I set it to 15, there is no flickering in the lights, but I notice it in the skybox. When I set it to 30, I notice no flickering at all. I don't notice any stuttering either (though maybe it will become more obvious after trying it for a while). Higher values seem to be causing *fewer* flickering problems.

 

I suspect that we never tried it at such high values, only testing values of <10.

 

I'd love to hear from someone who couldn't run the original RTTC. Try setting your value to 30 and see what happens.


That doesn't fit with my results. When I set the interleave to 2, I notice definite flickering in most lights. When I set it to 15, there is no flickering in the lights, but I notice it in the skybox. When I set it to 30, I notice no flickering at all. I don't notice any stuttering either (though maybe it will become more obvious after trying it for a while). Higher values seem to be causing *fewer* flickering problems.

 

I suspect that we never tried it at such high values, only testing values of <10.

 

I'd love to hear from someone who couldn't run the original RTTC. Try setting your value to 30 and see what happens.

 

How will such high values affect alerts though? At 30, I see very noticeable delays in the light gem. It's no longer smooth and takes time to switch...basically jumping from light to dark.

 

There seem to be rather different side effects to this setting from system to system, which worries me. You see flickering of particles and no stuttering when you move, but I get stuttering and no flickering. Unfortunately, for me the game is unplayable...despite the fps increase. Yet, for whatever reason everything smooths out when I start recording a video with fraps. If there was some way to stop the stuttering, it would be great...otherwise it looks like a flip card animation. lol


Definite improvements of 10 to 25 fps with lg set at 30, and no flickering or any other side effects on a GTX 280. Not yet tried on an FM with a dynamic skybox though. But I'll be leaving my lg set at 30 for the time being to see how the slight response lag affects gameplay. If it's a problem, I'll look at results for 10, 15, and 20 to see what works best. It may be that 10 is almost as good as 30.

 

Someone already said that with a lg at 30, you have the problem that the AI will take much longer to notice you...

 

A hard cap at 10 seems very sensible to me; personally I wouldn't go higher than 5. Sure, it might improve performance, but so does removing all AI completely :)

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." -- George Bernard Shaw (1856 - 1950)

 

"Remember: If the game lets you do it, it's not cheating." -- Xarax


So your movement wasn't choppy at all?

Movement was fine. No problems.

Also no visual issues on skybox, flames, lightgem, etcetera. ...

my specs: Win XP SP3, AMD 4200+ Dual-Core, 2 GB RAM, Nvidia 8800 GT, driver build 178.28.

Antialiasing off, Vsync off, all other advanced graphics enhancements enabled.

 

The absence of issues so far may be because 'Too Late' is an undemanding map that already averaged a good ~40 fps before setting the interleave to 5. Tonight I'll check the boost and visuals with a demanding FM like Baddcog's 'The Rift', which had been barely playable on my system at the temple due to very low fps.

Edited by fllood

"To rush is without doubt the most important enemy of joy" ~ Thieves Saying


How will such high values affect alerts though? At 30, I see very noticeable delays in the light gem. It's no longer smooth and takes time to switch...basically jumping from light to dark.

 

It takes about half a second to switch for me. That's noticeable, but a half-second isn't going to have any significant gameplay impact. I'd trade that for a gain of 20+ FPS if there are no other side effects.

 

There seem to be rather different side effects to this setting from system to system, which worries me

 

It may be related to some kind of graphics card or driver issue, who knows? Anyway, I'm not proposing that we change the default value (unless we stumble on a perfect number that works for everyone), just that we let people know that they can tinker with the setting and possibly achieve a much higher FPS (assuming further testing reveals no serious side effects).

 

I'm particularly thinking of the people on TTLG who say they can't play TDM because their computer isn't fast enough. A boost like this might be the difference between playing TDM and not.


Tonight I'll check the boost and visuals with a demanding FM like Baddcog's 'The Rift', which had been barely playable on my system at the temple due to very low fps.

 

Results for 'The Rift' aren't as good as they were with 'Too Late':

 

In the street area fps were boosted ~10-20%. Water particles seem OK. Sky is OK. Lights with dynamic brightness that pulse their ambient (like from some lanterns and windows) noticeably change more abruptly. In areas with fps bottlenecks, like at the temple, fps were not observably increased, but movement got choppy as NH mentioned and shadows cast from torches looked less fluid.

 

It seems that the performance gain from the interleave decreases the lower a map's fps is, while visual and gameplay issues become more likely at the same time. 'Too Late' is a small FM with high fps, which would explain the absence of noticeable problems in that test. Too bad - the fps bottlenecks are where a boost really would have been awesome on low-end machines.

Edited by fllood

"To rush is without doubt the most important enemy of joy" ~ Thieves Saying


Unfortunately, this confirms my suspicion that only higher end systems would benefit from the fix. Dropping my resolution to the bare minimum of 640 x 480 allowed me to run the fix at settings from 2 to 30 without any issues, but as soon as I cranked the resolution back up and my fps ceiling wasn't as high, the stutters returned. Disappointing.

 

Maybe there is a way to deal with this in the future.


Unfortunately, this confirms my suspicion that only higher end systems would benefit from the fix.

 

Yeah, that's a problem. But if the underlying reason this tweak was originally dropped is what Tels mentions:

 

Someone already said that with a lg at 30, you have the problem that the AI will take much longer to notice you...

 

Then this fix is definitely unsatisfactory right there. If it affects the AI's perception of the player to the degree described, then it's a bust. I'm sure if you just turned the lightgem off and made the player completely invisible, so the AI didn't have to "think" about the player at all.. FPS would be amazin'! :laugh:


I tried 30 on my slower computer and everything became very jerky. While it might not be noticeable if someone's framerate drops from 60 to 50 every half-second, it's very noticeable when the framerate drops from 30 to 7 every half second.

 

I tried different values and when I reached 2, the framerate stopped fluctuating. I can't really tell if it's faster than when it's at 1.

 

So I can't see this allowing slower computers to play the complex missions. (Darn!)


Definite improvements of 10 to 25 fps with lg set at 30, and no flickering or any other side effects on a GTX 280. Not yet tried on an FM with a dynamic skybox though. But I'll be leaving my lg set at 30 for the time being to see how the slight response lag affects gameplay.

 

I'm running an HD 5870 / C2Q Q9650 and I have it set to 30, and I'm not experiencing any side effects other than the increased fps across the board :-). I have just tested this on the new 'classic' version of RTTC in a window @ 1152x864 and full screen @ 1680x1050, 2xAA, 8xAS, Vsync on, normal & default, and I'm getting 63 fps everywhere, with two spots where long view distance and multiple AI cause it to drop to 50 fps.


When I searched for max fps cvars, I read somewhere that the physics system of D3 is always calculated as if the game ran at 60 fps, i.e. 60 Hz. Could the same be done for the lightgem rendershot timing? Maybe use a timer that sets a flag when a new rendershot is due? This would get rid of the problem of fps-dependent lightgem update delays, and the interleave value would translate directly into an update frequency (60 Hz divided by the interleave). If we do this, though, the performance gain in low-fps areas won't be as strong anymore, because (correctly) more lg rendershots are performed than before.
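Roughly what I have in mind, as a minimal C++ sketch only (the names and the 50 ms interval are made up for illustration, not actual TDM code):

// take a new lightgem rendershot whenever enough game time has passed,
// instead of counting frames, so the update rate no longer depends on fps
const int LG_UPDATE_INTERVAL_MS = 50;   // ~20 lightgem updates per second
int lastLightgemShotTime = 0;

bool LightgemShotDue( int gameTimeMs ) {
    if ( gameTimeMs - lastLightgemShotTime >= LG_UPDATE_INTERVAL_MS ) {
        lastLightgemShotTime = gameTimeMs;
        return true;    // render a new lightgem shot this frame
    }
    return false;       // reuse the previous lightgem value
}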


I tried 30 on my slower computer and everything became very jerky. While it might not be noticeable if someone's framerate drops from 60 to 50 every half-second, it's very noticeable when the framerate drops from 30 to 7 every half second.

 

I tried different values and when I reached 2, the framerate stopped fluctuating. I can't really tell if it's faster than when it's at 1.

 

So I can't see this allowing slower computers to play the complex missions. (Darn!)

 

I can see that if you have an old GFX card, you run into problems with the rendershot, which basically renders two new camera angles (shot down/up). If your system barely manages to hold the current scene in video memory, and then the scene changes, it needs to swap in a lot of stuff, then swap it out again. That means you indeed get stutters whenever the lightgem shot is taken.

 

With lg at 1, you don't notice the stutter (because all frames are very slow), but with 30 you notice a big impact every 30 frames.

 

Or in other words, the old adage "10 constant FPS are better than 60 FPS with 2 FPS every now and then" :)

 

I think we should add a "print time of frame to console" CVAR; then someone could run a mission, open the console, dump the text, and we could convert this to a graph. You'd then see exactly how the FPS fluctuate.
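Something along these lines, as a rough sketch (assuming the D3 SDK's idCVar and common->Printf; the cvar name is made up):

// hypothetical cvar: when enabled, print the duration of every frame to the console
idCVar tdm_showFrameTime( "tdm_showFrameTime", "0", CVAR_GAME | CVAR_BOOL,
                          "print the time taken by each frame to the console" );

void PrintFrameTime( int frameStartMs, int frameEndMs ) {
    if ( tdm_showFrameTime.GetBool() ) {
        common->Printf( "frame time: %d ms\n", frameEndMs - frameStartMs );
    }
}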

 

In any event setting lg to 5 should be a safe bet and improve performance with almost no ill effects, even if your map is already running at 2 FPS :)

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." -- George Bernard Shaw (1856 - 1950)

 

"Remember: If the game lets you do it, it's not cheating." -- Xarax


In any event setting lg to 5 should be a safe bet and improve performance with almost no ill effects

 

Are you basing that on anything specific? When I set mine to a low number like 2 or 5 I get worse distortion than at higher values.


Are you basing that on anything specific? When I set mine to a low number like 2 or 5 I get worse distortion than at higher values.

 

Er, the "no ill effects" was based on "it shouldn't decrease performance or increase FPS stuttering NOR should it make the AI no longer see you dashing across a brightly lit corridor". I am not sure what the problem is with your system - why does it flicker at all? It shouldn't even do this.

 

Do you also get the flickering if it is set to 1? *puzzled*

 

Edit: What are your settings? Antialiasing? Vsync, etc.? Graphics card and driver version?

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." -- George Bernard Shaw (1856 - 1950)

 

"Remember: If the game lets you do it, it's not cheating." -- Xarax


In any event setting lg to 5 should be a safe bet and improve performance with almost no ill effects, even if your map is already running at 2 FPS :)

 

Tels. lol Earlier, I said that ANY setting above 1 causes stuttering on my system. At 1, everything is smoooooooth, but at 2 or higher everything is stu...uu...uu...uu..uutery. It's slightly less noticeable at 2, but it's still unusable.

 

While not a killer system...i3 2.13Ghz, Geforce 105m 512megs (dedicated memory), 4 Gigs ram....it's better than my old desktop. There has to be a way to smooth things out though, since something as simple as turning on fraps smooths out performance. I guess the extra fraps overhead of recording a video prevents the CPU from spiking up and down.

 

I am not sure what the problem is with your system - why does it flicker at all? It shouldn't even do this.

 

It flickers because this has always been a potential side effect for 'some' cards, in fact it was a lot worse in the earlier versions of the lightgem. Spar didn't have access to the renderer so he had to guess about a lot of stuff. Short answer, it will flicker on some but not on all cards at different interleave settings. We likely won't be able to fix that until D3 is open source since it's just shooting in the dark. Perhaps the stuttering will be fixable.


Tels. lol Earlier, I said that ANY setting above 1 causes stuttering on my system. At 1, everything is smoooooooth, but at 2 or higher everything is stu...uu...uu...uu..uutery. It's slightly less noticeable at 2, but it's still unusable.

 

Sorry, I missed that. What makes me curious, however, is how it can stutter at 2 but not at 1?

 

I mean, 1 means:

 

* 1 frame rendered, 1 LG shot rendered (top)

* 1 frame rendered, 1 LG shot rendered (bottom)

* 1 frame rendered, 1 LG shot rendered (top)

* 1 frame rendered, 1 LG shot rendered (bottom)

* 1 frame rendered, 1 LG shot rendered (top) and so on

 

and 2 means:

 

* 1 frame rendered, 1 LG shot rendered (top)

* 1 frame rendered

* 1 frame rendered, 1 LG shot rendered (bottom)

* 1 frame rendered

* 1 frame rendered, 1 LG shot rendered (top) and so on

 

So why does that stutter more?
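In code terms, the counter behind those two patterns would look roughly like this (just a C++ sketch with made-up names, not the actual TDM code):

// decide whether this frame should also render a lightgem shot;
// interleave 1 = every frame, interleave 2 = every other frame, and so on
int lgFrameCounter = 0;

bool RenderLightgemThisFrame( int interleave ) {
    if ( ++lgFrameCounter >= interleave ) {
        lgFrameCounter = 0;
        return true;    // shot frame; top and bottom alternate when the shot is split
    }
    return false;       // ordinary frame, no lightgem shot
}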

 

While not a killer system...i3 2.13Ghz, Geforce 105m 512megs (dedicated memory), 4 Gigs ram....it's better than my old desktop. There has to be a way to smooth things out though, since something as simple as turning on fraps smooths out performance. I guess the extra fraps overhead of recording a video prevents the CPU from spiking up and down.

 

Have you tried "binding D3 to only one CPU" and/or disabling "variable CPU frequency" (usually found in the power saving settings)?

 

I mean, if adding fraps to the mix smoothes things out, something is wrong indeed :)

 

It flickers because this has always been a potential side effect for 'some' cards, in fact it was a lot worse in the earlier versions of the lightgem. Spar didn't have access to the renderer so he had to guess about a lot of stuff. Short answer, it will flicker on some but not on all cards at different interleave settings. We likely won't be able to fix that until D3 is open source since it's just shooting in the dark. Perhaps the stuttering will be fixable.

 

I think that both the stutter and the flicker are not "normal" and that we should mention it. Unfortunately, I think this also prevents us from ever using >1 as the default :( But at least we can document it.

"The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." -- George Bernard Shaw (1856 - 1950)

 

"Remember: If the game lets you do it, it's not cheating." -- Xarax


I tried this set to 5 and everything looks great, except... if you mantle on top of a light source (like I did in NHAT2), the lens flare starts flashing. I tried setting it to 25 and the flashing was much less rapid, so this must be the cause. Setting it to 1 eliminated the flashing.

--- War does not decide who is right, war decides who is left.


So why does that stutter more?

 

My guess would be that some internal state in the renderer (e.g. particle orientation or the like) is retained between the lightgem rendershots and the subsequent regular frame. Therefore the visual result can vary depending on whether the previous render was a top LG shot, a bottom LG shot, or a regular frame.


* 1 frame rendered, 1 LG shot rendered (top)

* 1 frame rendered, 1 LG shot rendered (bottom)

 

I don't think lg_split is on by default.

