The Dark Mod Forums


Posted (edited)

Maybe you don't know this, but recent builds of Windows 10 on multi-GPU systems automatically pick the "best" GPU for rendering (in D3D11 and recent OpenGL applications), which makes an Optimus-like configuration possible on desktop systems too: the desktop is shown only on the monitor connected to the iGPU (the "integrated" GPU on the processor die), and the output of the GPU doing the real work (typically the dedicated one on the classic PCI-Express card) is routed to it by copying the framebuffer through main system memory.
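
To get a feel for how much data that copy actually moves, here's a rough back-of-the-envelope sketch (1080p, 32-bit colour and 60 Hz are assumed values, not measurements):

# Rough estimate of the extra system-memory traffic caused by copying the
# dedicated GPU's framebuffer over to the iGPU-driven display.
# Assumptions (not measured): 1920x1080, 32-bit colour, 60 Hz refresh.
width, height = 1920, 1080
bytes_per_pixel = 4                      # e.g. BGRA8
refresh_hz = 60

frame_bytes = width * height * bytes_per_pixel
print(f"one frame: {frame_bytes / 2**20:.1f} MiB")                    # ~7.9 MiB
print(f"copy traffic: {frame_bytes * refresh_hz / 2**30:.2f} GiB/s")  # ~0.46 GiB/s at 60 Hz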

The results, so you can see the actual overhead of the framebuffer copy/swap process:

Pure Dedicated GPU (GeForce 1050 Ti):

0.jpg

"Optimus-Like" configuration (Intel HD 630 + GeForce 1050 Ti)

1.jpg

 

Not bad! Only a 9.5% performance hit on a system with DDR4-2400 (so with better RAM modules and a better IMC I think you can cut it to around 5%).

Of course you can always force which GPU an application renders on through the dedicated interface in "Display Settings -> Graphics settings".
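
If you prefer to script it, the same per-application preference is (as far as I know) stored under HKCU\Software\Microsoft\DirectX\UserGpuPreferences on Windows 10 1803 and later. A minimal sketch, with the key path, value format and executable path all to be treated as assumptions:

import winreg

# Hypothetical install path; point this at the executable you actually want to pin.
exe_path = r"C:\Games\darkmod\TheDarkModx64.exe"
pref = "GpuPreference=2;"   # 2 = high-performance GPU, 1 = power-saving GPU (assumed encoding)

key = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                       r"Software\Microsoft\DirectX\UserGpuPreferences")
winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, pref)
winreg.CloseKey(key)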

Edited by lowenz

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.

Posted
19 hours ago, lowenz said:

Not bad! Only a 9.5% performance hit on a system with DDR4-2400 (so with better RAM modules and a better IMC I think you can cut it to around 5%).

Good to hear things are moving in that direction, but 10% (1.5 ms) is inexplicably slow.

A 1080p framebuffer is just 8 MB of data. That means the effective transfer speed is ~5 GB/s. Meh!

Compare that to the promised ~15 GB/s of PCIe 3.0 x16. Unless, of course, you're running in x8 or 2.0 mode.
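
For reference, the arithmetic behind that estimate (a sketch using the 8 MB frame size and ~1.5 ms overhead quoted above):

# Effective transfer speed implied by the measured overhead.
frame_mb = 8.0        # ~1080p framebuffer at 32-bit colour
overhead_ms = 1.5     # extra frame time attributed to the copy
effective_gb_per_s = frame_mb / overhead_ms   # MB per ms is numerically GB per s
print(f"effective transfer speed: {effective_gb_per_s:.1f} GB/s")   # ~5.3 GB/s
# PCIe 3.0 x16 peaks around 15-16 GB/s, so the copy path runs well below the bus limit.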

Posted

3.0 x16 here!

But DDR4 2400 only.

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.

Posted

Dual channel, yes, but the limitation lies in the IMC and PCI-E management, as you say. Still, more memory speed (latencies are already low on my system) could give clearer results.

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.

Posted (edited)

I can confirm everything is working as intended!

I've literally unplugged the monitor from the GeForce, but I'm still getting a solid 55-56 FPS in TDM 😛 (in the scene above).

With NO wizardry in Windows at all, it's just there.

Edited by lowenz

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.

Posted

It does not work the other way around, though.

When the monitor is plugged into the nVidia card, it won't use the IGP even when that's configured in Settings :(

Job half done, as always with M$.

Posted (edited)

Good old Source (SDK 2013 Single Player, upcoming branch), all maxed out (4x MSAA too), with a 999 FPS cap:

Pure Dedicated GPU (Radeon RX 570 / 2050 MHz VRAM) -> 548 FPS

Optimus-Like configuration (Intel HD 630 + Radeon RX 570 / 2050 MHz VRAM) -> 379 FPS

 

So the overhead is (548 - 379) / 548 ≈ 31%.
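
Put in per-frame terms (a sketch based on the two FPS readings above), the absolute cost of the copy is still under a millisecond even at these framerates:

# Per-frame cost of the framebuffer copy, derived from the two FPS readings.
fps_dedicated, fps_optimus = 548, 379
frame_ms_dedicated = 1000 / fps_dedicated    # ~1.82 ms
frame_ms_optimus = 1000 / fps_optimus        # ~2.64 ms
print(f"copy cost per frame: {frame_ms_optimus - frame_ms_dedicated:.2f} ms")     # ~0.81 ms
print(f"relative overhead: {(fps_dedicated - fps_optimus) / fps_dedicated:.0%}")  # 31%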

Edited by lowenz

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.

Posted (edited)

With a modern engine like Unreal Engine 4 in Borderlands 3, the overhead is 0 (yes, ZERO).

So maybe the main issue is the texture streaming optimization that is missing in old engines?

 

Borderlands 3 gets the same average framerate in its built-in benchmark!

Edited by lowenz

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.
