The Dark Mod Forums

Post your System Configuration here


greebo


1 hour ago, AluminumHaste said:

So I did something a bit nuts last night.

I've been wanting to replay some old games that heavily used PhysX, but they run like crap on my AMD Radeon system.
Games like Mirror's Edge 1, the Batman series, Metro Exodus, Borderlands 2, etc.

I had a spare EVGA 980 Ti sitting on the shelf, so I took the computer apart and added it to my system.

After playing around with PCIe lane splitting, I was able to get x8 Gen 4 to the 7900 XTX and x4 Gen 3 to the 980 Ti.

Also, using the DirectX to Vulkan wrapper for Mirror's Edge, I was playing the game at 1440p, 360 fps, with PhysX on and it just worked.

I did have to do some fiddling with software though: I installed JUST the graphics driver from 2021, no control panel, and the latest PhysX software. Of course you have to fiddle with each game to get it to use the latest PhysX binaries instead of the included ancient ones, but once I got it working, the Batman: Arkham City benchmark was running at an average of 240 fps with a max of 360 fps with PhysX on.
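For anyone wanting to try the same thing, the per-game fiddling mostly comes down to moving the game's bundled (ancient) PhysX DLLs out of the way so it falls back to the system-wide PhysX install. A rough sketch of the idea in Python, with a made-up install path and typical PhysX 2.x-era DLL names, so adjust for your game and back things up first:

from pathlib import Path

# Hypothetical example: make a PhysX 2.x-era game use the system-wide PhysX runtime
# by moving its bundled DLLs aside. Path and DLL names are illustrative only and
# vary per game; check the game's binaries folder first.
game_dir = Path(r"C:\Games\SomePhysXGame\Binaries\Win32")   # example path, not a real install
bundled = ["PhysXLoader.dll", "PhysXCore.dll", "PhysXDevice.dll"]  # typical 2.x-era names

for name in bundled:
    dll = game_dir / name
    if dll.exists():
        # rename rather than delete, so it's easy to roll back if the game refuses to launch
        dll.rename(dll.with_name(name + ".bak"))
        print(f"moved aside {name}")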

 

 

Oh, I never realized that is possible now; I recall Nvidia not letting AMD users use an Nvidia GPU as a PhysX accelerator!?

And btw, I don't think the extra PhysX effects are worth the extra heat and energy consumption of having two GPUs in a system.

And I say this as someone who (unfortunately) bought an Ageia PPU and played the hell out of CellFactor and Warmonger, especially that last one.


Well, even at full PhysX rendering the Nvidia GPU is only using about 40 watts. Compared to the power draw of the 14700K and 7900 XTX at 360 FPS, it's not even noticeable.


Yeah, at some point several years ago they gimped the PhysX driver so it could only run with GPU acceleration if no AMD card was detected. As expected they got quite a boatload of flak for that move, but it persisted for some generations of the PhysX driver before they removed it again.


Can be a fun project (or drive you to madness) :) It can be fiddly to get the cards to work together, I can attest to that, having used an old Nvidia card as an accelerator for my no-longer-used AMD R9 290X. When it works it works beautifully, but it does not really make sense for games that don't use PhysX.


Was working on my mother's PC this weekend.

AMD E-300 APU with HD 6310 graphics.

Doing everything I could from the performance tweak guide, I got about 18 FPS in "A New Job" (at most).

Wasn't expecting miracles; the thing barely runs Doom 3. Just morbid curiosity.

 

I've been pondering setting up a Linux partition on it.

Modern (updated) Windows 10 is atrocious with a spinning HDD, so it might be nice to have something a little snappier for browsing (etc.).

Downside is that Windows 10 tends to funk up dual-boot systems when the OSes exist on the same HDD.

I don't think my mom is gonna want to learn to use a USB boot repair and cross fingers every time she updates.


Ok, so I think I might have found the reason for the performance delta between my wife's computer and mine. While working on getting the Nvidia card working, I manually set the 1st PCIe slot to x16 Gen 4, whereas it was at x4 Gen 3 before.

HWinfo confirms this:

(PCIe v4.0 x16 (16.0 GT/s) @ x16 (16.0 GT/s))

Now in Scroll of Remembrance at the start, I get 140 fps. It must have been bandwidth-starved at x4 Gen 3 PCIe.
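For reference, the rough bandwidth math: PCIe 3.0 runs at 8 GT/s per lane (about 0.985 GB/s after 128b/130b encoding), so x4 Gen 3 is only about 3.9 GB/s. PCIe 4.0 doubles the per-lane rate to roughly 1.97 GB/s, so x16 Gen 4 comes to about 31.5 GB/s, around eight times the bandwidth.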


Yeah, even Win7 was starting to have speed problems with a mecha monster (mechanical hard drive) x). One huge improvement would be smacking a cheap SSD in it and using the old bookweight as a backup drive :) but since it's a laptop the books need to be light hehe.

The E-300 is a Bobcat-series chip, so not exactly fast per core, and its age (released 2011) means it's treading in hazardous territory even on Win7/8. But can it run 10? Sure ;) Will it be fast? Eh, nope...

Some more RAM might also help a bit, since Win10 runs best with at least 8 GB.

 

1 hour ago, AluminumHaste said:

Ok, so I think I might have found the reason for the performance delta between my wife's computer and mine. While working on getting the Nvidia card working, I manually set the 1st PCIe slot to x16 Gen 4, whereas it was at x4 Gen 3 before.

HWinfo confirms this:

(PCIe v4.0 x16 (16.0 GT/s) @ x16 (16.0 GT/s))

Now in Scroll of Remembrance at the start, I get 140 fps. It must have been bandwidth-starved at x4 Gen 3 PCIe.

Sounds likely. If your gfx card is PCIe 4.0 though, some PCIe 4.0 cards actually run better in PCIe 3.0, heh.

 


Doh, I remember now, I actually had an E-300 laptop myself at some point 😂 Mine was from Asus and came with Win7 32-bit, though it could actually run the 64-bit version as well, but it was damn slow even with the 32-bit version. So yeah, I already went through upgrading it, and I remember now that even with an SSD it was a slow starter (not that weird tbh, the CPU was meant for the mobile market; that it could actually run Win7 was pretty fantastic).


Also, you have to be careful when adding new cards, as the PCIe slot allocation is determined by the CPU. I found that out the hard way back when the X99-A II was running with the original 6800K. The 6800K only has 28 PCIe lanes: Wi-Fi steals 4 of those, the extra USB3 ports steal 2 more lanes, the NVMe also runs away with 4 lanes, and the gfx card muscles in and gobbles up 16. So when I added the Thunderbolt card that came with the board, I suddenly noticed my gfx card was running at x8 instead of x16 🤣 because the Thunderbolt card also took 4 lanes. With the 6950X I have 40 lanes, so plenty of room for everything as long as I don't go SLI, in which case I'd be back to square one with at least one of the cards running at x8.
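To put that in numbers using the counts above: 16 (gfx) + 4 (NVMe) + 4 (Wi-Fi) + 2 (extra USB3) + 4 (Thunderbolt) already adds up to 30 lanes, more than the 28 the 6800K provides, so something has to give and the gfx link gets negotiated down to x8.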

Some boards use a special chip for multiplying PCIe lanes, but it does take a toll on performance, so I haven't seen many of those lately.

Still, with the 6800K I had to either disable the extra USB3 ports or the onboard Wi-Fi if I wanted an NVMe. I don't normally use a ton of USB3 ports, but dragging a cable for my internet into my room meant 20 m of cable, so I opted for the Wi-Fi and the extra USB3 ports were turned off instead.


  • 2 weeks later...

One weird thing I noticed with the X99 was that the gfx port downscaled to 16_1 when not running something that required the muscle; as soon as I started a game it went up to 16_3 (verified with GPU-Z). My older X79 always ran in 16_3 even though it did not need to, so it seems there were some changes in the PCIe handling on later CPUs.

When the Intel X299 came out the PCIe lane count went up a bit, 48 I think it is now. Not sure if AMD also upped theirs?

 


  • 2 months later...

My specs:
- Ubuntu 22.04.4 LTS
- Quad core Intel Core i7-3632QM
- 8 GB
- NVIDIA GeForce GT 635M, Intel 3rd Gen Core processor Graphics
- X.Org X Server - Nouveau display driver

Runs smoothly with these settings:
Resolution: 1024x600
V-sync: Off
Anti-aliasing: Off
Texture Anisotropy: 1x
LOD: Low
Shadows: Stencil
Soft Shadow Quality: Off
Max FPS: 60
Color Precision: 32 Bits
Ambient Occlusion: Off
Bloom: Off
Frontend Acceleration: On

The only problem is that every time I quit the game, it won't change back to the screen's native resolution (1600x900). It just stays at 1024x600.


4 hours ago, jagedew said:

My specs:
- Ubuntu 22.04.4 LTS
- Quad core Intel Core i7-3632QM
- 8 GB
- NVIDIA GeForce GT 635M, Intel 3rd Gen Core processor Graphics
- X.Org X Server - Nouveau display driver

Runs smoothly with these settings:
Resolution: 1024x600
V-sync: Off
Anti-aliasing: Off
Texture Anisotropy: 1x
LOD: Low
Shadows: Stencil
Soft Shadow Quality: Off
Max FPS: 60
Color Precision: 32 Bits
Ambient Occlusion: Off
Bloom: Off
Frontend Acceleration: On

The only problem is that every time I quit the game, it won't change back to the screen's native resolution (1600x900). It just stays at 1024x600.

Why not use the Nvidia proprietary driver?

Instead of choosing a lower resolution, you should lower your "render scale". That way there will be no mode switching when you go to the desktop and the GUI will remain at native resolution.
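And if the desktop does get stuck at 1024x600 after quitting, on X.Org you can usually snap it back from a terminal with xrandr, e.g. "xrandr -s 1600x900" (or "xrandr --output <your output> --mode 1600x900" if you want to target a specific output; run plain "xrandr" to see the output names).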

Also, even if you set Max FPS to 60, we recommend that you select Uncapped FPS mode because it works better than the old Doom 3 capped mode.

See also:

https://wiki.thedarkmod.com/index.php?title=Performance_Tweaks
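For anyone who prefers tweaking from the console or Darkmod.cfg instead of the menus: the video options are all backed by cvars (for example v-sync is r_swapInterval, texture anisotropy is image_anisotropy, and Uncapped FPS toggles com_fixedTic). I'm going from memory on the names, so check the Performance Tweaks page above or the console's auto-completion for the rest, such as the render scale and color precision cvars.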


I would recommend setting uncapped FPS mode, then turning v-sync on and setting Color Precision to 64 bits; 32-bit just looks ugly. Texture Anisotropy: a bit higher. It doesn't have much effect on performance in my experience, so try setting it to 8 and see what the difference is. I would also try playing with soft shadows on (Stencil), but on the low setting and with the slider all the way to the left.
You can try disabling AA but force-enabling FXAA in the driver; this looks better than nothing, and performance is much better than with the game's AA setting.

Also, there's a massive difference in performance between missions, so it's at least worth trying to tweak your settings on older missions.


On 9/7/2024 at 5:31 PM, nbohr1more said:

Why not use the Nvidia proprietary driver?

Instead of choosing a lower resolution, you should lower your "render scale". That way there will be no mode switching when you go to the desktop and the GUI will remain at native resolution.

Also, even if you set Max FPS to 60, we recommend that you select Uncapped FPS mode because it works better than the old Doom 3 capped mode.

See also:

https://wiki.thedarkmod.com/index.php?title=Performance_Tweaks

I have had some issues with the proprietary drivers before: strange resolution issues on the desktop and especially on the login screen. It was nvidia-driver-390, I think. That's why I have been using Nouveau.


On 9/7/2024 at 9:12 PM, datiswous said:

I would recommend setting uncapped FPS mode, then turning v-sync on and setting Color Precision to 64 bits; 32-bit just looks ugly. Texture Anisotropy: a bit higher. It doesn't have much effect on performance in my experience, so try setting it to 8 and see what the difference is. I would also try playing with soft shadows on (Stencil), but on the low setting and with the slider all the way to the left.
You can try disabling AA but force-enabling FXAA in the driver; this looks better than nothing, and performance is much better than with the game's AA setting.

Also, there's a massive difference in performance between missions, so it's worth at least to try to tweak your settings on older missions.

I really don't see the difference between 64-bit and 32-bit. FPS is uncapped, yes, but V-sync and AA have been kept off. I'm still checking which shadow type works best.

I have noticed the performance difference, not only between missions but within missions, especially if there are tons of guards roaming everywhere and lots of fancy light effects which cannot be put out by water arrows.


3 hours ago, jagedew said:

I really don't see the difference between 64-bit and 32-bit. FPS is uncapped, yes, but V-sync and AA have been kept off. I'm still checking which shadow type works best.

I have noticed the performance difference, not only between missions but within missions, especially if there are tons of guards roaming everywhere and lots of fancy light effects which cannot be put out by water arrows.

64-bit really reduces color banding in gradients, like when looking at soft shadows or skyboxes etc. There's almost no performance penalty, so leave it on.
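(The reason: 32-bit color only gives 8 bits, i.e. 256 steps, per channel, so slow gradients in skies and shadow falloff show visible bands, while the 64-bit mode works with 16 bits per channel, enough precision that the banding mostly disappears.)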


For the most part I'd reckon 32-bit has better optimization because of the restrictions in RAM usage; with the coming of 64-bit it seems most attempts at memory optimization went out the window, since 64-bit can handle a huge amount (not that the user necessarily has a huge amount of RAM, but I suspect that's secondary to the game devs).

It does help with high-detail textures, which is why it might look better; 32-bit is kinda locked to 4 GB of RAM or VRAM usage, and some textures today might easily surpass that.


One strange thing is that this limit was set by MS on all but their server versions, which could access up to 128 GB of RAM via PAE (Physical Address Extension); some Linux variants also used PAE.

Some Intel HD drivers were buggy with PAE, which is kinda funny since the PAE extension was created by Intel 😂.

So in essence it is limited by ntoskrnl; you can patch it, though, to allow more than 4 GB of RAM.
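For the curious, the math behind the limit: a 32-bit pointer can address 2^32 bytes = 4 GiB, which is why a 32-bit process (and a non-PAE 32-bit kernel) tops out there. PAE widens physical addressing beyond 32 bits (originally to 36 bits, i.e. 64 GiB, and further on later CPUs), but each individual process still only gets a 4 GiB virtual address space.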

