The Dark Mod Forums


Posted (edited)

I'm playing Metro: Last Light, also very nice, from a giveaway some time ago; not a problem for my gfx card. The min specs are the same, so I'll try this one too.

 

Edit: as said, it works flawlessly with high fps; it adjusts automatically to the system specs for the best quality possible.

Edited by Zerush
Posted

Not sure about Last Light, but Metro had a bug in AP: when you set it to anything above 2 it literally bogged the game down to a crawl (at least when playing it in 4K). This happens because it scales the scene to 4x native resolution, so way above 8K.

If I set it to off I get 179 fps in 4K; 2x drops that to 74 fps and 4x drops to about 20 😂
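Some back-of-envelope arithmetic (my own sketch, assuming the multiplier is applied per axis, which would match the "way above 8K" behavior described above):

```python
# Shaded pixels per frame when a supersampling factor scales each axis.
# Assumption (mine, not from the post): the buggy setting multiplies
# width and height, so "4x" at 4K renders a 15360x8640 internal image.

def supersampled_pixels(width, height, factor):
    """Pixels shaded per frame if `factor` scales each axis."""
    return (width * factor) * (height * factor)

native = supersampled_pixels(3840, 2160, 1)  # 8,294,400 (native 4K)
ss2x = supersampled_pixels(3840, 2160, 2)    # 33,177,600 (7680x4320 = 8K)
ss4x = supersampled_pixels(3840, 2160, 4)    # 132,710,400 (15360x8640)

print(ss2x // native, ss4x // native)  # 4 16 -> 4x and 16x the pixel load
```

The observed drops (179 → 74 → 20 fps) are shallower than the raw 4x/16x pixel increase, which is expected since frames are not purely fill-rate-bound.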

Posted
3 minutes ago, revelator said:

Not sure about Last Light, but Metro had a bug in AP: when you set it to anything above 2 it literally bogged the game down to a crawl (at least when playing it in 4K). This happens because it scales the scene to 4x native resolution, so way above 8K.

If I set it to off I get 179 fps in 4K; 2x drops that to 74 fps and 4x drops to about 20 😂

No such issues on my laptop, both working fine without any frame drops 🤔

Maybe it's related to my specs.

Win 11 24H2 Home

hb78Xy.png

Posted

The original Metro 2033 used MSAA, which had an even worse hit on fps. The Redux version uses the Last Light renderer, which shifted antialiasing to SSAA; that still has a massive hit on fps, but far less than MSAA.

The benchmark utility does not have a lot of settings to play with, but the SSAA setting is there :) try setting it to 4x in very high mode and watch the show.

The original needed 3x GeForce 680 Ti in SLI to render it at 60 fps at 1080p on the highest setting with 4x MSAA. Granted, the 680 Ti is massively outdated, but it does tell you a bit about the muscle it needed.

My test was done on a 2080 Ti and an RTX 3070, on a Ryzen 5800X with 64 GB RAM.

Posted (edited)

Ah, OK. I can only say that I don't notice any issues in either Metro at the default settings. Maybe it has also been patched in the meantime. The only game where I noticed fps drops at higher settings was Sicaria, the free stealth game. Currently I'm also playing Call of Juarez: Gunslinger, which also shows up sometimes in the giveaways, normally for little money (€2-3) on Steam. A very nice FPS, well optimized, with very good graphics. Somewhat hardcore gameplay. Old but gold.

V8PTpa.png

Edited by Zerush
  • Like 1
  • 2 weeks later...
Posted (edited)
On 3/23/2025 at 6:38 PM, Zerush said:

A nice one, and yes, it reminds me a bit of the Myst/Riven series.

The remastered (remake) version of Amerzone is out now.

Edited by taffernicus

2027/28

 

(Possible prepping)

subpx.png.f5f46728702be3eaa0cd8d5101c2c14a.png

Posted (edited)

Now I want to play Syberia: The World Before.

Unrelated: I think with the proliferation of next-generation game engines and the latest graphics effects and features, cards like the RTX 3060/4050 or RX 6600/7600 are like the bottom of the barrel.

Edited by taffernicus


Posted
2 hours ago, taffernicus said:

Unrelated: I think with the proliferation of next-generation game engines and the latest graphics effects and features, cards like the RTX 3060/4050 or RX 6600/7600 are like the bottom of the barrel.

Certainly there are a lot of complaints about Unreal Engine 5, which the Oblivion remaster is using.

8 GB VRAM is starting to become too little in newer titles, even at 1080p, because of the textures used and features shaving away some of it. 12 GB will become the new minimum, with 16 GB preferred, despite 8 GB cards continuing to be sold.
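As a rough illustration of why textures dominate VRAM budgets (my own back-of-envelope sketch; the ~4/3 mip-chain overhead and BC7's 1 byte/texel rate are standard figures, not numbers from this thread):

```python
# Rough texture-memory estimate. A block-compressed (BC7, 1 byte/texel)
# texture with a full mip chain costs about 4/3 of its base size.

def texture_mib(width, height, bytes_per_texel=1.0, mips=True):
    """Approximate VRAM cost of one texture, in MiB."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base
    return total / 2**20

# One 4096x4096 BC7 texture is ~21 MiB; a game streaming hundreds of
# them at full resolution eats gigabytes, which is why the texture
# quality slider moves VRAM usage so dramatically.
print(round(texture_mib(4096, 4096), 1))
```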

Are you including mobile? RTX 4050 isn't a desktop GPU.

  • Like 2
Posted (edited)
38 minutes ago, jaxa said:

Certainly there are a lot of complaints about Unreal Engine 5, which the Oblivion remaster is using.

I think the problem here stems from a different source. I don't know if it's correct, but I read that the Oblivion remaster is basically a raytraced DX12 wrapper running the original source code in Unreal Engine, with remade assets, lighting, and the graphical possibilities UE5 offers. I actually asked myself how they were able to port the gameplay 1:1 to UE5, and I guess that's how they did it, if that's correct.

I'm pretty sure you could code much more efficiently, performance-wise, natively in Unreal Engine. But I guess they did a pretty good job in terms of optimization anyway, considering everything that is going on "behind the curtain".

Anyway, the demands modern graphics tech places on your PC hardware are absolutely insane when you consider how little value shiny visuals add. As I don't expect this insanity to end anytime soon, I can only be happy that 99.9% of games released these days don't interest me at all anyway.

Edited by chakkman
  • Like 2
Posted

Could be? The Witcher 3 remaster also used a DX12 wrapper to enable raytracing, with varying success.

The wrapper seems to perform somewhat badly, though, compared to games written with raytracing in mind.

  • Like 1
Posted

Well, 12 GB can be used, but you have to lower settings considerably, otherwise it starts overshooting the VRAM in some of the recent UE5 titles. For example, in Indiana Jones and the Great Circle I have to lower texture resolution to its absolute minimum, otherwise the game slows to a crawl. Afterburner's VRAM plugin shows it using close to 16 GB on the medium setting, while the lowest setting uses about 7 GB VRAM. I suspect the highest setting would use something close to the 24 GB the top cards flaunt.

  • Like 1
Posted

20 GB VRAM confirmed by a 4090 user on max, so yeah, it eats up VRAM like it's Christmas 😂.

While the requirements aren't as steep as posted for Indiana Jones, it does require lots of VRAM, as one can see for oneself when the 16 GB 3060 model is faster than the 8 GB 4060 by quite a margin.

My 2080 Ti can handle medium, but I have to lower some settings to keep VRAM usage in check.

Even so, it is damn close to pulling the plug on the old 11 GB card. A shared memory model might help the low-VRAM card models a bit, but since motherboard memory is usually quite a bit slower than VRAM, I would expect it to be quite choppy.
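To put numbers on "motherboard memory is quite a bit slower than VRAM", here is a quick comparison sketch using approximate public bandwidth figures (my choices for illustration, not numbers from the post):

```python
# Approximate peak bandwidths, in GB/s. Spilling over the PCIe link to
# system RAM is roughly an order of magnitude slower than local VRAM,
# which is why overflowing the VRAM budget makes games choppy.
bandwidth_gbs = {
    "RTX 2080 Ti GDDR6": 616,        # 14 Gbps x 352-bit bus
    "Dual-channel DDR4-3200": 51,    # system RAM itself
    "PCIe 3.0 x16 link": 16,         # the path the spill must cross
}

vram = bandwidth_gbs["RTX 2080 Ti GDDR6"]
pcie = bandwidth_gbs["PCIe 3.0 x16 link"]
print(f"VRAM is ~{vram // pcie}x faster than the PCIe link")
```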

Posted

BTW, the latest NVIDIA driver allows offloading at least CUDA VRAM to system memory, so there is that...

Performance-wise it seems to be workable, at least in the benchmarks available for the Llama model; exLlama shows some improvements but is still in the testing phase.

You can find the setting in the NVIDIA Control Panel under 3D Settings -> CUDA - Sysmem Fallback Policy.

Posted
On 4/27/2025 at 6:57 PM, jaxa said:

Certainly there are a lot of complaints about Unreal Engine 5, which the Oblivion remaster is using.

8 GB VRAM is starting to become too little in newer titles, even at 1080p, because of the textures used and features shaving away some of it. 12 GB will become the new minimum, with 16 GB preferred, despite 8 GB cards continuing to be sold.

Are you including mobile? RTX 4050 isn't a desktop GPU.

My bad, I meant to type RTX 3050 desktop version.

I made an unwise purchase decision: I bought an RX 6400 + RX 6600 non-XT (the 6400 is kept as a backup in case the RX 6600 one day goes awry with artifacts, no display, etc.). I could have gotten an RX 7800 or RTX 4070 for the price of the RX 6400 + RX 6600 combined (with a little additional funding, of course).

With these insane game system requirements, I will be holding a pity party for 8 GB GFX cards in the next 3-4 years.

 

 

  • Sad 1


Posted

There is a guy who does VRAM modding, mostly on the RTX 3070 and RTX 2080 Ti, where he doubles the VRAM, so the 3070 gets 16 GB and the 2080 Ti gets 22 GB. It won't make the cards faster in games they cannot handle, but in some titles like Indiana Jones it at least allows pushing the detail level up some (the game runs fine on both, but they lack VRAM for the higher detail settings). He does this by swapping the 1 GB memory modules for 2 GB ones and adjusting the straps so the card can actually use them.

Things like Alan Wake 2, though, are best left for people with at least an RTX 4090 if you want to play at 4K with settings on max 😂. You can get pretty far with a 2080 Ti with some adjustments, and even the 3070 can be used, but that card is simply gimped VRAM-wise. 4K runs OK, but I have to set DLSS to Performance to keep fps at 60, or drop the scaling to the worst image size, which looks like crap 🤢
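For reference, DLSS presets render internally at a fraction of the output resolution and upscale; a small sketch using the commonly published per-axis scale factors (actual games may override these):

```python
# Commonly published DLSS per-axis render-scale factors. These are the
# standard figures, not values confirmed in this thread; games can
# customize them.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height, mode):
    """Internal render resolution before DLSS upscales to the output."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# At 4K, Performance mode renders internally at 1080p and upscales,
# which is why it rescues fps on compute- and VRAM-limited cards.
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
```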

 

Posted

ATM AMD cards seem the best solution VRAM-wise. The 9070 XT is really good but nowhere near MSRP (it performs better than an RTX 3090 Ti). With some luck you can get a 7900 XT for a good price (a good deal faster than the 9070 XT). The older 6800 XT is also a good card if raster performance is all you need, as raytracing performance is kinda abysmal with the pre-7xxx series. Some lower-cost AMD models are also worth considering, like the 7600 XT.


