The Dark Mod Forums

2016+ CPU/GPU News


jaxa


10 hours ago, jaxa said:

I'm getting a refurb i5-6600T. I wonder how well TDM can run on trash Intel HD Graphics 530.

It would be funny to give it 64 GB of memory for $120.

I'm guessing it will work just fine. The HD Graphics 530 is OpenGL 4.6 compliant, with performance roughly equivalent to a GeForce GT 730.
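
For anyone curious whether their driver actually exposes GL 4.6, a quick probe is enough. Here is a minimal sketch (assuming GLFW is installed; this is not TDM code) that just creates a hidden context and prints what the driver reports:

```c
/* Minimal GL capability probe -- a sketch assuming GLFW is available,
 * not part of TDM. Creates a hidden window just to get a context and
 * prints whatever the driver reports. */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);          /* no window on screen */
    GLFWwindow *win = glfwCreateWindow(64, 64, "gl-probe", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER)); /* e.g. "Intel(R) HD Graphics 530" */
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));  /* should report 4.6 on recent drivers */

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```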

  • Thanks 1

Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


I don't see a problem with it either :) It should be fine; as nbohr1 said, it does support OpenGL 4.6.

Intel's built-in graphics can be a bit finicky at times, though I've only had problems with a few select games using them.

For my own part, I ended up using the older PC components for gaming, while my Threadripper now chugs along building MSYS2 packages and other general development projects; its graphics card is a GTX 970.

The 1080 Ti I paired with the ASRock Z97 Extreme6, which runs really well when not overclocking. My old ASUS X79 Deluxe is used as a second gaming computer for when my friends come to visit, and as a media PC when I'm alone. I paired that one with my old MSI R9 290X graphics card.


P.S. Sorry about the rant; it did not, in fact, have any place in this thread. I was just so disappointed, after getting two new PC cases, that each of them used substandard materials despite being pretty expensive. One even shredded its screw threads when I pulled my graphics card, despite the screws only having been finger-tight 🤮. The obsession with glass windows and light shows in these cases bears a good deal of the responsibility, I reckon.

  • Like 1

  • 2 months later...

 

Only having 8 GB on the standard RTX 4060, halfway through 2023, is dog-shit. But hey, you can always pay twice as much to get the card the RTX 4060 should have been (with 16 GB). Heck, the base RTX 4060 even has less memory than the RTX 3060 from three years ago! I am definitely going Intel next time, because they are rapidly catching up to NVidia in Blender, and they have done more for the Blender project in the past two years than AMD ever has (OIDN, path guiding). You get OIDN and path guiding for free, even if you don't have any Intel hardware at all!

 

8GB is no longer enough for gamers, and is straight-up laughable for anyone who wants to do anything more with their GPU than play games. NVidia's monopoly is eroding and they're still acting like they can shit on customers from the roof.


It's becoming apparent that I miscalculated slightly when I bought a new PC at Christmas 2021.  I only got a 6-core CPU, 16 GB RAM and an 8 GB graphics card.  I looked at that game Chakkman posted the other day - the minimum requirements are an 8-core CPU, 12 GB card and 16 GB RAM (but 32 GB recommended).  Oops. Oh well, good thing I pretty much only play TDM these days...


3 minutes ago, Frost_Salamander said:

It's becoming apparent that I miscalculated slightly when I bought a new PC at Christmas 2021.  I only got a 6-core CPU, 16 GB RAM and an 8 GB graphics card.  I looked at that game Chakkman posted the other day - the minimum requirements are an 8-core CPU, 12 GB card and 16 GB RAM (but 32 GB recommended).  Oops. Oh well, good thing I pretty much only play TDM these days...

>Processor: AMD Ryzen 7 2700X Eight-Core Processor (16 CPUs), ~3.7GHz; Intel i5 7000 series

Intel i5 7000 series = 4-core, 4-thread Kaby Lake CPUs. So no, your 5600X or whatever will be just fine.

These companies can write whatever they want as minimum/recommended specs and it doesn't have to make any sense. But you've also misinterpreted it.

  • Like 1

4 minutes ago, jaxa said:

>Processor: AMD Ryzen 7 2700X Eight-Core Processor (16 CPUs), ~3.7GHz; Intel i5 7000 series

Intel i5 7000 series = 4-core, 4-thread Kaby Lake CPUs. So no, your 5600X or whatever will be just fine.

These companies can write whatever they want as minimum/recommended specs and it doesn't have to make any sense. But you've also misinterpreted it.

But is the graphics VRAM not anything to worry about?  To be honest, I'm not really that concerned or anything, but was hoping this PC wouldn't be obsolete in 2 years in case I got the itch to play some new AAA thing at some point.


Games are likely to be runnable on your 8 GB GPU, but you might need to turn down settings, or game devs might silently swap in lower-quality textures to keep performance up on 8 GB cards. Ultra settings tend to be overkill anyway.

You will be gaming at 1080p resolution more often if you aren't now. What are "recommended" settings for, 4K resolution? That's why I like to see a detailed table with a few columns rather than old-style "minimum" and "recommended".

The people who should be most mad about 8 GB VRAM are those who bought an expensive RTX 3070 or 3070 Ti, thinking they would be doing ray-tracing, 1440p/4K, etc. for many years. But if you are just doing 1080p you will probably be fine.
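
(For what it's worth, an engine can already ask the driver how much VRAM is free and downgrade textures itself. The sketch below uses the vendor extensions GL_NVX_gpu_memory_info and GL_ATI_meminfo, assumes a current OpenGL context created with GLFW, and the quality thresholds are purely illustrative, not from any real title.)

```c
/* Hypothetical texture-quality picker driven by free VRAM -- a sketch only.
 * Assumes a current OpenGL context (e.g. created with GLFW). The enums come
 * from GL_NVX_gpu_memory_info and GL_ATI_meminfo; both report sizes in KiB. */
#include <GLFW/glfw3.h>

#define GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#define TEXTURE_FREE_MEMORY_ATI                      0x87FC

static int free_vram_mib(void)
{
    GLint kib[4] = {0};
    if (glfwExtensionSupported("GL_NVX_gpu_memory_info"))
        glGetIntegerv(GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, kib);
    else if (glfwExtensionSupported("GL_ATI_meminfo"))
        glGetIntegerv(TEXTURE_FREE_MEMORY_ATI, kib);   /* kib[0] = free pool */
    else
        return -1;                                     /* no query available */
    return kib[0] / 1024;
}

/* Illustrative thresholds only -- real engines tune these per title. */
const char *pick_texture_quality(void)
{
    int mib = free_vram_mib();
    if (mib < 0)     return "medium";   /* can't tell, play it safe */
    if (mib > 10240) return "ultra";
    if (mib > 6144)  return "high";
    return "medium";
}
```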

Edited by jaxa
  • Like 1

Indeed... my old R9 290X still kicks ass in pretty much any game at 1080p with its 4 GB of RAM; where it starts to cough is at greater resolutions, as VRAM usage skyrockets in 4K. My 1080 Ti runs most titles quite fine in 4K thanks to its 11 GB of VRAM, but coughs a bit on titles requiring DLSS. Most games at least support both DLSS and FidelityFX, so I can get away with using FidelityFX and still be able to play, though not at 60 fps (it dips to about 45 fps with FidelityFX and ray tracing in Callisto Protocol at 4K). Though most competitive gamers don't want to go near a card that can't do 60 fps minimum, I honestly cannot feel the difference, as it runs quite smoothly even at 45 fps; but I'm not the youngest anymore, so that might be why 🤣. At 1080p the 1080 Ti is still a beast and delivers a bit over double the performance of the R9 290X while staying a lot cooler, so I could actually OC it for a bit more performance. The 3060 Ti is about the same performance as the 1080 Ti but has less RAM, so it won't actually do so well in 4K, though you do get DLSS as a band-aid.


Also, 4K looks great when set up correctly, but the setup is the most annoying part of playing in it: you need to set HDR brightness/contrast correctly or everything looks washed out. Win10 does have an HDR setting for running your desktop in 4K, but it tends to disable itself if a game you play does not support HDR to begin with, making things look weird at times. You also need to enable desktop scaling, as otherwise the text gets too small to read and the icons are almost impossible to hit with the mouse. I use a 42" smart TV and have scaling set at 175%; Win10 actually sets it to 200%, but that's a bit big for my taste 😅. Win11 has auto HDR, so it at least re-enables desktop HDR when you exit a game.

Also, if you play on a TV like I do, 4K gets quite wonky for movies if you try to use full RGB mode (blocky/noisy video playback). This is because most movies you can stream are not actually 4K even when advertised as such and get scaled up. In fact, only one app in Win10/11 can play real 4K content, and that is the Netflix player, and only if you have the premium package. This is because Microsoft removed the codec needed for 4K playback after the film industry complained about people who (might) be copying their 4K content, so it was pulled before Win10 hit release.

The old Edge could also play 4K content, but this has been removed in the new Edge, and no other browser can play 4K content now.

You might get lucky and find the codec somewhere on the net, but take exceedingly good care, because you might get more than you bargained for (malware/viruses and other nastiness).


I guess one positive aspect of this is that if they keep 8 GB as the mainstream standard, then games will have no choice but to support 8 GB cards for a very long time, and so we can get a lot of life out of e.g. an RTX 2070 or GTX 1080.


https://wccftech.com/amd-radeon-rx-7600-8-gb-graphics-card-specs-leak-6nm-navi-33-xl-gpu-2048-cores-8-gb-vram/

So yeah, NVidia and AMD have both decided to completely stall progress in the mainstream GPU market, similar to the state the CPU market was in a decade ago when Bulldozer came out. I guess I won't need to get a new graphics card for a long time. Finally I remind you that memory is cheaper right now than it has probably been in a decade.

 

Hopefully the gaming community is astute enough to collectively say "no" to these products.

Edited by kano

Considering the beastly performance of today's cards, it had to drop at some point. Not to mention getting a case where that 4090 Ti will fit 🤣 I'd also need a nuclear power plant to drive it, hehe.

It is quite funny: AMD was always known in the past for power-hungry toasters when it came to graphics cards, but no AMD card today holds a candle to the 4090 in power draw, not even the R9 295X2, which had a rather (blow your fuses) attitude :) The 4090 can go up to 800 watts of power draw!!! That is insane and just shows how powerful a card has to be to drive an 8K monitor. But here is something rather scary: the AMD RX 7900 XT can match it if you remove the restrictions on power draw (a guy tried that out recently) with a hacked driver. The card needed a triple-fan liquid cooling system, but that was all, and it stayed at 45°C while trading blows with the 4090 Ti 😮 holy sh..


On 5/19/2023 at 10:36 PM, Frost_Salamander said:

It's becoming apparent that I miscalculated slightly when I bought a new PC at Christmas 2021.  I only got a 6-core CPU, 16 GB RAM and an 8 GB graphics card.  I looked at that game Chakkman posted the other day - the minimum requirements are an 8-core CPU, 12 GB card and 16 GB RAM (but 32 GB recommended).  Oops. Oh well, good thing I pretty much only play TDM these days...

As I can run it on my 6-core i5-11600K, 16 gigs of RAM, and an RTX 3060 Ti with 8 GB of VRAM, and get 100-plus frames per second most of the time (actually the full 144 frames of my monitor's refresh rate most of the time), I'm sure you can run it well. :)

If you ask me, those requirements are probably for the VR mode, which obviously needs a lot more beef.

The game is nice, by the way. I finished it two days ago, and, it was a lot of fun.

Edited by chakkman
  • Like 2

4 hours ago, chakkman said:

The game is nice, by the way. I finished it two days ago, and, it was a lot of fun

Bit worried about Firmament because I'm a big Myst fan, but the Steam reviews are currently "mixed" due to bugs and jank, and Cyan don't seem to have found their groove for a while now. Even their last game, Obduction, showed promise but didn't hit the mark compared to the older Myst games. I dunno - then again, there aren't many devs making these types of games anymore, so maybe I should be more forgiving.

On the other hand, the state of the modern gaming industry makes it really, really easy to hold onto older hardware and not worry about upgrading until it's absolutely necessary. I play a ton of older games anyway so whatever, if I upgrade at some point the game will hopefully be patched and improved by then anyway. :)

 

  • Thanks 1

A word of warning, Agent Denton. This was a simulated experience; real LAMs will not be so forgiving.


11 hours ago, kano said:

https://wccftech.com/amd-radeon-rx-7600-8-gb-graphics-card-specs-leak-6nm-navi-33-xl-gpu-2048-cores-8-gb-vram/

So yeah, NVidia and AMD have both decided to completely stall progress in the mainstream GPU market, similar to the state the CPU market was in a decade ago when Bulldozer came out. I guess I won't need to get a new graphics card for a long time. Finally I remind you that memory is cheaper right now than it has probably been in a decade.

 

Hopefully the gaming community is astute enough to collectively say "no" to these products.

"There are no bad products, only bad prices." Which are not known for RX 7600 (XT) yet. Though it doesn't sound like it will be a fantastic MSRP going by the rumors.

GDDR6 costs have fallen from pandemic/shortage highs, but they are different from DRAM and other types. We're all still waiting for cheap HBM, and will be waiting forever potentially.

8 GB of VRAM is going to be enough for many 1080p gamers. Remember that 4-6 GB cards (AMD 6500 XT and Arc A380) are still in the market. Which reminds me, leaks have pointed towards a possible 7500 XT based on the same Navi 33 die instead of a Navi 34, so maybe 8 GB can reach an even lower price point.

It's possible that both AMD and Nvidia have plans to leave the low-end GPU market high and dry forever, but for different reasons. AMD is making powerful APUs that will be enough for 1080p gaming if/when they come to the AM5 desktop socket. If Rembrandt and Phoenix aren't enough performance for consistent 1080p60, Strix Point should be. Many people will be able to forgo buying any kind of discrete graphics card in the near future.

Nvidia on the other hand is more of an AI company now. Gamers don't bring in the big bucks anymore, although they can be milked effectively with flagships like the RTX 4090 and "mid-tier" cards busting past $700. Based on the waste products of the professional market, of course.


Quite a bit more life in the old hardware than most would imagine. New features are mostly good, but not always, as we've seen over the years (for example, Intel started using TIM inside their CPUs instead of soldering the chip to the IHS; the result was far worse thermals, and subsequently the chips could not reach higher clocks). They finally got the hint, but it took about 10 years, and now all the higher-end Intel CPUs are soldered again. Another example, from UEFI: while mostly a nice change, it turned out Corsair's iCUE would break some UEFI BIOSes (I had that unfortunate experience with a PC I built for a friend).

My old Intel 3930K is still happily chugging along with an OC of 5.2 GHz on air, never hitting 80°C; that's an OC of 2 GHz, as it runs 3.2 GHz when not overclocked 😅. By comparison, my Devil's Canyon 4790K, which is a good deal newer, can barely do an OC of 200 MHz before it hits 95°C with a liquid cooling system... The Devil's Canyon runs 4 GHz standard, so a bit faster than the old Sandy Bridge, but even at stock the Devil's Canyon only beats it in single-threaded applications, despite the higher clock, and not by a lot.


10 hours ago, Xolvix said:

Bit worried about Firmament because I'm a big Myst fan, but the Steam reviews are currently "mixed" due to bugs and jank, and Cyan don't seem to have found their groove for a while now. Even their last game, Obduction, showed promise but didn't hit the mark compared to the older Myst games. I dunno - then again, there aren't many devs making these types of games anymore, so maybe I should be more forgiving.

Essentially, like many other developers, they're making the same game again and again. Firmament has a new device though, which makes it a bit like the Portal games. It also makes the game a bit easier than their former games, as most puzzles are solved via the device.

There are some bugs, as usual, but they fixed the majority of them in the first update. You occasionally get stuck in the game world when you move some platforms or cranes while standing in a spot that was in between the static and movable areas, but you can reset the playable character's position to a safe spot, which also resets the thing you moved in the game world (I think Myst and Obduction had that as well). All in all, I was able to play it through without anything game-breaking, so it's definitely playable. You know how it is, people tend to exaggerate. ;)

I have another good example of that: I'm playing The Callisto Protocol, and people literally trashed that game on release. Yes, there were some performance issues, but nothing which hasn't been fixed by now, and the game is actually great. I don't think the younger generations especially can appreciate what they are being offered these days. You have to search for a long time to find a game which is received generally positively these days. Mostly, the trashing has to do with graphics or the game's performance. Shame.

Edited by chakkman

I just don't get it.

While having more VRAM on the GPU is an easy solution to improve performance and resource issues, GPU architects have known for ages that the best solution is using some sort of hierarchy where you page out to system RAM once you are low on GPU RAM.

AGP Aperture was designed around this idea.

When PCIe came out, it was meant to have a flexible memory aperture natively, but somehow IHVs never optimized for that?

Then Nvidia created PCIe "TurboCache" and AMD created HyperMemory. Issue solved? Apparently not?

id Software pioneered the use of virtual textures in id Tech 5, and that became a standard graphics feature called "Partially Resident Textures". How in the world is a graphics shader hack more effective than hardware solutions like TurboCache? Anyway, it works, but it seems to be cumbersome to implement, so it hasn't caught on, I guess.
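
To make the PRT idea concrete, here is a rough sketch of the OpenGL ARB_sparse_texture path (not id Tech's or TDM's actual code; it assumes a GL 4.x context with the extension resolved by a loader such as GLEW or glad, and that tile_pixels holds one tile of RGBA8 data):

```c
/* Sketch of a partially resident ("sparse") texture via ARB_sparse_texture.
 * Only committed pages occupy VRAM; the rest of the texture is just reserved
 * address space.  Assumes GLEW (or glad) has resolved the extension entry
 * points and a GL context is current. */
#include <GL/glew.h>

void sparse_texture_demo(const void *tile_pixels)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Must be flagged sparse *before* storage is allocated. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);

    /* Reserve a 16K x 16K virtual texture -- no physical memory yet. */
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, 16384, 16384);

    /* Ask the driver for its page (tile) size for this format. */
    GLint pageX = 0, pageY = 0;
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8, GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageX);
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8, GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageY);

    /* Commit real memory only for the one tile the camera needs, then fill it. */
    glTexPageCommitmentARB(GL_TEXTURE_2D, 0, 0, 0, 0, pageX, pageY, 1, GL_TRUE);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, pageX, pageY,
                    GL_RGBA, GL_UNSIGNED_BYTE, tile_pixels);

    /* When the tile is no longer visible, hand the memory back. */
    glTexPageCommitmentARB(GL_TEXTURE_2D, 0, 0, 0, 0, pageX, pageY, 1, GL_FALSE);
}
```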

Then the AMD Vega architecture launched and AMD began touting how great its cache hierarchy system worked? As if the same idea was not already in play in all of the previous examples above?

Also, when Vega launched, they made a branded version of "Resizable BAR", which conceptually is the same idea as PCIe's native flexible aperture. But apparently, even though we made a flexible specification, somewhere in the bowels of the graphics hardware stack we still had hard-coded limits that needed to be fixed?
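
(As an aside, software can actually see whether Resizable BAR is in effect: in Vulkan it shows up as a device-local heap that is also host-visible and much larger than the classic 256 MiB window. A minimal sketch, assuming a VkPhysicalDevice is already in hand:)

```c
/* Rough ReBAR detector -- a sketch, not a robust check.  Looks for a
 * device-local memory type that is also host-visible and whose heap is
 * larger than the classic 256 MiB BAR window. */
#include <vulkan/vulkan.h>
#include <stdbool.h>

bool looks_like_rebar(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryProperties mp;
    vkGetPhysicalDeviceMemoryProperties(gpu, &mp);

    for (uint32_t i = 0; i < mp.memoryTypeCount; ++i) {
        VkMemoryPropertyFlags f = mp.memoryTypes[i].propertyFlags;
        bool device_local = (f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) != 0;
        bool host_visible = (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT) != 0;
        if (device_local && host_visible) {
            VkDeviceSize heap = mp.memoryHeaps[mp.memoryTypes[i].heapIndex].size;
            if (heap > 256ull * 1024 * 1024)   /* bigger than the legacy aperture */
                return true;
        }
    }
    return false;
}
```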

Then came the PS5 console and DirectStorage, which also has a graphics memory hierarchy design.

We have all these tools to use system RAM and even SSD/NVMe storage as video memory, yet we are still designing game engines to front-load all data into VRAM and acting like it's "impossible" to have good performance when dedicated video RAM is below X amount.

This should be a solved problem by now.



Game developers want 16-32 GB VRAM to work with now. They can't always count on these other features, especially on PCs. Having a large amount of VRAM is the easiest solution for reducing latency.

Let me counter your DirectStorage idea with another idea: put an SSD inside consumer GPUs, i.e. the "SSG" concept that AMD launched to the professional market a few years ago. Putting a tiny 512 GB SSD inside graphics cards would be relatively cheap, hold the entirety of any video game, and the GPU could still work if it breaks. Just cache the entire game you're playing into the GPU (or everything the GPU wants).

Everyone will ultimately want a slab of APU that contains 3D memory-storage, with the CPU, GPU, RAM, accelerators, etc. all together to limit latency.

https://wccftech.com/amd-radeon-rx-7600-269-usd-launch-navi-33-8-gb-tackles-nvidia-rtx-4060-1080p/

$270 is on the low end of RX 7600 price estimates (some people wanted $250 obviously). That's not bad for something that is presumably faster than an RX 6650 XT. There will be some complaints about price/performance not moving forward, but prices can drop below MSRP at some point. At this moment though, something like this 6750 XT for $320 soundly defeats the RX 7600, while supplies last. The RX 6700 10 GB is also a good option.

If AMD later tosses a RX 7600 XT 16 GB onto the market for $350, then they won the "low-end GPU market" this generation.

Edit: AMD did a last-minute price drop on the RX 7600, from $300 to $270. That 10% is enough to change it from kinda bad to kinda OK.

 

Edited by jaxa

AMD isn't sandbagging the market and trolling customers as hard as NVidia, but if they were smart, they would have just kept the higher price of $300 and included 16 GB of memory. That would have been an easy and epic KO, perhaps along with an advertising campaign about being for gamers who don't want smeared, low-resolution textures, especially once all game development for the PS4/Xbox One stops next year and developers have the freedom to crank quality up some more.


AMD always knows how to fumble an opportunity. But expecting them to launch a 16 GB version at $300 is just unrealistic.

That would probably cannibalize sales of the conspicuously absent 7700/7800 "mid-range" cards at that price. And if it does exist, maybe it's not ready, much like the 4060 Ti 16 GB is launching months later. By waiting, they can sell off more RDNA2 cards, so that those are out of the mix by the time they launch the rest of the RDNA3 lineup.

As we usually see, prices will migrate to below MSRP after a while. So the $270 RX 7600 could become $220 or whatever.

Edited by jaxa

The pricing is a little better but still feels much like what you would see in an illegal duopoly.

There are almost no cases where two cards have the exact same performance but one is priced lower.

They all seem to have their own tidy slots, with more expensive, higher-performing options above them and less expensive but less performant options below them.

Both vendors have scaled back on volumes to clear out legacy stock in the retail channel.

Both vendors have done as much as possible to ensure not to drop prices of legacy stock too much.

Both vendors have done "just enough" value and performance to fend off Intel ARC offerings in the same price range.




