The Dark Mod Forums

2016+ CPU/GPU News


jaxa

$200 for a 4GB graphics card in 2022, and it doesn't even support CUDA or OptiX, lol...

This makes it mostly a non-starter for anything that isn't playing video games, and even there the textures will be ultra-smeared because of the need to reduce texture resolution.

https://www.pcgamer.com/amd-radeon-rx-6500-xt-review-benchmarks/

In the immortal words of the Angry Video Game Nerd, "I've had more fun playing with dog turds".

If society hadn't let competitors in the graphics card market, like 3DLabs, fail twenty years ago, maybe AMD wouldn't be trolling graphics card enthusiasts so hard now with this garbage, and capable NVIDIA products, which offer reasonable amounts of memory for 2022 and support established industry standards like CUDA/OptiX with a mature driver stack, wouldn't be as untouchable as Al Capone.

On the bright side, I guess my ancient 4GB graphics card won't become literally unusable in games for quite some time to come, because the bar has been lowered.


1 hour ago, kano said:

$200 for a 4GB graphics card in 2022, and it doesn't even support CUDA or Optix, lol...

While it doesn't sound great relative to its current competitors, if you look at the overall scene you will see that the card is roughly equivalent to a GTX 980, GTX 1060, R9 390, or GTX 1650 GDDR6, while sitting at the bottom of AMD's product stack (low end). So if you were stuck on a previous-gen low-end card like a GT 1030 or GTX 960, this looks like a nice bump, the out-of-whack scarcity pricing aside. It is still miles above AMD APU graphics or Intel integrated graphics. I guess the takeaway is to just ignore it until the GPU market crashes and the pricing corrects to reflect its place in the product stack. I personally am not a fan of the 64-bit bus and the low number of PCIe lanes, but I understand exactly what happened there:

"Hey TMSC, I think we're gonna win a shit-load of Laptops with AMD products. Please manufacture a bunch of laptop 6xxx series chips!"

"AMD, very cool! We are in high demand, please reserve as much as you can beforehand. What about midrange and lowend GPU's?"

"We still have lots of old stock in the channel on those. We will make a bunch of highend GPU's and salvage midrange from them at first then start our orders for lowend later"

"Cool cool"

"TMSC all the miners are buying every GPU we ship, can we get more wafers?"

"No, Nvidia and Apple have already reserved all the rest of our capacity plus we are having problems getting supplies."

"Shit! And you can't squeeze any lowend order in either?"

"Nope. Sucks to be you!"

"Fuck it, we will overclock those laptop GPU's you made us and sell them as lowend."

"Cool idea."

"Heh! Since these have less PCIE lanes we can also use the bandwidth issue to encourage folks to upgrade to new PCIE 4 motherboards! Hopefully the tech news won't burn our asses in reviews about how bad these work on PCIE 3. Oh well"

 

[chart: relative GPU performance rankings, AMD vs. Nvidia]

 

Look at that chart closely. You can sort of see that AMD performance slots between Nvidia performance in a way that almost looks like both are avoiding direct segment competition. (Duopoly collusion?) Watch for that pattern to change once Intel arrives in the GPU market...




At least the CPU market is relatively healthy. I just wish they had invested more effort in speeding up the Blender Cycles X rewrite for the CPU instead of focusing on GPUs. As a hobbyist, right now it makes more sense to get something like a 3950X/5950X and 64 or 128 GB of memory instead of paying as much for a graphics card that has a literal fraction of the memory.

 

In the CPU market AMD and Intel have to keep moving, because Apple is coming after them with their custom M1/M2 architecture.


Just got me an older R9 290X, which is kinda sucky compared to current cards, though I will say the card is great at scaling performance. The only downside is that it is a 4 GB card, which simply does not cut it with current texture sizes, so I'm going for the CrossFire option, which will also let me push into 4K gaming, though my electric bill will probably explode as a result xD. At least I can heat my flat with this setup.
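For the electric-bill worry, a rough running-cost estimate is easy to sketch. The numbers below are illustrative assumptions, not measurements: roughly 290 W per 290X under load, ~150 W for the rest of the system, 4 hours of gaming a day, and a 0.40 EUR/kWh electricity price.

```python
# Back-of-the-envelope monthly electricity cost for a CrossFire rig.
# All inputs are assumptions for illustration, not measured values.

def monthly_cost_eur(gpu_watts_each: float, num_gpus: int,
                     rest_of_system_watts: float,
                     hours_per_day: float, price_per_kwh: float) -> float:
    """Estimate monthly cost (EUR) of running the rig at full load."""
    total_watts = gpu_watts_each * num_gpus + rest_of_system_watts
    kwh_per_month = total_watts / 1000 * hours_per_day * 30
    return kwh_per_month * price_per_kwh

single = monthly_cost_eur(290, 1, 150, 4, 0.40)
crossfire = monthly_cost_eur(290, 2, 150, 4, 0.40)
print(f"single card: ~{single:.2f} EUR/month")
print(f"crossfire:   ~{crossfire:.2f} EUR/month")
```

Not quite "explode", under these assumptions, but the second card does add noticeably to the bill (and to the flat's heating).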


Well, not as sucky as I feared, it seems 🤣. I'm running Halo Infinite on high with a consistent 60 fps using this card. But I guess 4K will be out of reach even when I get the second one (not enough RAM). 1920x1200 is certainly running very well, though.

I hear some chatter about this game being a CPU hog, though. Not sure why my ass-old Core i7 3930K just bulldozes games like that, but it does 😮, not that I'm complaining. My previous GTX 970 also runs it OK with my current setup.


On 2/24/2022 at 10:31 AM, kano said:

If society hadn't allowed competition in the graphics card market like 3DLabs to fail twenty years ago...

They will have competition soon: Intel.

The 6500 XT should be understood for what it is: a low-end desktop card based on a die intended for laptops. If the street price is right (the MSRP is fake anyway), it might be worth it for some people. The 4 GB makes it cheaper to produce (since VRAM prices are up) and less attractive for cryptomining.

Nvidia recently announced the RTX 2050 laptop GPU, another one with 4 GB. And the GeForce MX550 and MX570 will probably lower the bar to 2 GB.


Been some years since I tried the Steam Deck :) maybe it is time again.

On a side note, 4K PC monitors have dropped in price here in DK, so I can actually afford one now 😮, so I'm probably going for that next; it might give me some insight into where my old setup falls short.

Over the years I learned a bit about why the old X79 chipset was so popular even into our times -> 40+ PCIe lanes, and the old Sandy Bridge CPUs overclocked like little monsters (it was not unheard of for a 3930K to reach 5.4 GHz on less than sci-fi-themed cooling solutions like liquid nitrogen; mostly you could get away with such an overclock using a really good water-based solution).

Of course there were shortcomings as well. For one, X79 had no native USB 3.0 support, so it relied on questionable third-party chipsets for that. It also only supported 2 native SATA 6G ports, which still came nowhere near the throughput current chipsets manage, so it booted pretty slowly even using SSDs (you could get around that by using an NVMe drive, but those were not natively supported by the BIOS until the X79 Deluxe, and they were also a pain to set up as boot drives). And PCIe 3 was not officially supported with the Sandy Bridge CPUs, though quite a lot of the Sandy Bridge CPUs I tested did support it; you still needed a hack for Nvidia cards to enable it. With the Ivy Bridge processors it was fully supported, but Ivy Bridge overclocked worse and only had half the PCIe lanes, making 4-way SLI impossible. Besides that, the only really new stuff was the inclusion of the FMA3 instructions, which did not really rock many boats.


  • 2 weeks later...

Well, though Halo Infinite seems to run just fine as long as I don't push the more demanding options, I've got a beast of a problem running Horizon Zero Dawn, which is driving me nuts 🤬. Basically, it looks like crap on the R9 290X despite the card using the same chip as the later 390X model, which btw runs this game no problem and looks gorgeous. Lowering the graphics options does not help either; everything looks washed out and soapy, walls have the look of melted wax when seen from a distance, and closer up it just flickers between high-def and crap. I suspect the 4 GB of VRAM this card comes with is not up to the task, despite the game claiming that it only uses 4 GB max at original settings. The pop-in is so bad that I feel lucky not to be afflicted by light-induced seizures, uuuugh. Mass Effect Andromeda runs like a rocket in comparison, and with visuals that don't really stand apart from those in Horizon at 1080p. Unfortunately, a used 8 GB 390X costs almost double the 4 GB 290X, meaning you pay what this card used to sell for as new for something that might not last more than a short time, since you don't know what it was used for before you got it.


I don't have the game, but have you tried the tip below? Some people online say it solves their Horizon Zero Dawn texture problem.

Quote

If you are having soft blurry textures on AMD GPUs, just disable "Surface Format Optimization" , and switch "Texture Filtering Quality" to performance.

 


Yep, tried all the tricks in the book until I noticed that the game did not have a profile in the AMD driver 🤣. After correcting that it ran somewhat better, though the texture pop-in is still a problem. This is, however, linked to 4 GB not being enough for running the game with textures on high, as that already consumes more than 4 GB of VRAM even if everything else is set to low/medium. So while doubling the VRAM does not make the card any faster, it sure does wonders with the insane detail levels of newer games. But tbh it's kind of a letdown when using 1080p, as all this detail gets lost at that resolution anyway. For instance, Crysis 3 still looks just as gorgeous as HZD but only consumes half the VRAM on ultra 🤨.

On ultra, which my previous R9 390 could run at 45 to 58 fps, the R9 290X, which uses the same GPU, drops hard to around 11 fps, so welp 🥺. Sadly, AMD has dropped all support for the GCN series of cards, so there won't be any newer drivers either. I might sell the R9 290X and see if I can get myself an RX 580 instead, as it is a little faster than this card and 8 GB models are easier to get hold of without being completely ripped off.
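The "high textures alone exceed 4 GB" point can be sanity-checked with back-of-the-envelope VRAM math. A sketch in Python; the texture count, sizes, and the 1-byte-per-texel compressed format are made-up illustrative assumptions, not figures pulled from HZD:

```python
# Why "textures on high" can blow past a 4 GB card: rough VRAM math.
# Scene contents below are hypothetical, purely for illustration.

def texture_bytes(width: int, height: int, bytes_per_texel: float,
                  mipmapped: bool = True) -> float:
    """Memory for one texture; a full mip chain adds roughly 1/3 extra."""
    base = width * height * bytes_per_texel
    return base * 4 / 3 if mipmapped else base

GIB = 1024 ** 3
# Hypothetical scene: 600 distinct 2048x2048 material textures in a
# block-compressed format (~1 byte per texel), each with a mip chain.
scene = 600 * texture_bytes(2048, 2048, 1.0)
print(f"~{scene / GIB:.2f} GiB just for material textures")
```

Add render targets, geometry, and shadow maps on top of that, and a 4 GB card starts evicting textures, which shows up as exactly the pop-in and blur described above.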


I'll go for a 1070 Ti instead; the RX 580, while a little faster at 1080p, gets overtaken at higher resolutions by my current card, and if it had enough VRAM it would murder it xD.

Also, these cards have recently dropped in price here in Denmark, so I can actually afford one now.

Kinda shocking to think that a card that is now almost 10 years old is still able to compete with much newer GPUs...

The R9 290X's performance today is comparable to a GTX 1060; the previous gen, it was comparable to the GTX 970 and could sometimes beat even a non-Ti GTX 980. It was faster at high resolutions than the GTX 780 Ti and the Titan, which came out about the same time this card was made. Makes one wonder what could have been accomplished if they had used a die shrink and optimized the architecture a bit more.

 


https://www.techpowerup.com/gpu-specs/radeon-680m.c3871

This is just an estimate and should be taken with a grain of salt, but you can get an idea of how the top Rembrandt APU stacks up against your current GPU.

The Radeon 680M should be faster than the GTX 960, Radeon HD 7970, GTX 1050 Ti, and GTX 770.

My guess is that the next-gen Phoenix APU (top CU model) will be at least 33% faster, which would put it in the ballpark of the R9 290X and GTX 970.


Hmm, aye, though on that note it would take considerably more for me to go with an APU atm, as I would have to ditch around 70% of my current hardware, which is rather old but still holds up pretty well regardless.

It consists of a Core i7 3930K CPU, an ASUS X79 Extreme mainboard, and 64 GB of Kingston black DDR3 memory.

For a current-gen APU I would need to replace all of these :S


I would go as deep into AM5 as possible: skip Rembrandt, skip Phoenix, and look for Strix Point or later, 3+ years from now. The CPU performance uplift ought to be substantial, since a 5700G is already 2-3 times faster than an i7-3930K.

DDR6 is in development, and its introduction could mark the end of the AM5 socket... in 2027?


I have had an eye on the Ryzen 5000 series; still, I would need at least a new board and RAM as well as the CPU/APU itself.

My 3930K is a bit long in the tooth but still quite capable, depending on what you throw at it, of course. Strangely, I see the 5000 series now also defaults to 6 cores / 12 threads?


Got hold of a GTX 980 Ti. Always wanted one of those beasts when it came out; sadly, at the time I was poor as dirt, so it never happened. So far it has taken any game I could throw at it and run quite well indeed. It only has 6 GB of RAM, though it did help somewhat with HZD, which I can now run on ultra at around 60 fps at 1080p. I somehow doubt this card can hold that at 2K, or god forbid even 4K, but at least it should still be playable at 2K.


The 1070 Ti and 1080 Ti models are down in price now, at least here in Denmark, so if you need to upgrade and don't have the dough for the newer models, it might be time. Just keep in mind that atm energy and food prices are skyrocketing, so make sure you have enough put aside for emergencies.


  • 2 weeks later...

 

Pulling twice as many watts to barely beat the 5950X in Blender is hilarious. That being said, we seriously owe Intel a massive thank-you for bringing down the price of Zen 3, and specifically the 5950X, by overtaking it in the gaming space by a small but meaningful margin, since gaming tends to dictate pricing for consumer products.
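The "twice the watts to barely win" complaint is really a performance-per-watt statement, and the gap is easy to quantify. A quick sketch; the render times and package-power figures below are hypothetical placeholders, not measured Blender results:

```python
# Performance-per-watt comparison sketch. Inputs are hypothetical
# placeholders, not benchmark data.

def perf_per_watt(render_seconds: float, package_watts: float) -> float:
    """Higher is better: work rate (1 / render time) per watt drawn."""
    return (1.0 / render_seconds) / package_watts

# Assumed scenario: the rival chip renders ~5% faster but pulls 2x the power.
zen3 = perf_per_watt(render_seconds=100.0, package_watts=140.0)
rival = perf_per_watt(render_seconds=95.0, package_watts=280.0)
print(f"efficiency ratio (Zen 3 / rival): {zen3 / rival:.2f}x")
```

Under those assumed numbers, a ~5% win at double the power still leaves the winner nearly half as efficient per watt, which is the whole joke.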


Welp, yeah, that sounds like a no-go 🤣. On a side note, after I got the 980 Ti I delegated my R9 290X to an MSI Z87-GD65 Gaming board with an i7 4770K and 16 GB of RAM. This is a newer CPU than my old i7 3930K, and all tests say it should be faster, but... I ran Kombustor and I'm getting 11 fps Oo, where I get more than double that with my old i7 3930K in the DX12 benchmark, huh?!? Not really sure what gives. Though I know the older i7 3930K has a better multicore profile, the 4770K should murder it in single-core performance, but that ain't happening; in fact, it gets utterly destroyed in both multicore and single-core.
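One hedged way to think about the multicore side of that result: aggregate throughput is roughly cores times per-core speed, so six Sandy Bridge-E cores can out-total four Haswell cores even when Haswell is faster per core. The ~18% per-core advantage below is a rough assumption, not a Kombustor measurement, and this model does not explain the single-core loss, which smells more like a configuration or thermal issue:

```python
# Simple throughput model: total multithreaded performance is
# approximately cores * per-core speed. Numbers are assumptions.

def aggregate_throughput(cores: int, per_core_perf: float) -> float:
    """Idealized multithreaded throughput, ignoring memory/turbo effects."""
    return cores * per_core_perf

i7_3930k = aggregate_throughput(cores=6, per_core_perf=1.00)  # baseline
i7_4770k = aggregate_throughput(cores=4, per_core_perf=1.18)  # assumed IPC gain
print(f"3930K vs 4770K, multithreaded model: {i7_3930k / i7_4770k:.2f}x")
```

Real benchmarks deviate from this idealization (quad-channel vs. dual-channel memory on X79 alone can widen the gap), but it shows why "newer" does not automatically mean "faster" in threaded workloads.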


If you want to grasp the Alder Lake efficiency situation, you should look into the Ryzen 5 5600X vs. Core i5-12400 and Rembrandt vs. mobile Alder Lake at low TDPs.

As AMD moves to a new socket, it has seen what Intel is doing to top the charts, and will raise the top TDP from 105W to 170W:

https://hothardware.com/news/amd-ryzen-7950x-specs-leak-170w-tdp-16-cores

And why not? If you are a 16-core user, you can probably afford a good cooler.


I already have one of the best (if not the best) CPU coolers out there, at least on air :) a Noctua NH-D15 with dual fans. It keeps my old Sandy cool even though it is overclocked quite a bit, running 5.2 GHz, and it never hits 80 degrees, though it gets close sometimes.

I doubt any of the newer CPUs go this high when overclocked :P but they most likely have it beat in everything else, hehe.


Indeed. Sadly, my old beast is anything but manageable in the size department; I dread the day I have to move again...

At least it has kept up pretty nicely for a PC from around 2013 and is still usable, though it is far from the fastest out there anymore.


2 hours ago, kano said:

The trouble with excessive TDP though is if you want to build a small form factor/ITX computer for travel, you are limited in the size of cooler and airflow.

So if they end up offering only a single 16-core model with the heightened 170W TDP, you could power-limit/underclock it with Ryzen Master, the BIOS, or other utilities.

BTW, rumor has it that ITX motherboards will not be able to use an X670 chipset, only B650:

https://wccftech.com/amd-next-gen-ryzen-7000-raphael-zen-4-cpu-rumors-computex-announcement-x670-flagship-chipset-with-ddr5-pcie-5-support-rdna-2-igpu/

https://wccftech.com/amd-5nm-zen-4-ryzen-7000-raphael-cpusmass-production-april-2022-rumor/

