The Dark Mod Forums

2016+ CPU/GPU News


jaxa


wtf... the second cable was also the wrong type; the third one worked and the board runs again 🤯.

Corsair has some explaining to do, I reckon, since all my current PSUs are from them and the only indicator that a cable is the wrong type is a small badge printed on the connector marking it as either Type 3 or Type 4 (both fit the modular units, but only one will work). The models are a 750 W CW and a 1000 W HX; the HX was the one I swapped in, and it's a Gold-certified PSU with a 10-year warranty. Despite the type number the cables also look identical, except the Type 4 has one wire mounted differently and is apparently the only one that fits the HX model. Luckily the wire that is mounted differently has no connection inside the 1000 W PSU, so that's a plus, as otherwise it would probably have damaged either the board or the PSU, but damn... 💀


On 1/30/2024 at 3:03 PM, nbohr1more said:

Yep, it's a race:

Who will win:

Silicon-graphene hybrid (already demoed high-volume, low-defect production; not sure about remaining obstacles)

Boron arsenide (still has high defect rates)

High-NA EUV (uses silicon, expected to have medium to high defects initially; will probably be adapted to use alternate substrates like those above, but that will take more time)

Looks like we have a more mundane scenario.

 

TSMC is gonna do Gate-All-Around nanowire/nanosheet FETs (the successor to FinFET) at 2nm

vs

Intel doing 1.8nm (18 angstroms) with High-NA EUV, but at least a year after TSMC starts.

 

Both options are still risky, so who knows whether either will experience delays. TSMC took the safer option.

Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


The big problem with that is that we're nearing the point where electrons can no longer be confined reliably (quantum tunnelling starts to dominate at those scales). I guess they need to speed up development of light-based computers or get smart with auto-reconfiguring gateways inside processors.


  • 2 weeks later...

Well, some good news for those who have a 2080 Ti and no money to upgrade to the 4000 series.

Alan Wake 2 runs at > 70 fps at high settings in 4K with the FSR3 mod 😃. You need to set "m_bLensDistortion": false in AppData\Remedy\Renderer.ini to get rid of a ghosting effect on the main character when moving, but other than that it looks quite nice.
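If you'd rather not edit the file by hand, here is a minimal Python sketch that flips the flag. The path and the JSON-style "key": value line format are assumptions taken from the post above, so treat it as a sketch and keep the backup it makes:

```python
# Sketch: turn off lens distortion in Remedy's Renderer.ini.
# Assumes a line like  "m_bLensDistortion": true,  exists in the file
# and that the path below matches your install (both are assumptions).
import os
import re
import shutil

ini_path = os.path.expandvars(r"%APPDATA%\Remedy\Renderer.ini")  # adjust if needed

shutil.copy2(ini_path, ini_path + ".bak")  # keep a backup before touching it

with open(ini_path, "r", encoding="utf-8") as f:
    text = f.read()

# Flip only the lens-distortion flag, leave everything else untouched.
text = re.sub(r'("m_bLensDistortion"\s*:\s*)true', r"\1false", text)

with open(ini_path, "w", encoding="utf-8") as f:
    f.write(text)
```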

The FSR3 mod is here -> https://www.nexusmods.com/site/mods/738

An alternative is this mod -> https://www.nexusmods.com/site/mods/757, which also supports Intel's Xe (Arc) models.

There is also one for slightly older AMD cards, but it is a paid mod and the author got himself into quite a shitstorm over some rude behaviour, so I won't link it here; look for LukeFZ if you really must.

The other FSR3 mods require an RTX or Intel Xe card, so for Nvidia that means everything from the 20 series and 30 series.

The LukeFZ mod also supports the older GTX 10 series.

One thing to note is that your card must be able to manage at least 30 FPS on its own, otherwise it will look like crap, and if movement is laggy without frame generation it will still be laggy with it (this also applies to cards that support it natively, like the 40 series).


  • 4 weeks later...

OK, so after getting myself an RTX 3070 I'm left wondering a bit about all the FUD on the net.

Elitist users claim the 3070 can't do 4K (debunked: it handles 4K just fine, but you need to lower the texture resolution in some titles so you don't overshoot the frankly rather low amount of VRAM -> 8 GB).

There's some back and forth on the 2080 Ti: some claim the 3070 is faster, others claim the 2080 Ti is. (From my own experience the 2080 Ti is a bit faster in 4K, while the 3070 is a bit faster at lower resolutions.) If you play exclusively in 4K, go for the 2080 Ti -> reason: it has more VRAM, 11 GB vs 8 GB. That might not sound like a huge deal, but the extra 3 GB helps a lot with ultra-high texture resolutions. Also debunked: the claim that the 3070 uses newer DLSS features. It does not; the 2080 Ti supports exactly the same DLSS features the 3070 does, and it even supports DLSS 3 minus the frame-generation feature. Some claim the 3070 uses newer tensor cores which are faster; well, if they are, I don't see it... the 2080 Ti has roughly three times as many tensor cores as the 3070, while the 3070 has around 1500 more CUDA cores, hmm ???

The real reason I think the 3070 got so popular is that it delivered close to the performance of the insanely overpriced 2080 Ti. I can't fault people for that choice, but I would like some realism in the comparisons, not something based on price alone. The 2080 Ti was a high-end card back when it was new, while the 3070 is a mid-range card at half the price with at least comparable performance, but it lacks enough VRAM to play all titles at 4K with everything cranked to the max.

I'm playing Horizon Forbidden West on the 3070 at the moment in 4K with everything on max except texture resolution, which I keep on high, and I get > 80 fps with the frame-generation mod and around 45 fps without it (DLSS is flaky in this game though). The 2080 Ti in the same game at 4K gets around 100 fps with the frame-generation mod and 55 fps without it, with texture resolution at the highest setting.


It was a relatively strong mid-range card that obviously has less VRAM than it should. And it's still funny that the RTX 3060 packs 12 GB while the RTX 3080 copes with 10 GB.

Game devs would like PC users to have 12-16 GB VRAM, but they'll support 8 GB and do little tricks like downgrading the textures automatically.


Aye, the RTX 3060 was another weird one: it only has a 128-bit bus, which is too narrow to make effective use of 12 GB, so the extra VRAM did not really help at higher resolutions. Sadly they decided to continue with the same, eh, "mistake" on the RTX 4060 Ti 16 GB model 😔. I'd call that deception, making users pay more for a card that isn't even rated for 4K... sadly.

The 16 GB 3070 model was scrapped by Nvidia, I guess because it would have been a contender for the much higher priced 3080 non-Ti, as it has a 256-bit bus and hence would have been a capable 4K card.

The 3060 Ti 8 GB was a much better card, sadly.

https://www.techradar.com/reviews/evga-geforce-rtx-3060-black-xc


Posted (edited)
24 minutes ago, revelator said:

Aye, the RTX 3060 was another weird one: it only has a 128-bit bus, which is too narrow to make effective use of 12 GB, so the extra VRAM did not really help at higher resolutions. Sadly they decided to continue with the same, eh, "mistake" on the RTX 4060 Ti 16 GB model 😔. I'd call that deception, making users pay more for a card that isn't even rated for 4K... sadly.

 

The 3060 has a 192-bit bus (cut to 128-bit for the maligned 8 GB model), and the gimped cards (like the 7600 XT with 16 GB on 128-bit) can definitely use the extra VRAM in some scenarios.

https://en.wikipedia.org/wiki/GeForce_30_series#Desktop

Edited by jaxa

My biggest gripe with the 2080 Ti is that it only has HDMI 2.0b, so it can't do 4K above 60 Hz over HDMI 🤬. Luckily the DisplayPort output can, and since I bought a monitor with DisplayPort it happily chugs along at 4K 145 Hz 😉. It's an Asus ultrawide 32", so it matches my board, which is also from Asus, and as a bonus the graphics card is from Asus too; just stay the f... away from Asus software 😂, the hardware is quite good though. The ultrawide screen has its own problems with some games (letterboxing), but most of those can be worked around. In HZD the cutscenes are always letterboxed, which is annoying since it has a load of them, but the game itself runs fine in ultra-widescreen.

My 3070 runs on an older Philips TV I had, "1080p", eww, but I tried it on my 4K screen just for comparison.

Sometime in the near future I'll ditch the old Philips for a 4K TV, since I mainly use my second PC as a streaming device (I don't have cable TV).


So some models have a 128-bit bus and others 192-bit ??? Pretty much all the reviews state the bus is only 128 bits wide, though, so did Nvidia up the ante on some later 12 GB models to battle the bad reviews 🤔. The table also seems weird, as the Ti models all have a 256-bit bus, even the 3060 Ti, but there are no 12 GB 3060 Ti models as far as I can see.


No, the 192-bit RTX 3060 12 GB came first. The cut down 8 GB model came over a year and a half later, and probably in small numbers because nobody talks about it much other than "don't get it, it's 20-30% slower".

The 3060 Ti has had 8 GB from the start, although it looks like they made a GDDR6X version. They would have had to change the bus width to accommodate 12 GB. There were rumors of products like a 3070 16 GB, a 3080 20 GB and so on, but they never materialized outside of engineering samples.

If you think things are confusing now, just wait until 3 GB GDDR7 chips materialize within a couple of years. We could see 12 GB cards on a 128-bit bus, 9 GB on 96-bit, and so on.
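For reference, the arithmetic behind those capacities: each GDDR chip hangs off a 32-bit slice of the bus, so the bus width fixes the chip count, and chip count times per-chip density gives the total. A quick sketch (the 2 GB and 3 GB densities are the ones discussed here; the rest is just illustration):

```python
# VRAM capacity implied by bus width and per-chip density.
# Each GDDR chip occupies a 32-bit channel on the memory bus.
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32
    return chips * chip_density_gb

for bus in (96, 128, 192, 256):
    for density in (2, 3):  # current 2 GB chips vs upcoming 3 GB GDDR7
        print(f"{bus:3d}-bit bus, {density} GB chips -> {vram_capacity_gb(bus, density)} GB")
# 128-bit with 3 GB chips -> 12 GB, 96-bit -> 9 GB, as mentioned above.
```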


Yeah, it's a jungle out there 😆, but a 12 GB card on a 128-bit bus would only be viable with DLSS, and even then only just 🙄.

But there will probably be plenty of options, as you mention, though I'd still go for something that can at least drive that amount of VRAM without workarounds. Strangely, the old AMD R9 390 had a 512-bit bus and could probably have accepted 32 GB of VRAM, but the card is too slow to do 4K in modern titles. Even so, the 8 GB model actually ran quite nicely in games such as the first Horizon in 4K, but it would probably choke and die on Forbidden West 🤣. A minor wtf moment is Crysis Remastered: it runs at 4K in the "Can it run Crysis?" mode with a gazillion fps 😵. Looks quite pretty too, but I suspect this is a bug.


1 hour ago, revelator said:

Yeah, it's a jungle out there 😆, but a 12 GB card on a 128-bit bus would only be viable with DLSS, and even then only just 🙄.

Well, the 7600 XT is considered sus for putting 16 GB on 128-bit, but it clearly works in some scenarios. Also IIRC GDDR7 will have about +30% bandwidth over GDDR6X right out of the gate, rising to about +100% as the generation progresses. Big caches (Infinity Cache L3 for AMD, lots of L2 for Nvidia) have made smaller bus widths more viable, and I think they have improved compression techniques and other factors over time to help alleviate bandwidth demands.
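To put rough numbers on the bus-width/bandwidth trade-off: peak bandwidth in GB/s is just the bus width in bits divided by 8, times the per-pin data rate in Gbps. The data rates in this sketch are ballpark assumptions for illustration, not official specs:

```python
# Peak memory bandwidth from bus width and per-pin data rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Assumed, ballpark per-pin rates purely for illustration.
configs = [
    ("192-bit GDDR6 @ 15 Gbps (3060-class)", 192, 15.0),
    ("128-bit GDDR6 @ 18 Gbps (7600 XT-class)", 128, 18.0),
    ("128-bit GDDR7 @ 28 Gbps (hypothetical)", 128, 28.0),
    ("256-bit GDDR6X @ 21 Gbps", 256, 21.0),
]
for name, bus, rate in configs:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# Note: a large L2 / Infinity Cache raises effective bandwidth for cache hits,
# which is part of why narrower buses have become more viable.
```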

There's already a little bit of analysis of what we can expect to see in RDNA3+ and RDNA4, very technical though:

https://chipsandcheese.com/2024/02/04/amd-rdna-3-5s-llvm-changes/

I am eager to see if AMD is bold enough to do (or allow AIBs to put) 32 GB on the top RDNA4 card, which has long been rumored to be slower than the 7900 XTX in raster, but will hopefully beat it in raytracing and other areas such as AI/ML perf. And I think that card will have a 256-bit bus and 16 GB memory normally.


Sadly the upcoming 8xxx series from AMD will only cover the midrange, at least according to leaks (grain of salt, maybe?).

Well, it would be something quite different, that's for sure :). Not sure whether 32 GB of VRAM would actually help (what is the max texture size today?); it might if they really go nuts with the detail level in upcoming titles, but I suspect that will take longer, as the game companies don't want to alienate players with less VRAM.

Of course it will come at some point, but I don't see it in the near future.

The 2 GB and 3 GB VRAM chips might actually make a dent in the bus-width war.

What the hell happened with HBM memory ??? The old Fury cards could actually do 4K no problem with only 4 GB of VRAM because the HBM was so blazing fast.


It should be around $400-600, a price bracket that was once not considered mid-range, delivering raster performance similar to the 7900 XT but likely with better raytracing performance.

We can only assume RDNA4 tops out at 16 GB, but 32 GB would be a funny option if they go for it. I think you can create scenarios where games could use as much or more than 24 GB in 4K, but it's obviously rare and largely unneeded. It would be a good amount of VRAM for AI stuff, though the sky's the limit there and 32 GB isn't going to be enough for some LLMs.

HBM memory is expensive to make and in huge demand for AI accelerators, enterprise GPUs, and other enterprise products (such as Intel Sapphire Rapids CPUs aka Xeon Max with HBM). I think it's as much as 5x more expensive per gigabyte than GDDR6X/7. So while it would be great for consumer gaming GPUs, with major bandwidth and efficiency benefits, AMD and Nvidia are going to put it in $10,000 to $40,000 products instead.

Years ago there was talk of making cheaper, less capable versions of HBM for the mass market, but it never materialized:

https://www.tweaktown.com/news/53536/low-cost-hbm-way-hit-mass-market-soon/index.html

If the AI bubble pops, we might see some efforts to pivot back to consumer products. Aside from GPUs, probably every CPU should eventually be packing a big L4 cache utilizing HBM, DRAM, or bespoke 3D layers by the late 2030s.


Yeah no, lol. It's all getting sucked in by the AI industry. How much does 24 GB of HBM cost anyway? It could be $600 or something, which doesn't sound like much when you consider the MSRP of an RTX 4090, but they are making a killing with those margins.

Well, I've just managed to upgrade to an i3-10105 system, with the possibility of a future GPU upgrade (need to look for low-profile cards), for $75. And I'm sticking in 64 GB of RAM that I happened to have lying around. This is likely to be my new TDM system if everything works properly. And I bought not one but two of these things, with the other destined for media duty. I stuck the 8 GB from one in the other one.

I guess I could end up putting an 11th gen Rocket Lake chip in it, but I'm in no particular hurry to do that.

INB4 I'm an unironic buyer of the RTX 3050 6 GB.


Heh, yeah :). I wonder if Trump has stock in Nvidia? All those lawsuits... 😂. I'm not even so sure it will just blow over, tbh; here in Denmark we already use AI heavily. One example, of all things, is that it now governs taxes 🤣. Well, in the world's most tax-heavy country, which just loves tech, I'd say that's a no-brainer, but c'mon...

Not really feeling the need for more upgrades for a good while either; my machines now run the latest stuff no problem, and I recently acquired a motorized desk for my aching back (well, it really started to ache after I had to lift the desk to the first floor... damn, that thing is heavy, 65 kg). Nearing 57, with a back that was broken in two places, which required an operation with two artificial discs and a lot of screws, and with heavy nerve damage because it took them 10 years to discover it was broken, I can only say ouch.


This was a great purchase for me, it covers all my needs, and it gives me immunity to any massive chip shortage caused by geopolitics. ;)

It appears to be running a core at 4.2 GHz basically forever even with low usage, but it is quiet most of the time. Clearly faster than the i5-6600T despite the years of quad-core stagnation. The higher TDP and hyperthreading really help.

I doubt there's any noticeable improvement going from HD 530 to UHD 630 iGPU, but it does gain the better H.265/VP9 HW decode that came immediately after Skylake.


Almost sounds like the Haswell turbo core feature? My 6950X also has one core that runs at a higher speed than the rest (3.8 GHz), but otherwise it behaves temperature-wise. If I link the cores, temperatures rise somewhat, but it does OC up to 4.5 GHz, which is not too bad for the 6950X; then again, temps rise to 55°C idle and damn near 80°C when it has to do something 😆.

 



