The Dark Mod Forums

2016+ CPU/GPU News


jaxa


Speaking of TV monitors and 1360x768 resolution, I never managed to get anything but AMD cards to actually use that resolution natively. I was always stuck with 1280x720 and upscaling, or 1920x1080 and downscaling, with my Toshiba TV. Which is a choice between hanging and shooting if you want to use the system for anything more than watching video.

 

Did that change with recent Nvidia drivers, or is it an oddity of my TV?

What model Toshiba?

 

Did you "disable DPI scaling" in the (right-click) compatibility tab of TheDarkMod.exe ?

 

Please post your Darkmod.cfg

Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


Heh, my old rants about PowerVR are starting to echo into the present.

 

Apparently Nvidia has had a rendering efficiency edge since the Maxwell architecture, because they've secretly been doing "Tile Based Deferred Rasterization" (akin to PowerVR's approach) in hardware. No wonder AMD has been having such a hard time.
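For anyone wondering what tile-based binning actually means, here's a toy CPU-side sketch of the control flow (my own illustration under simplified assumptions, nothing like the real hardware): triangles are first sorted into screen tiles, and each tile is then rasterized to completion while its chunk of the framebuffer sits in fast on-chip memory.

```cpp
// Toy tile-based binning rasterizer skeleton (illustrative only).
#include <algorithm>
#include <vector>

struct Tri { float x[3], y[3]; };                     // screen-space triangle

constexpr int kTile = 16, kTilesX = 80, kTilesY = 45; // 1280x720 screen

int main()
{
    std::vector<Tri> scene = { {{10, 300, 150}, {20, 40, 400}} };
    std::vector<std::vector<const Tri*>> bins(kTilesX * kTilesY);

    // Pass 1 (binning): assign each triangle to every tile that its
    // screen-space bounding box overlaps.
    for (const Tri& t : scene) {
        int x0 = std::max(0, (int)*std::min_element(t.x, t.x + 3) / kTile);
        int x1 = std::min(kTilesX - 1, (int)*std::max_element(t.x, t.x + 3) / kTile);
        int y0 = std::max(0, (int)*std::min_element(t.y, t.y + 3) / kTile);
        int y1 = std::min(kTilesY - 1, (int)*std::max_element(t.y, t.y + 3) / kTile);
        for (int ty = y0; ty <= y1; ++ty)
            for (int tx = x0; tx <= x1; ++tx)
                bins[ty * kTilesX + tx].push_back(&t);
    }

    // Pass 2 (per-tile raster): depth/color for one tile stay in a small
    // on-chip buffer while all of its triangles are drawn, which is where
    // the external memory bandwidth savings come from.
    for (int i = 0; i < kTilesX * kTilesY; ++i)
        for (const Tri* t : bins[i])
            (void)t; // rasterize *t clipped to tile i here
}
```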

 

Well, now AMD's "Vega" architecture will be doing tile-based binning in its raster hardware too:

 

http://www.anandtech.com/show/11002/the-amd-vega-gpu-architecture-teaser/3

 

The slowdown of Moore's Law and the increasing importance of Mobile (PowerVR is killing everyone in Mobile) must have pressured them to adopt these changes.

 

Now, how long will it take these guys to include ray-tracing hardware, like PowerVR is currently doing:

 

https://imgtec.com/blog/hybrid-rendering-for-real-time-lighting/

 

So I guess this raises the question: is it now worthwhile to consider moving TDM to a deferred rendering approach, now that GPU hardware is already doing that behind the scenes?

 

(Well, probably still "yes" to improve legacy hardware support, unless someone on the TDM team is rich enough to hand out new GPUs)
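For readers who haven't met the term: a deferred renderer rasterizes the scene once into a "G-buffer" of surface attributes, then applies lights per pixel, instead of re-drawing geometry per light the way TDM's Doom 3-era forward renderer does. A minimal CPU-side sketch of the idea (the names and G-buffer layout are my own, not TDM's):

```cpp
// Minimal deferred-shading sketch (illustrative only, not TDM code).
#include <array>
#include <cstddef>

struct GBufferTexel {
    float nx, ny, nz;   // surface normal
    float r, g, b;      // albedo
    float depth;        // for reconstructing position
};

struct Light { float x, y, z, intensity; };

int main()
{
    constexpr std::size_t W = 320, H = 180;
    static std::array<GBufferTexel, W * H> gbuf{}; // geometry pass output
    static std::array<float, W * H> lit{};         // final luminance

    const Light lights[] = { {0, 2, 0, 1.0f}, {5, 1, 3, 0.5f} };

    // Geometry pass (elided): rasterize the scene once, filling gbuf.

    // Lighting pass: cost scales with pixels x lights and never touches
    // the scene geometry again, unlike a per-light forward renderer.
    for (std::size_t i = 0; i < W * H; ++i)
        for (const Light& L : lights)
            lit[i] += L.intensity * gbuf[i].r; // stand-in for a real BRDF
}
```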


So I guess this raises the question: is it now worthwhile to consider moving TDM to a deferred rendering approach, now that GPU hardware is already doing that behind the scenes?

[smartass mode on] Since the Pentium Pro, x86 CPUs have been running RISC micro-ops behind the scenes, so should we move to a RISC compiler? :blush:


No significant graphics improvements in Kaby Lake's iGPU. Only better hardware support for bleeding-edge video codecs.

That, and it's not much extra grunt over the older 4th-gen series like my 4790K, so I am not missing anything. The only upgrade I will do in the near future is to get a mobo with 2x PCIe 3.0 slots and an onboard M.2 slot running at PCIe x4 or faster. My current mobo (MSI Z97-G43) was a midrange board and will do fine as a backup/spare, or get sold to a mate.

 

And on the subject of upgrades -

 

- http://forums.thedarkmod.com/topic/18597-illeagal-hardwaresoftware-manipulation-a-sign-of-things-to-come/


Only Haswell(?)/Skylake+ iGPUs will get complete Vulkan support, I suppose.

So it's not about performance, but about API compatibility.

Edited by lowenz

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.


  • 1 month later...

It's time to talk AMD's Ryzen.

 

If for some reason you do decide to buy an Intel CPU, watch out for price cuts:

 

http://hothardware.com/news/intel-reacting-to-amd-ryzen-apparently-cutting-prices-on-core-i7

 

Three 8-core Ryzen chips will launch first. All have hyperthreading (16 threads). Cheapest is $329.

 

Ryzen supposedly increased instructions per clock by 52% rather than the long-promised 40%.

 

http://www.anandtech.com/show/11143/amd-launch-ryzen-52-more-ipc-eight-cores-for-under-330-preorder-today-on-sale-march-2nd

 

More will be known when they go on sale on March 2nd.

Edited by jaxa

But doesn't it have an iGPU?

Some of the models do, but like a lot of people I am looking at this from an "I am not upgrading to Kaby Lake as I am not upgrading to Win10" perspective. I am aware Ryzen doesn't support Win7/8, but I am hoping AMD will be less aggressive about outright disabling the older OSes, like I have seen on some laptops.


The 3 models launching on March 2nd do not have an iGPU. It's not clear which desktop models will include an iGPU (i.e. which ones can be called APUs). Maybe some of the quad- and hex-core models. The laptop versions coming much later in the year should be APUs.

 

Ryzen's "lack of support" does not mean that Win 7/8 can't run on them. Feel free to contradict this, if you can.

 

One thing I forgot to mention is that the cheapest one of the pack, the $329 R7 1700, has a 65 W TDP instead of the 95 W TDP the $400 and $500 models have.

 

Ryzen Reddit AMA with Lisa Su: March 2, 11:30am to 12:30pm CT at reddit.com/r/amd

Edited by jaxa

Ryzen's "lack of support" does not mean that Win 7/8 can't run on them. Feel free to contradict this, if you can.

It's more a case of the few Skylake/Kaby Lake systems I have run across and tried to install Win7/8 on having been illegally & deliberately hobbled -

  • Booting from cold to desktop on a Z170/Skylake system took 3-4x longer compared to the previous gen (Z97/Z77): 45-60 s versus 15-20 s.
  • Trying to get a Kaby Lake-based Acer laptop to dual-boot Linux/Win8.1 refused to work until the hidden legacy mode was enabled. And even then, once into Win8.1, I found there was no driver support for the onboard HD 620 GPU; as the laptop came with an Nvidia GTX 950 I tried disabling the Intel one, but no joy. And because the Intel 2D was forced to work in tandem with the Nvidia 3D, I was not able to get Windows to display correctly in 2D.

What I won't stand for is not having GFX driver support for Win7/8, which is what Intel are doing.


I really don't give a shit for iGPUs because thus far they have had crap perf. What I won't stand for is not having GFX driver support for Win7/8, which is what Intel are doing.

So the Win10/Micro$oft leaning-on-vendors bullshit continues, sorta -

 

- http://www.fudzilla.com/news/42957-amd-snubs-windows-8

 

As of the latest driver, 32-bit versions of Windows 7/8 won't be supported. AMD are shooting themselves in the foot here, as Nvidia have supported 32-bit and older cards for years. I really hope that AMD don't go down the same route with the Ryzen iGPUs and only offer Win10 support.


Welp, Ryzen is still a bit disappointing. It classes well as a workstation CPU but fails at besting Intel at "GAMING".

It seems they still skimped on floating-point hardware, probably gambling that this work will be done on the GPU.

 

The last question: Will the APU version (Raven Ridge) smartly offload Floating Point to the iGPU?

 

They could absolutely murder top-end i7s if they did that.

 

My prediction: Nope.

 

They are gonna keep trying to evangelize developers to stop using the CPU for that type of work.

Maybe this is why they are "partnering" with Bethesda. They'll need to get Crytek and UE4 into this game if that's their gambit. Can't live on id Tech 6 alone.


Let's see if Raven Ridge even comes out in 2017.

 

Looks like people who do video editing will love the Ryzen 7. Ryzen smoked Intel in many multithreaded benchmarks, which made the AnandTech review look great, since they decided to save the gaming benchmarks for a later review for some stupid reason.

 

Suddenly Kaby Lake (no IPC gain) doesn't look so bad, since it generally managed to boost clocks by, what, 200 MHz? Which can make a big difference in some cases (at ~4 GHz that's roughly a 5% uplift).

 

I'm also wondering whether there will be games in the near future that an 8-core Ryzen handles well. As we know, the PS4 and Xbone have APUs with 8 cores, 6-7 of which should be usable by games, and new versions like the PS4 Pro and the Xbox equivalent are coming out in the next year. That means devs will be targeting a wide range of 8-core chips (and those chips are also made by AMD, although they are lower power). Intel is rumored to be preparing a mainstream 6-core chip, but AMD would still have the lead in core/thread count at a reasonable price. Will PC games start to use more than 4 cores?

Link to comment
Share on other sites

http://www.pcworld.com/article/3176191/computers/ryzen-review-amd-is-back.html?page=3

 

“As we presented at Ryzen Tech Day, we are supporting 300+ developer kits with game development studios to optimize current and future game releases for the all-new Ryzen CPU. We are on track for 1,000+ developer systems in 2017. For example, Bethesda at GDC yesterday announced its strategic relationship with AMD to optimize for Ryzen CPUs, primarily through Vulkan low-level API optimizations, for a new generation of games, DLC and VR experiences,” Taylor said. “Oxide Games also provided a public statement today on the significant performance uplift observed when optimizing for the 8-core, 16-thread Ryzen 7 CPU design—optimizations not yet reflected in Ashes of the Singularity benchmarking. Creative Assembly, developers of the Total War series, made a similar statement today related to upcoming Ryzen optimizations.
“CPU benchmarking deficits to the competition in certain games at 1080p resolution can be attributed to the development and optimization of the game uniquely to Intel platforms—until now. Even without optimizations in place, Ryzen delivers high, smooth frame rates on all ‘CPU-bound’ games, as well as overall smooth frame rates and great experiences in GPU-bound gaming and VR. With developers taking advantage of Ryzen architecture and the extra cores and threads, we expect benchmarks to only get better, and enable Ryzen to excel at next-generation gaming experiences as well. Game performance will be optimized for Ryzen and continue to improve from at-launch frame rate scores.”
To boil it down: The world of game developers basically develop for two platforms: Intel’s small socket or Intel’s large socket. AMD, as much as it pains the faithful, has been invisible outside of the budget realm, and the results are showing up in the tests. Whether that's what’s really going on I can’t say for sure, and I doubt anyone can at the moment, but it’s at least plausible.

 

 

How much credibility do we want to give AMD's technical marketing department when it claims that games are optimized for Intel CPUs and that performance will recover in the future?

 

And in the quote we see evidence that some game devs will use all 8 cores, as I wondered earlier.

Edited by jaxa

Games are NOT optimized for Intel; they're simply built on Intel systems (a minor bias, but still a bias).

Edited by lowenz


It classes well as a workstation CPU but fails at besting Intel at "GAMING". It seems they still skimped on floating-point hardware.

No good for TDM then compared to a Core i7 3770 or 4790, both of which have excellent FP perf.

How to install Ryzen on Win7

Sweet, and I found the article so interesting I emailed the guy -

 

Hello

 

Regarding your very informative article –

 

 

A while back I tried the Z170 chipset along with a 6700K. The main issue I had at the time was boot times, specifically cold-to-desktop (C2D): no hibernation, just a raw SSD and mobo. My previous and current mobos (Z77/Z97) boot from C2D in under 20 seconds.

 

May I ask what the C2D time was for the Ryzen CPU/mobo combo you tested in the above article...?

 

Kind regards


Welp, Ryzen is still a bit disappointing. It classes well as a workstation CPU but fails at besting Intel at "GAMING".

It seems they still skimped on floating-point hardware, probably gambling that this work will be done on the GPU.

 

The last question: Will the APU version (Raven Ridge) smartly offload Floating Point to the iGPU?

 

They could absolutely murder top-end i7s if they did that.

 

My prediction: Nope.

 

They are gonna keep trying to evangelize developers to stop using the CPU for that type of work.

Maybe this is why they are "partnering" with Bethesda. They'll need to get Crytek and UE4 into this game if that's their gambit. Can't live on id Tech 6 alone.

I suppose MS and motherboard vendors just need some time to tune up Windows and UEFIs, respectively. AFAIK, there is something wrong with the task manager in Win10 when it runs on Ryzen, whereas everything is fine in Win7 (not sure about Win8/8.1).

And yes, AMD have skimped on the SIMD unit in Ryzen, but only in 256-bit AVX mode; the SSE and 128-bit AVX modes both work very fast.
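To make that concrete, here's an illustration with intrinsics (my example, with the usual caveat that n must be a multiple of the vector width): both loops do the same math, but on Zen the 256-bit version is reportedly cracked into two 128-bit micro-ops internally, so it gains little over the SSE loop, whereas on recent Intel cores it can roughly double throughput.

```cpp
// Same arithmetic two ways; compile with -mavx (or /arch:AVX).
#include <immintrin.h>

void add_sse(const float* a, const float* b, float* out, int n)
{
    for (int i = 0; i < n; i += 4) {   // (a) 128-bit SSE lanes
        __m128 v = _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i));
        _mm_storeu_ps(out + i, v);
    }
}

void add_avx(const float* a, const float* b, float* out, int n)
{
    for (int i = 0; i < n; i += 8) {   // (b) 256-bit AVX lanes; on Zen 1
                                       // this op is split into two 128-bit
                                       // micro-ops, so (b) ~= (a) there.
        __m256 v = _mm256_add_ps(_mm256_loadu_ps(a + i), _mm256_loadu_ps(b + i));
        _mm256_storeu_ps(out + i, v);
    }
}
```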

About offloading the FPU to the iGPU: as long as the code uses FPU instructions (x87, MMX, SSE, AVX, FMA, etc.), this will never happen. Anyone who wants to offload FP work to the (i/d)GPU should use special languages and/or libraries: OpenCL, DirectCompute, Vulkan, Direct3D 12, GNM/GNMX with PSSL, etc.

About the Ryzen-based APUs: since they will contain no more than 4 cores (i.e. just one core complex (CCX), not two as in Ryzen 7), there will be no L3<->L3 traffic from tasks migrating between CCXs (Ryzen 7 has two of them). Plus, these APUs will boast a higher CPU frequency (at least 4 GHz in the high-end models, probably even higher), so I personally wouldn't say "Meh" about these APUs :).
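One practical consequence for Ryzen 7 owners: keeping a group of communicating threads inside one CCX avoids that cross-CCX L3 traffic. A minimal Win32 sketch (the logical-CPU numbering here is an assumption; real code should query the topology first):

```cpp
// Sketch: pin a worker to logical processors 0-7, which on an 8-core
// Ryzen 7 with SMT presumably maps to the first CCX (4 cores x 2
// threads). The numbering is an assumption, not a guarantee.
#include <windows.h>
#include <thread>

void worker() { /* hot loop sharing data with sibling threads */ }

int main()
{
    std::thread t(worker);
    DWORD_PTR firstCcxMask = 0xFF;  // logical CPUs 0-7
    SetThreadAffinityMask(t.native_handle(), firstCcxMask);
    t.join();
}
```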

About evangelizing game developers: it's very hard to convince some of them to cooperate, so even Bethesda, with its id Tech 6, Void, and Creation engines, is very good for AMD for now ;). Look at Ubisoft: they just hate everything except Intel and Nvidia, and mind you, Ubisoft uses its own engines in almost all its games. As for Epic & Crytek: since Ryzen/Ryzen2/Ryzen3 and Vega/Navi will be the foundation of the next PlayStation and Xbox, I think they'll cooperate with AMD too, when the time comes.


Hmm.

 

Thinking on this: if video cards had one CPU core on them, the OS could be coaxed to treat that CPU as part of the SMT pool (there are already PCI-E cards that let you add old CPUs like this). Then you could make a custom CPU which is very light on integer and has an FPU hierarchy that includes its own AVX, etc., but also calls out to the GPU cores, which would include some baseline support for the simplest x86 FPU stuff. The only snag here would be getting Microsoft to update their kernel to load-balance threads based on execution type (e.g. if CPU X has much more FP muscle than CPU Y, send the FP-heavy thread there).
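A toy user-space sketch of the kernel heuristic being wished for here (purely a thought experiment; no shipping OS exposes per-core FP capability like this, and all names and numbers are invented):

```cpp
// Thought experiment: route FP-heavy threads to FP-strong cores.
#include <algorithm>
#include <vector>

struct Core   { int id; float fpThroughput; }; // e.g. a GFLOPS estimate
struct Thread { int id; float fpFraction;   }; // share of FP micro-ops

int pickCore(const std::vector<Core>& cores, const Thread& t)
{
    // FP-heavy threads go to the strongest FP core; integer-heavy
    // threads can take whatever is left over.
    if (t.fpFraction > 0.5f)
        return std::max_element(cores.begin(), cores.end(),
                   [](const Core& a, const Core& b)
                   { return a.fpThroughput < b.fpThroughput; })->id;
    return cores.front().id;
}

int main()
{
    std::vector<Core> cores = { {0, 50.f}, {1, 400.f} }; // CPU- vs GPU-ish
    Thread physics{7, 0.9f};
    int target = pickCore(cores, physics); // -> core 1
    (void)target;
}
```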

 

This would solve part of the risk/benefit scenario with customizing this interconnect: the card would benefit both AMD and Intel environments. I'll bet Intel wouldn't like that move too much, though.


As long as there is a bus between the CPU and GPU (even an integrated one), no major CPU vendor will put effort into offloading the FPU directly via x86 code. Even the fastest buses have latencies of thousands of clocks (a ~1 µs PCIe round trip at 4 GHz is ~4,000 cycles), so it's a no-go.

But even if there were no bus between the CPU and GPU (i.e. they were fully integrated into a single die), there would still have to be an ISA that programmers could use to program those GPUs. And since every new GPU generation, even from a single vendor, has a different ISA, I can hardly imagine how different GPU vendors would standardize such an ISA for decades to come.

Edited by MoroseTroll

Right. The idea would really be an APU on a PCI-E board, so the CPU and GPU would exist on the same die.

That would homogenize the behavior for all platforms.

 

As for the ISA? They make new ones all the time, don't they (SSE, SSE2, etc.)? You could offer something akin to that for the new paradigm... but primarily you would support existing ISAs by modifying both the GPU and CPU to better handle sharing the work.

 

I know. A pipe dream, but it seems to be the only logical way to break the barrier here. You can't count on software developers to pick up on this.

