The Dark Mod Forums

Nvidia GeForce RTX 20 Series and Real-Time Ray Tracing


jaxa


Nvidia has announced the upcoming launch of its RTX 2080 Ti ($1000-$1200, September 20), RTX 2080 ($700-$800, September 20), and RTX 2070 ($500-$600, October) GPUs. You didn't read that wrong: the 'R' is for ray tracing.

 

The key feature they are touting is real-time ray tracing using "dedicated" ray tracing (RT) cores. The tensor cores for machine learning are also used to help ray tracing by denoising the result. Here is an example of how that can work:
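As a toy illustration of why denoising matters (this is not Nvidia's actual denoiser, which is a trained neural network running on the tensor cores): with only one ray per pixel the image is noisy, and even a crude neighbour-averaging filter pulls each pixel back toward the converged value.

```python
import random

random.seed(42)

TRUE_BRIGHTNESS = 0.5  # value a fully converged render would reach
N = 10_000             # pixels in our toy one-dimensional "image"

def render_pixel(samples):
    """Average `samples` random rays; fewer samples means more variance."""
    return sum(random.uniform(0.0, 1.0) for _ in range(samples)) / samples

# One ray per pixel: very noisy.
noisy = [render_pixel(1) for _ in range(N)]

# A simple 3-tap box filter stands in for the ML denoiser.
denoised = [
    (noisy[max(i - 1, 0)] + noisy[i] + noisy[min(i + 1, N - 1)]) / 3
    for i in range(N)
]

noise_before = sum(abs(p - TRUE_BRIGHTNESS) for p in noisy) / N
noise_after = sum(abs(p - TRUE_BRIGHTNESS) for p in denoised) / N
print(noise_after < noise_before)  # filtering reduces the average error
```

The real pipeline does the same thing in 2D with a far smarter filter, which is what lets it get away with very few rays per pixel.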

 

 

Nvidia's keynote presentation at Gamescom 2018 included a demo of the Eidos Montreal game Shadow of the Tomb Raider using the real-time ray tracing technique, highlighting improvements made to shadows.

 

Earlier in the year, Microsoft announced a DirectX Raytracing API. Similar improvements are being made to Vulkan.

 

This raises a lot of questions. How and when will AMD respond, for example? But most importantly: could/will a hybrid ray tracing technique ever be applied to TDM?

Edited by jaxa

The RTX 2080s

For this to work as they say, it would have to be a grid of math co-processors, each dedicated to ray tracing a section of the screen, all working together at the same time, something like the way supercomputers such as the Cray-2 work.

These are likely to be bought up by bitcoin farmers, putting them out of the range of gamers or graphics makers.

Anyway, these are likely to run very hot during use. Even with two fans, their temperature at optimal running speed is going to be very high.


IMO the demos were not that impressive, not something that makes me think "I need to have it!" And GPU prices are very high today; I bought my RX570 some months ago, and I hope it will last at least five years or more. To me it was expensive enough already.

 

About AMD responding: they already hinted (through their driver guy) on Twitter that ray tracing is coming for their gaming GPUs as well. In what form, I don't know; let's wait.

 

btw

 


Ray tracing just means that if you have a light source, all the possible rays of light from that source are traced to their destination; if there are two lights, then double the number of rays. They make it sound as though the technique generates shadows, rather than a shadow being the by-product of a ray of light not getting there.

Basically the same way Doom 3 and The Dark Mod work, where the world is dark and is lit by lights: shadows happen in those places the light doesn't reach.

 

Reflections are just created by an object bouncing back a ray of light from the light source. As for water, the water would have a refraction index, so the ray of light bends when coming into contact with the surface.
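To make the description above concrete, here is a minimal sketch; the one-sphere scene and all the numbers are made up for illustration. A point is in shadow when the ray from it toward the light is blocked, and refraction is just Snell's law bending the ray at the surface:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """True if the ray origin + t*direction (t > 0) hits the sphere.
    Assumes `direction` is normalised (so the quadratic's a == 1)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    return disc >= 0 and (-b + math.sqrt(disc)) > 0

def in_shadow(point, light, blocker_center, blocker_radius):
    """A shadow is a by-product: the ray toward the light is blocked."""
    d = [light[i] - point[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in d))
    d = [x / length for x in d]
    return ray_hits_sphere(point, d, blocker_center, blocker_radius)

def refract_angle(theta_in, n1, n2):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2).
    (No handling of total internal reflection in this sketch.)"""
    return math.asin(n1 * math.sin(theta_in) / n2)

# A sphere at the origin blocks the light for a point directly behind it:
print(in_shadow((0, 0, -5), (0, 0, 5), (0, 0, 0), 1.0))   # True
print(in_shadow((5, 0, -5), (5, 0, 5), (0, 0, 0), 1.0))   # False, off to the side
# Light entering water (n ~ 1.33) at 45 degrees bends toward the normal:
print(math.degrees(refract_angle(math.radians(45), 1.0, 1.33)))  # ~32 degrees
```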

Edited by stumpy

Meh, all I want to see is if 1x RTX 2070 is as fast as a 1080Ti... in current games.

I think it's impossible it could be faster.

And I hope there will be no problems (performance/compatibility) with now-old D3D9 games.

Edited by lowenz

Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.


Meh, all I want to see is if 1x RTX 2070 is as fast as a 1080Ti... in current games.

 

It has been suggested that RTX 2070 will be 8% faster than 1080 (regular). Definitely not as fast as a 1080Ti (as long as we're ignoring the raytracing stuff).

 

The prices are too damn high anyway. The obvious move is to wait a few months to see if AMD releases anything and if Nvidia slashes the prices. Until then it's expensive raytracing eye candy for early adopters.

 

Edit: Another thing to note is that it's a 12nm-node GPU, the stopgap node between 14nm and 7nm. The GeForce 10 series was 16nm. Both of these are going to be lackluster compared to 7nm (just looking at the numbers, 7 is about 42% less than 12, which in turn is only 25% less than 16... and yes, I know the node numbers are partly marketing fiction).
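For what it's worth, a quick back-of-the-envelope script reproduces those percentages, plus the idealised area (transistor density) gain you'd get if you took the node names at face value:

```python
# Node "names" taken at face value; real feature sizes don't match them.
def shrink(old_nm, new_nm):
    linear = 1 - new_nm / old_nm          # linear feature-size reduction
    area = 1 - (new_nm / old_nm) ** 2     # idealised area/density gain
    return round(linear * 100), round(area * 100)

print(shrink(16, 12))  # (25, 44): 16nm -> 12nm, the Turing stopgap
print(shrink(12, 7))   # (42, 66): 12nm -> 7nm
```

Since die area ideally scales with the square of the linear shrink, the jump to 7nm looks even bigger in density terms than the raw "42% vs 25%" comparison suggests.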

Edited by jaxa

They took a big risk dedicating all that chip space to ray tracing tech, but I think it was a "Good Idea"™.

The current Xbox One / PS4 console cycle is gonna be much longer than the previous ones due to the extension products (Xbox One X, PS4 Pro), so games are gonna be stuck targeting the original Xbox One specification.

Since the bulk of the visuals are in geometry and textures, and those will be stuck at a fixed quality level, the one remaining visual option that has a big impact on "the look of a game" is lighting.

Normally, getting game companies to make a new lighting model for the PC version is like pulling teeth, but ray tracing actually simplifies things and can make that part of the renderer require less special-casing and code.

Nvidia can then leverage their mindshare in ray tracing to maybe get more console GPU contracts beyond the Nintendo Switch, or at least offer something game-changing to Nintendo for the next iteration.

Ray tracing has been the end goal of GPU manufacturers, and lithography is hitting the limit of how quickly things can shrink, so getting dedicated ray tracing tech into the GPU now sets Nvidia up for moving the bulk of the GPU die over to it in the next two or three shrinks before things hit a dead end. If other GPU vendors do not follow suit, they will need to radically redesign their GPUs on the last of the shrinks.

Nvidia can take the PR hit of having only a modest bump in raster performance now, and then, as the industry migrates to ray tracing, they will already be in an advantageous spot, whilst AMD might have to explain to its clients why their new competing ray tracing GPU performs worse at raster than the previous-gen products.

TL;DR: Nvidia seems to have taken stock of the current state of the gaming industry and has chosen a pretty good time to take a chance on this tech.


Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


I'm kinda reluctant about the idea of adopting their RT technology until it's cross-vendor at least (and ideally OpenGL-compatible).

Not to mention the 20XX cards' user share.

Vulkan RT is on the way! (Do not expect it in OpenGL)

 

DX12 already supports Ray Tracing.

Edited by lowenz



They took a big risk dedicating all that chip space to Ray Tracing tech but I think it was a "Good Idea" tm [...]

Nbohr, how can you always speak for me too? :D



The current Xbox One \ PS4 console cycle is gonna be much longer than the previous ones due to the extension products ( Xbox One X, PS4 Pro) so games are gonna be stuck targeting the original Xbox One specification.

 

 

Not as long as you might think. The mid-cycle "Pro" consoles basically put 4K resolution on the table, but won't extend the console cycle much past 7 years. Xbox One and PS4 came out in November 2013. Xbox Two "Scarlett" has been said to have a 2020 release date, and now a report from yesterday claims that Sony will try to sneak ahead with a December 2019 release for the PS5.

 

Thinking about it some more, now that these consoles use customized x86 CPUs and GPUs from AMD, you could see the console makers becoming even more aggressive with refreshes. Why not drop in a slightly improved chip every 2 years? All new gains can go towards lowering power consumption and increasing frame rates at a given resolution (especially VR).

 

At least, I expect the new console APUs to ditch Jaguar and use underclocked Ryzen. Probably keeping the core count at 8 but doubling the thread count to 16. Remember, the 7nm "Zen 2" Ryzen chips might be doubling the max core counts to 16, so octo-core would become the new "dual core" of the lineup, the ground level.

 

Nvidia can then leverage their mindshare in Ray Tracing to maybe get more console GPU contracts beyond Nintendo Switch, or at least offer something game-changing to Nintendo for the next iteration.

 

 

https://en.wikipedia.org/wiki/Tegra#Tegra_X1

 

Look at the Tegra X1, Tegra X2, Xavier, and Orin (no details). Tegra X1 was used in Nintendo Switch. Tegra X2 would only offer modest improvements and will look pretty old soon. Xavier increases TDP to 20-30 W and is oriented towards automotive machine learning uses. Also note that core counts have been changed around. The Switch can use 4 cores at one time and swap out for the 4 lower-powered cores at the discretion of the SoC. The Tegra X2 has a 2+4 configuration, and Xavier has 8 identical cores.

 

My point is that it's not clear which SoC you would drop into a new version of the Nintendo Switch. And you would want to see something newer get used, since quadrupling the GPU performance *could* allow for 4K gaming in docked mode, and possibly enable a VR headset mode, which Nintendo seems to be testing already. Obviously, the Switch's current 720p screen would have to be upgraded, to at least 1080p and maybe even 1440p. A 90 Hz refresh rate would be a good minimum, but if you look at the Oculus Go, it is only targeting 60-72 Hz at 1440p. So maybe if the GPU performance of the Switch is quadrupled, you could have 4K or pseudo-4K in docked mode, and 1080p or 1440p at 75 Hz undocked.

 

Nintendo Switch only came out in March 2017, so you could see them targeting a late 2020 release for a 4K/VR refresh to coincide with the Xbox release. The chip used could be Nvidia Orin, except it would probably be underclocked and maybe the die size would need to be cut down from the chips being used in automobiles. I'm also not sure about "RT cores" since the automotive platform probably doesn't need them and Nintendo might not be able to ride the raytracing wave within ~2 years, especially on something relatively underpowered like the Switch.

 

I came up with all this junk on my own, so I'd like to hear your opinions on this.

 

Ray Tracing has been the end goal of GPU manufacturers and lithography is hitting the limit of where things can shrink quickly so getting dedicated Ray Tracing tech into the GPU now sets Nvidia up for moving the bulk of the GPU die to be dedicated to that in the next two or three shrinks before things hit a dead end. If other GPU vendors do not follow suit, they will need to radically redesign their GPU on the last of the shrinks.

 

 

1. I don't think it's completely clear that the RT cores are "dedicated". It's just not clear what they are. Maybe the shader cores can be switched to raytracing and back as needed. Here is how Anandtech's article described them:

 

New to the Turing architecture is what NVIDIA is calling an RT core, the underpinnings of which we aren’t fully informed on at this time, but serve as dedicated ray tracing processors.

 

Does "serve as dedicated ray tracing processors" really mean that "they are dedicated ray tracing processors"? Only Nvidia knows the answer.
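Whatever the RT cores physically turn out to be, the workload they accelerate is well understood: traversing a bounding-volume hierarchy means running enormous numbers of ray-vs-axis-aligned-box "slab tests" per frame. Here is a pure-Python sketch of that one operation (the hardware presumably does this in fixed function; the boxes and rays below are made-up examples):

```python
def ray_hits_box(origin, inv_dir, box_min, box_max):
    """Slab test: the ray hits the axis-aligned box iff the parameter
    intervals where it lies between each pair of slabs all overlap.
    `inv_dir` is 1/direction per axis (precomputed, as GPUs do)."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# Ray from the origin along +x (1e30 stands in for 1/0 on the other axes),
# tested against a unit-ish box straddling the x axis:
print(ray_hits_box((0, 0, 0), (1.0, 1e30, 1e30), (2, -1, -1), (3, 1, 1)))  # True
print(ray_hits_box((0, 5, 0), (1.0, 1e30, 1e30), (2, -1, -1), (3, 1, 1)))  # False
```

Whether this runs on separate silicon or on repurposed shader cores, doing it billions of times per second is the hard part that "RT cores" exist to solve.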

 

2. The next 3 big shrinks will likely be 7nm, 5nm, and 3nm, although various roadmaps, particularly Samsung's, have had interim nodes on them such as 6nm or 4nm. At this point it's pretty clear that 5nm can be done, and the same can probably be said for 3nm. So that's 3 big shrinks. From there on, there is promising news about even smaller nodes such as 2.5nm and 1.5nm. Keep in mind that many of the feature sizes, such as gate length, are nowhere near actually matching those numbers, so the name of the node is basically marketing in most respects. But each one will be an improvement nonetheless.

 

What we need to start asking is whether we will see 3D chips by the "end" of Moore's law scaling. Basically, the CPU equivalent of 3D NAND, with many layers of transistors, and 1 or more cores per layer. These will have massive heat dissipation challenges, but the industry has about 10 more years to figure it out. The ability to move data to adjacent cores that are directly up or down could have both benefits and drawbacks.

 

Nvidia can take the PR hit of having only a modest bump in Raster performance now and then as the industry migrates to Ray Tracing they will already be in an advantageous spot whilst AMD might have to explain to its clients why their new competing Ray Tracing GPU performs worse at Raster than the previous gen products.

 

 

There is no PR hit. The 7nm node is not quite ready for prime time, so they are going with the 12nm node, like AMD has for the new Ryzen and Threadripper CPUs. There's also no incentive for them to push forward with 7nm GPUs, since AMD is not offering any real GPU competition. There may be a PR hit from the high prices, but they are "good" prices because 1) some people will be willing to pay them, 2) you might get as much bang for your buck as with the GeForce 10 series, except you'll have to pay twice the buck for twice the bang, 3) they keep demand low so that Nvidia can continue to sell off the remaining GeForce 10 series GPUs, and 4) they will just lower the prices to sane levels once AMD launches some competing GPUs. The numbers reflect that Nvidia is unopposed at the high end. And even if the emphasis on ray tracing is marketing PR to sell these half-node GPUs, I didn't really expect any games to come with support for real-time ray tracing at launch, and they have themselves a short list.

Edited by jaxa

Opinions are rather skeptical after the demos, with Tomb Raider performance regularly dipping below 60 fps on a 2080 Ti at 1080p (!). IMO it will be just like the early days of PhysX, where cards could support the technology but were too slow to render everything in a scene combined.


  • 4 weeks later...

https://hexus.net/tech/reviews/graphics/122045-nvidia-turing-architecture-examined-and-explained/

 

 

 

 

Apparently the ray tracing doesn't work in Windows yet: Microsoft haven't shipped the update for you to see it or use it, and they'll only be updating Windows 10 to support it. This is for the developer side of games and 3D scenes too, so if something like Blender gets updated to use the RTX cards, it will have to wait on Microsoft adding the support.

Edited by stumpy

  • 1 year later...

Real-time ray-tracing is actually pretty cool, in practice.

I played "Control" with an Nvidia RTX 2070 and I found it pretty incredible. It felt like the graphical feature I didn't know I was missing, lol. In one part of the game early on, I was sneaking around a dimly-lit (after hours) office building. As I approached an office enclosed in glass walls, a perfect reflection of 'me' realistically appeared in the glass and totally startled me.

For a game like Thief, where lighting and environments play a big role... polished tiles, polished oak wood mansion set pieces, marble, rain-soaked stones, windows, etc. reflecting torchlight, fireplaces, characters, etc., in a realistic way, it could be an amazing thing; adding another layer/element to the environment and gameplay. My own reflection startling me?? That's a crazy potential gameplay element. One that may not work forever if you get used to it... but combined with other character reflections and moving set pieces being realistically reflected, you never know. Maybe you would never get used to it since it's so dynamic and environment-dependent. 

That's how I felt, anyways. I saw a ton of potential. You might have to experience real-time ray-tracing firsthand to fully understand/appreciate, though.

PS: If it were ever to be in Thief or TDM, I'm in the camp that says we should not see Garrett's face. Sooo, his face would be hidden in the shadow of his hood in any mirror, window, or golden chalice he might reflect in.

Caveat: I don't have a PS4 or Xbox One... so I've been a bit out of the loop, graphically speaking, up until recently. So maybe it's not that great a leap for gamers who have been experiencing awesome graphics for the past 5+ years.

Edited by Darkness_Falls
