
Nvidia GeForce RTX 20 Series and Real-Time Ray Tracing

nvidia raytracing turing 2080 2070 2080 ti gpu machine learning tensor cores ray tracing

22 replies to this topic

#1 jaxa

    Advanced Member

  • Member
  • 1352 posts

Posted 20 August 2018 - 05:34 PM

Nvidia has announced the upcoming launch of its RTX 2080 Ti ($1000-$1200, September 20), RTX 2080 ($700-$800, September 20), and RTX 2070 ($500-$600, October) GPUs. You didn't read that wrong: the 'R' is for ray tracing.

 

The key feature they are touting is real-time ray tracing using "dedicated" ray tracing (RT) cores. The tensor cores for machine learning are also used to help ray tracing by denoising. Here is an example of how that can work:
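(The embedded clip hasn't survived, but the underlying idea can be sketched in a few lines: render with too few rays per pixel, then filter the noise out afterwards. The toy Python below uses a plain 3x3 box blur where Turing would use a learned denoiser on the tensor cores; the scene and all constants are made up for illustration.)

```python
import random

random.seed(0)

def shade(x, y):
    # "Ground truth" brightness for a pixel (made-up scene: a smooth gradient).
    return (x + y) / 18.0

def noisy_render(w, h, samples):
    # Monte Carlo shading: average a handful of noisy samples per pixel.
    img = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = sum(shade(x, y) + random.gauss(0, 0.2) for _ in range(samples))
            img[y][x] = acc / samples
    return img

def denoise(img):
    # 3x3 box filter: trades a little detail for much less noise. This
    # stands in for the learned denoiser that would run on tensor cores.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def error(img):
    # Total absolute deviation from the true shading.
    return sum(abs(img[y][x] - shade(x, y))
               for y in range(len(img)) for x in range(len(img[0])))

noisy = noisy_render(10, 10, 2)     # only 2 samples per pixel: fast but grainy
clean = denoise(noisy)
print(error(clean) < error(noisy))  # filtering recovers most of the lost quality
```

The point of the trick is the last line: you pay for a cheap, grainy render, and the denoiser buys back image quality that would otherwise cost many more rays.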

 

 

Nvidia's keynote presentation at Gamescom 2018 included a demo of the Eidos Montreal game Shadow of the Tomb Raider using the real-time ray tracing technique, highlighting improvements made to shadows.

 

Earlier in the year, Microsoft announced a DirectX Raytracing API. Similar improvements are being made to Vulkan.

 

There are a lot of questions raised here. How and when will AMD respond, for example? But most importantly, could/will a hybrid ray tracing technique ever be applied to TDM?


Edited by jaxa, 20 August 2018 - 05:45 PM.

  • nbohr1more and RPGista like this

#2 stumpy

    Advanced Member

  • Member
  • 1869 posts

Posted 20 August 2018 - 07:32 PM

I've got an idea of how the ray tracing on the 2080 cards works, but explaining it isn't easy.


Edited by stumpy, 20 August 2018 - 07:33 PM.


#3 kano

    Member

  • Member
  • 355 posts

Posted 20 August 2018 - 09:46 PM

Quake Wars (Doom 3 engine) has already been run with ray tracing.



#4 stumpy

    Advanced Member

  • Member
  • 1869 posts

Posted 21 August 2018 - 08:07 AM

the RTX 2080s:

For this to work as they say, it would have to be a grid of math co-processors, each dedicated to ray tracing a section of the screen, all working together at the same time, something like the way supercomputers such as the Cray-2 work.

These are likely to be bought up by bitcoin farmers, putting them out of the range of gamers or graphics makers.

Anyway, these are likely to run very hot during use; even with two fans, their optimal running temperature is going to be very high.



#5 HMart

    Advanced Member

  • Member
  • 747 posts

Posted 21 August 2018 - 09:53 AM

IMO the demos were not that impressive, not something that makes me think "I need to have it!" And GPU prices are very high today; I bought my RX570 some months ago and I hope it will last at least five years or more, and to me it was already expensive enough.

About AMD responding: they already hinted (through the driver guy) on Twitter that ray tracing is coming to their gaming GPUs as well. In what form, I don't know; let's wait.

 


  • Bikerdude likes this

#6 lowenz

    Advanced Member

  • Member
  • 1922 posts

Posted 21 August 2018 - 01:00 PM

Quake 2 and path tracing ;)

 

https://github.com/e...ree/pathtracing


Task is not so much to see what no one has yet seen but to think what nobody has yet thought about that which everybody see. - E.S.


#7 stumpy

    Advanced Member

  • Member
  • 1869 posts

Posted 21 August 2018 - 06:24 PM

Ray tracing just means that if you have a light source, all the possible rays of light from that source are traced to their destination; with two lights, double the number of rays. They make it sound as though it generates shadows, rather than shadows being a by-product of a ray of light not getting somewhere.

It's basically the same way Doom 3 and The Dark Mod work, where the world is dark and is lit by lights: shadows happen in those places the light doesn't reach.

The reflections are just created by an object reflecting that ray of light from the light source. As for water, the water would have a refraction amount, so the ray of light bends when coming into contact with the surface.
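Those two ideas, shadow rays and refraction via Snell's law, can be sketched in a few lines of Python (the scene and constants here are invented for illustration, not TDM code):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Nearest positive t with |origin + t*direction - center| = radius,
    # assuming direction is unit length; None if the ray misses.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None

def in_shadow(point, light, spheres):
    # Shadows as a by-product: the point is dark if anything blocks
    # the ray from it to the light before the ray gets there.
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    d = [v / dist for v in to_light]
    return any((t := ray_hits_sphere(point, d, c, r)) is not None and t < dist
               for c, r in spheres)

def refract_angle(theta_i_deg, n1=1.0, n2=1.33):
    # Snell's law: n1*sin(i) = n2*sin(t). Default n2 is roughly water, so
    # the ray bends on contact with the surface; None means total internal
    # reflection.
    s = n1 / n2 * math.sin(math.radians(theta_i_deg))
    return None if abs(s) > 1 else math.degrees(math.asin(s))

light = (0.0, 10.0, 0.0)
blocker = ((0.0, 5.0, 0.0), 1.0)  # a sphere hanging between floor and light
print(in_shadow((0.0, 0.0, 0.0), light, [blocker]))  # True: light blocked
print(in_shadow((5.0, 0.0, 0.0), light, [blocker]))  # False: nothing in the way
print(round(refract_angle(45.0), 1))                 # a 45 degree ray bends to ~32.1
```

Nothing here "draws" a shadow; the dark regions just fall out of which shadow rays fail to reach the light, which is exactly the by-product described above.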


Edited by stumpy, 21 August 2018 - 06:28 PM.


#8 jaxa

    Advanced Member

  • Member
  • 1352 posts

Posted 21 August 2018 - 10:32 PM

Here's some more info, including games that will support it on launch:

 

https://www.anandtec...time-raytracing



#9 Bikerdude

    Mod hero

  • Member
  • 20281 posts

Posted 22 August 2018 - 01:53 AM

Meh, all I want to see is if 1x RTX 2070 is as fast as a 1080Ti... in current games.



#10 lowenz

    Advanced Member

  • Member
  • 1922 posts

Posted 22 August 2018 - 02:17 AM

Meh, all I want to see is if 1x RTX 2070 is as fast as a 1080Ti... in current games.

I think it's impossible that it could be faster.

And I hope there will be no problems (performance/compatibility) with now-old D3D9 games.


Edited by lowenz, 22 August 2018 - 02:17 AM.



#11 jaxa

    Advanced Member

  • Member
  • 1352 posts

Posted 22 August 2018 - 03:18 AM

Meh, all I want to see is if 1x RTX 2070 is as fast as a 1080Ti... in current games.

 

It has been suggested that RTX 2070 will be 8% faster than 1080 (regular). Definitely not as fast as a 1080Ti (as long as we're ignoring the raytracing stuff).

 

The prices are too damn high anyway. The obvious move is to wait a few months to see if AMD releases anything and if Nvidia slashes the prices. Until then it's expensive raytracing eye candy for early adopters.

 

Edit: Another thing to note is that it's a 12nm node GPU, the stopgap node between 14nm and 7nm. GeForce 10 series was 16nm. Both of these are going to be lackluster compared to 7nm (just looking at the numbers, 7 is about 42% less than 12, which is only 25% less than 16... and yes, I know the node numbers are partly marketing fiction).
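The arithmetic in that aside, taking the marketing node names at face value:

```python
def pct_smaller(new, old):
    # How much smaller the new node number is than the old one, in percent.
    return (1 - new / old) * 100

print(round(pct_smaller(7, 12)))   # 12nm -> 7nm: about 42% smaller
print(round(pct_smaller(12, 16)))  # 16nm -> 12nm: 25% smaller
```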


Edited by jaxa, 22 August 2018 - 03:29 AM.


#12 Judith

    Advanced Member

  • Member
  • 1725 posts

Posted 22 August 2018 - 04:12 AM

As usual, first editions of new tech are there to milk enthusiasts and early adopters; it's always better to wait until it becomes more common and prices are lower.



#13 nbohr1more

    Darkmod PR, Wordsmith

  • Development Role
  • 9102 posts

Posted 22 August 2018 - 08:06 AM

They took a big risk dedicating all that chip space to ray tracing tech, but I think it was a "Good Idea" (tm).

The current Xbox One / PS4 console cycle is gonna be much longer than the previous ones due to the extension products (Xbox One X, PS4 Pro), so games are gonna be stuck targeting the original Xbox One specification.

Since the bulk of the visuals are in geometry and textures, and those will be stuck at a fixed quality level, the one remaining visual option that has a big impact on "the look of a game" is lighting.

Normally, getting game companies to make a new lighting model for the PC version is like pulling teeth, but ray tracing actually simplifies things and can make that part of the renderer require less special-casing and code.

Nvidia can then leverage their mindshare in ray tracing to maybe get more console GPU contracts beyond the Nintendo Switch, or at least offer something game-changing to Nintendo for the next iteration.

Ray tracing has been the end goal of GPU manufacturers, and lithography is hitting the limit of how quickly things can shrink, so getting dedicated ray tracing tech into the GPU now sets Nvidia up to move the bulk of the GPU die over to it in the next two or three shrinks before things hit a dead end. If other GPU vendors do not follow suit, they will need to radically redesign their GPUs on the last of the shrinks.

Nvidia can take the PR hit of having only a modest bump in raster performance now, and then, as the industry migrates to ray tracing, they will already be in an advantageous spot, whilst AMD might have to explain to its clients why their new competing ray tracing GPU performs worse at raster than the previous-gen products.

TL;DR: Nvidia seems to have taken stock of the current state of the gaming industry and has chosen a pretty good time to take a chance on this tech.


  • lowenz likes this
Please visit TDM's IndieDB site and help promote the mod:

http://www.indiedb.c...ds/the-dark-mod

(Yeah, shameless promotion... but traffic is traffic folks...)

#14 duzenko

    Advanced Member

  • Active Developer
  • 1809 posts

Posted 22 August 2018 - 10:51 AM

I'm kinda reluctant about adopting their RT technology until it's cross-vendor at least (and ideally OpenGL-compatible).

Not to mention the 20XX cards' user share.


  • Boiler's_hiss likes this

#15 lowenz

    Advanced Member

  • Member
  • 1922 posts

Posted 22 August 2018 - 04:59 PM

I'm kinda reluctant about adopting their RT technology until it's cross-vendor at least (and ideally OpenGL-compatible).

Not to mention the 20XX cards' user share.

Vulkan RT is on the way! (Do not expect it in OpenGL)

 

DX12 already supports Ray Tracing.


Edited by lowenz, 22 August 2018 - 05:00 PM.



#16 lowenz

    Advanced Member

  • Member
  • 1922 posts

Posted 22 August 2018 - 05:01 PM

They took a big risk dedicating all that chip space to ray tracing tech, but I think it was a "Good Idea" (tm).

The current Xbox One / PS4 console cycle is gonna be much longer than the previous ones due to the extension products (Xbox One X, PS4 Pro), so games are gonna be stuck targeting the original Xbox One specification.

Since the bulk of the visuals are in geometry and textures, and those will be stuck at a fixed quality level, the one remaining visual option that has a big impact on "the look of a game" is lighting.

Normally, getting game companies to make a new lighting model for the PC version is like pulling teeth, but ray tracing actually simplifies things and can make that part of the renderer require less special-casing and code.

Nvidia can then leverage their mindshare in ray tracing to maybe get more console GPU contracts beyond the Nintendo Switch, or at least offer something game-changing to Nintendo for the next iteration.

Ray tracing has been the end goal of GPU manufacturers, and lithography is hitting the limit of how quickly things can shrink, so getting dedicated ray tracing tech into the GPU now sets Nvidia up to move the bulk of the GPU die over to it in the next two or three shrinks before things hit a dead end. If other GPU vendors do not follow suit, they will need to radically redesign their GPUs on the last of the shrinks.

Nvidia can take the PR hit of having only a modest bump in raster performance now, and then, as the industry migrates to ray tracing, they will already be in an advantageous spot, whilst AMD might have to explain to its clients why their new competing ray tracing GPU performs worse at raster than the previous-gen products.

TL;DR: Nvidia seems to have taken stock of the current state of the gaming industry and has chosen a pretty good time to take a chance on this tech.

Nbohr, how can you always speak for me too? :D




#17 HMart

    Advanced Member

  • Member
  • 747 posts

Posted 22 August 2018 - 07:53 PM

Nvidia being Nvidia. An RTX logo in a game doesn't mean it has ray tracing, even though the 'R' is in the name, and they "failed" to mention that in the initial presentation.



#18 jaxa

    Advanced Member

  • Member
  • 1352 posts

Posted 22 August 2018 - 08:08 PM

The current Xbox One / PS4 console cycle is gonna be much longer than the previous ones due to the extension products (Xbox One X, PS4 Pro) so games are gonna be stuck targeting the original Xbox One specification.

 

 

Not as long as you might think. The mid-cycle "Pro" consoles basically put 4K resolution on the table, but won't extend the console cycle much past 7 years. Xbox One and PS4 came out in November 2013. Xbox Two "Scarlett" has been said to have a 2020 release date, and now a report from yesterday claims that Sony will try to sneak ahead with a December 2019 release for the PS5.

 

Thinking about it some more, now that these consoles use customized x86 CPUs and GPUs from AMD, you could see the console makers becoming even more aggressive with refreshes. Why not drop a slightly improved chip in every 2 years? All new gains can go towards lowering power consumption and increasing frame rates at a given resolution (especially VR).

 

At least, I expect the new console APUs to ditch Jaguar and use underclocked Ryzen. Probably keeping the core count at 8 but doubling the thread count to 16. Remember, the 7nm "Zen 2" Ryzen chips might be doubling the max core counts to 16, so octo-core would become the new "dual core" of the lineup, the ground level.

 

Nvidia can then leverage their mindshare in Ray Tracing to maybe get more console GPU contracts beyond Nintendo Switch, or at least offer something game-changing to Nintendo for the next iteration.

 

 

https://en.wikipedia.../Tegra#Tegra_X1

 

Look at the Tegra X1, Tegra X2, Xavier, and Orin (no details). Tegra X1 was used in Nintendo Switch. Tegra X2 would only offer modest improvements and will look pretty old soon. Xavier increases TDP to 20-30 W and is oriented towards automotive machine learning uses. Also note that core counts have been changed around. The Switch can use 4 cores at one time and swap out for the 4 lower-powered cores at the discretion of the SoC. The Tegra X2 has a 2+4 configuration, and Xavier has 8 identical cores.

 

My point is that it's not clear which SoC you would drop in to a new version of the Nintendo Switch. And you would want to see something newer get used since quadrupling the GPU performance *could* allow for 4K gaming in docked mode, and possibly enable a VR headset mode, which Nintendo seems to be testing already. Obviously, the Switch's current 720p screen would have to be upgraded, to at least 1080p and maybe even 1440p. A 90 Hz framerate would be a good minimum, but if you look at Oculus Go, it is only targeting 60-72 Hz at 1440p. So maybe if GPU performance of the Switch is quadrupled, you could have 4K or pseudo-4K in docked mode, and 1080p or 1440p at 75 Hz undocked.

 

Nintendo Switch only came out in March 2017, so you could see them targeting a late 2020 release for a 4K/VR refresh to coincide with the Xbox release. The chip used could be Nvidia Orin, except it would probably be underclocked and maybe the die size would need to be cut down from the chips being used in automobiles. I'm also not sure about "RT cores" since the automotive platform probably doesn't need them and Nintendo might not be able to ride the raytracing wave within ~2 years, especially on something relatively underpowered like the Switch.

 

I came up with all this junk on my own, so I'd like to hear your opinions on this.

 

Ray Tracing has been the end goal of GPU manufacturers and lithography is hitting the limit of where things can shrink quickly so getting dedicated Ray Tracing tech into the GPU now sets Nvidia up for moving the bulk of the GPU die to be dedicated to that in the next two or three shrinks before things hit a dead end. If other GPU vendors do not follow suit, they will need to radically redesign their GPU on the last of the shrinks.

 

 

1. I don't think it's completely clear that the RT cores are "dedicated". It's just not clear what they are. Maybe the shader cores can be switched to raytracing and back as needed. Here is how Anandtech's article described them:

 

New to the Turing architecture is what NVIDIA is calling an RT core, the underpinnings of which we aren’t fully informed on at this time, but serve as dedicated ray tracing processors.

 

Does "serve as dedicated ray tracing processors" really mean that "they are dedicated ray tracing processors"? Only Nvidia knows the answer.

 

2. The next 3 big shrinks will likely be 7nm, 5nm, and 3nm, although various roadmaps, particularly Samsung's, have had interim nodes on them, such as 6nm or 4nm. At this point it's pretty clear that 5nm can be done, and the same can probably be said for 3nm. So that's 3 big shrinks. From there on, there is promising news about even smaller nodes such as 2.5nm and 1.5nm. Keep in mind that many of the feature sizes, such as gate length, are nowhere near actually being described by those numbers, so the name of the node is basically marketing in most respects. But it will be an improvement nonetheless.

 

What we need to start asking is whether we will see 3D chips by the "end" of Moore's law scaling. Basically, the CPU equivalent of 3D NAND, with many layers of transistors, and 1 or more cores per layer. These will have massive heat dissipation challenges, but the industry has about 10 more years to figure it out. The ability to move data to adjacent cores that are directly up or down could have both benefits and drawbacks.

 

Nvidia can take the PR hit of having only a modest bump in Raster performance now and then as the industry migrates to Ray Tracing they will already be in an advantageous spot whilst AMD might have to explain to its clients why their new competing Ray Tracing GPU performs worse at Raster than the previous gen products.

 

 

There is no PR hit. The 7nm node is not quite ready for prime time, so they are going with the 12nm node, like AMD has for the new Ryzen and Threadripper CPUs. There's also no incentive for them to push forward with 7nm GPUs since AMD is not offering any real GPU competition. There may be a PR hit from the high prices, but they are "good" prices because 1) some people will be willing to pay them, 2) you might get as much bang for your buck as with the GeForce 10 series, except you'll have to pay twice the buck for twice the bang, 3) they keep demand low so that they can continue to sell off the remaining GeForce 10 series GPUs, and 4) they will just lower the prices to sane levels once AMD launches some competing GPUs. The numbers reflect that Nvidia is unopposed at the high end. And even if the emphasis on ray tracing is marketing PR to sell these half-node GPUs, I didn't really expect any games to come with support for real-time ray tracing (at launch), and they have themselves a short list.


Edited by jaxa, 22 August 2018 - 11:17 PM.


#19 Judith

    Advanced Member

  • Member
  • 1725 posts

Posted 23 August 2018 - 12:57 AM

Opinions are rather skeptical after the demos, with Tomb Raider performance regularly dipping below 60 fps on a 2080 Ti at 1080p (!). IMO it will be just like the early days of PhysX, where cards could support the technology but were too slow to render everything in a scene combined.



#20 The Black Arrow

    Member

  • Member
  • 49 posts

Posted 16 September 2018 - 02:01 PM

Raytracing is not even "realistic".



#21 jaxa

    Advanced Member

  • Member
  • 1352 posts

Posted 17 September 2018 - 03:40 PM

Raytracing is not even "realistic".

 

Care to back up your statement?



#22 stumpy

    Advanced Member

  • Member
  • 1869 posts

Posted 18 September 2018 - 08:01 AM

https://hexus.net/te...-and-explained/


Apparently the ray tracing doesn't work in Windows yet; Microsoft haven't shipped the update needed to see it or use it, and they'll only be updating Windows 10 to support it. This is for the developer side of games and 3D scenes, so if something like Blender gets updated to use the RTX cards, it will have to wait on Microsoft adding that support.


Edited by stumpy, 18 September 2018 - 08:08 AM.


#23 lowenz

    Advanced Member

  • Member
  • 1922 posts

Posted 18 September 2018 - 09:06 AM

If you use the DX backend.

Vulkan too is on the RT bandwagon ;)







