The Dark Mod Forums

GeForce & Radeon cards over the last 5 years: a comparison


Bikerdude


There's only so much performance you can squeeze out on the same lithography node, and we've been stuck at 28nm for a while now, as I recall. I think Apple just grabbed exclusive rights to 14nm at TSMC for a year, which stalls things further. Next GPUs down to 22nm?

They're trying wacky new stuff like stacking chips now because they can't just rely on process shrinking. That said, they could still improve their overdraw-reduction tech, and it's well overdue for them to do it.
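For a sense of why a stalled node hurts: transistor budget ideally scales with the inverse square of the feature size, so a full shrink roughly doubles density. A back-of-the-envelope sketch (idealized scaling; real processes, and marketing node names below ~28nm, fall short of this):

```python
# Idealized transistor-density gain from a node shrink: density scales
# with the inverse square of the feature size. Real processes only
# approximate this; treat the numbers as upper bounds.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal density multiplier when moving from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

for new_nm in (20, 14, 10):
    print(f"28nm -> {new_nm}nm: ~{density_gain(28, new_nm):.1f}x density")
# 28nm -> 20nm: ~2.0x, 28nm -> 14nm: ~4.0x, 28nm -> 10nm: ~7.8x
```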

Please visit TDM's IndieDB site and help promote the mod:

 

http://www.indiedb.com/mods/the-dark-mod

 

(Yeah, shameless promotion... but traffic is traffic folks...)


I agree with nbohr. I think you have to give them credit for squeezing so many improvements out of 28nm.

 

http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review

 

At the risk of sounding like a broken record, the biggest story in the GPU industry over the last year has been over what isn’t as opposed to what is. What isn’t happening is that after nearly 3 years of the leading edge manufacturing node for GPUs at TSMC being their 28nm process, it isn’t being replaced any time soon. As of this fall TSMC has 20nm up and running, but only for SoC-class devices such as Qualcomm Snapdragons and Apple’s A8. Consequently if you’re making something big and powerful like a GPU, all signs point to an unprecedented 4th year of 28nm being the leading node.

With 28nm however that 2 year cadence has stalled, and this has driven GPU manufacturers into an interesting and really unprecedented corner. They can't merely rest on their laurels for the 4 years between 28nm and the next node - their continuing existence means having new products every cycle - so they instead must find new ways to develop new products. They must iterate on their designs and technology so that now more than ever it's their designs driving progress and not improvements in manufacturing technology.

At the start of this year we saw the first half of the Maxwell architecture in the form of the GeForce GTX 750 and GTX 750 Ti. Based on the first generation Maxwell GM107 GPU, NVIDIA did something we still can hardly believe and managed to pull off a trifecta of improvements over Kepler. GTX 750 Ti was significantly faster than its predecessor, it was denser than its predecessor (though larger overall), and perhaps most importantly consumed less power than its predecessor. In GM107 NVIDIA was able to significantly improve their performance and reduce their power consumption at the same time, all on the same 28nm manufacturing node we've come to know since 2012. For NVIDIA this was a major accomplishment, and to this day competitor AMD doesn't have a real answer to GM107's energy efficiency.

At the very high end the GTX 980 will be unrivaled. It is roughly 10% faster than GTX 780 Ti and consumes almost 1/3rd less power for that performance. This is enough to keep the single-GPU performance crown solidly in NVIDIA’s hands, maintaining a 10-20% lead over AMD’s flagship Radeon R9 290X.
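Putting the review's two numbers together (my arithmetic, not AnandTech's): roughly 10% more performance at about two-thirds the power works out to around a 1.65x gain in performance per watt, which is the real headline for a same-node part.

```python
# Back-of-the-envelope perf-per-watt gain, GTX 980 vs GTX 780 Ti,
# using the quoted "roughly 10% faster" and "almost 1/3rd less power".
perf_ratio = 1.10          # relative performance
power_ratio = 1 - 1 / 3    # ~2/3 the power draw
print(f"perf/W gain: ~{perf_ratio / power_ratio:.2f}x")  # ~1.65x
```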

Bikerdude is also right. It's better to wait and skip generations, especially now that we're in between early-adopter 4K equipment (a portion of which can't do 60 Hz) and ubiquitous, cheap 4K.

 

http://www.anandtech.com/show/8585/nvidia-geforce-gtx-980m-970m-mobile-maxwell-gm204

http://www.theregister.co.uk/2014/10/07/nvidia_gtx_970m_and_980m/

 

Like some of the earlier GeForce 800M series, the new GeForce GTX 970M and GeForce GTX 980M cards are based on Nvidia's Maxwell architecture. But while Maxwell was reserved for the lower end of the 800M series, Nvidia says the 900M series chips can beat all comers, offering fully 80 per cent of the performance of the equivalent desktop GPUs.

 

http://www.theregister.co.uk/2014/08/12/nvidia_claims_first_64bit_armv8_soc_for_androids/

 

http://www.enterprisetech.com/2014/03/25/future-nvidia-pascal-gpus-pack-3d-memory-homegrown-interconnect/

 

The Pascal GPU will sport 3D stacked memory, just as Volta was expected to, but it also adds NVLink, a high-speed interconnect for linking CPUs and GPUs that was already under development and that the company says it can pull into its GPUs earlier than planned.

 

No matter what happens, they will manage to scale down to at least 10nm eventually. GPU performance scaling has been way better than CPU scaling, since graphics and CUDA tasks are massively parallel. And even today's shittiest GPUs are basically alien technology compared to what $300 bought you ten or more years ago. You are getting the pinnacle of "nanoscale" manufacturing technology for commodity prices (although in the case of NVIDIA discrete GPUs, you're only getting the sub-pinnacle of TSMC, #IntelNumber1). Make a modest budget for the tech products you want and enjoy it.
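The "massively parallel" point is basically Amdahl's law: speedup is capped by the serial fraction of the workload, and graphics has almost none. A minimal sketch with illustrative (made-up) serial fractions:

```python
# Amdahl's law: speedup = 1 / (serial + (1 - serial) / n_units).
# Graphics/CUDA work is almost entirely parallel, so adding shader
# cores keeps paying off; typical CPU code is not, so extra cores
# quickly stop helping. The serial fractions below are illustrative.

def amdahl(serial: float, n_units: int) -> float:
    return 1 / (serial + (1 - serial) / n_units)

print(f"CPU-ish task, 20% serial, 8 cores:     ~{amdahl(0.20, 8):.1f}x")
print(f"GPU-ish task, 0.1% serial, 2048 cores: ~{amdahl(0.001, 2048):.0f}x")
# ~3.3x vs ~672x
```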


  • Bikerdude is also right. It's better to wait and skip generations, especially now that we're in between early-adopter 4K equipment (a portion of which can't do 60 Hz) and ubiquitous, cheap 4K.

  • I'm not perfect, but I have always managed to sell my old card before upgrading. The last five or so have gone to my best mate: he gets a cost-effective upgrade in VGC with the original box and a portion of the warranty left to run, and I get to use that money to buy the next upgrade. Every two years, give or take, it has only cost me around £100 to upgrade (GTX 480, HD 5870, GTX 670, GTX 970). He traded the 480 and some cash for the 670, and I sold the 5870 to SirTaff when I built a custom PC for him.

I was in Currys for a laugh today and I was genuinely surprised by how cheap TVs have become: we can buy a respectable 40/42" LED 1080p set for £300 and a 4K model for £600. The latest and greatest 60" curved 4K TVs cost £1300+, but they aren't worth the cost when most of what we watch over here is still in SD or 720p. Man, it was so tempting to consider upgrading my trusty old 36" flat-screen Toshiba CRT, but then even a £300 LED HD TV can't match its black levels or faithfully reproduce SD content.


  • I was in Currys for a laugh today and I was genuinely surprised by how cheap TVs have become: we can buy a respectable 40/42" LED 1080p set for £300 and a 4K model for £600.
  • The latest and greatest 60" curved 4K TVs cost £1300+, but they aren't worth the cost when most of what we watch over here is still in SD or 720p. Man, it was so tempting to consider upgrading my trusty old 36" flat-screen Toshiba CRT, but then even a £300 LED HD TV can't match its black levels or faithfully reproduce SD content.
  • During U.S. Black Friday, you could get a 50" LED 1080p for under $600.
  • I don't think anyone has proven that curved is better (or much better). The industry would sure like people to think so, though. Some info.

    With H.265 arriving, you could be more likely to watch 720p or 1080p. For example, 45 minutes of 720p in H.265 might be around 200 MB - very small (see the quick bitrate check below this list).

  • As a laptop TDM player, one day I hope to experience true black.
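That 200 MB estimate implies an average bitrate of roughly 600 kbit/s, which is in the right ballpark for 720p H.265. A quick check of the arithmetic:

```python
# Sanity-check "45 minutes of 720p in H.265 might be around 200 MB".
size_bits = 200 * 8 * 10**6   # 200 MB in bits (decimal megabytes)
duration_s = 45 * 60          # 45 minutes in seconds
print(f"implied bitrate: ~{size_bits / duration_s / 1000:.0f} kbit/s")
# implied bitrate: ~593 kbit/s
```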

As a laptop TDM player, one day I hope to experience true black.

Well, LG have a 50" OLED unit out, but it's very expensive @ £2500. The con is it's only 1080p, not 4K, and you'd best take out an extended warranty if it doesn't already come with one, because OLEDs will burn out real fast if you use the TV a lot.


I tend to skip generations with GeForce cards, having skipped the older 4 and 6 series entirely. I believe I went: Pure3D (Voodoo) > Riva 128 > TNT > GeForce2 GTS > GF3 > GF 5700 Ultra > GF 8800 GTX > GF 660 (OC) > GF 780 (OC)

Now I tend to buy whichever 2-year old card I can afford at the time.

Since my motherboard/CPU/RAM is approaching 5 years old and runs DDR2-800, I am certainly not seeing the performance that my current 780 is capable of.
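For a sense of that mismatch: peak theoretical bandwidth is transfer rate times bus width, which puts dual-channel DDR2-800 around 12.8 GB/s against roughly 288 GB/s for a GTX 780's GDDR5. A rough calc (peak theoretical numbers; real throughput is lower):

```python
# Peak theoretical memory bandwidth: transfer rate x bus width in bytes.
def bandwidth_gbs(mt_per_s: float, bus_bits: int) -> float:
    return mt_per_s * (bus_bits / 8) / 1000  # MT/s * bytes -> GB/s

print(f"DDR2-800, dual channel (2x64-bit):  ~{bandwidth_gbs(800, 128):.1f} GB/s")
print(f"GTX 780 GDDR5 (6008 MT/s, 384-bit): ~{bandwidth_gbs(6008, 384):.0f} GB/s")
# ~12.8 GB/s vs ~288 GB/s
```

The 780 isn't starved by system RAM for on-GPU work, of course, but an old platform does throttle how fast the CPU can feed it draw calls and data.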

 

moving the rest of this post to a new thread...

 


System: Mageia Linux Cauldron, aka Mageia 8


In the old days, before the consoles took the thunder from the PC, I updated GPUs year after year. After that, I stayed with my CrossFire AMD HD 5770 setup until yesterday, when I bought a single AMD R9 270X 2GB. The jump in performance in 3DMark (the latest one, using the Sky Diver demo) went from 9938 with the CrossFire system to more than 20000 with the single GPU, and since this mobo is PCI-E 2.0 while the R9 270X supports PCI-E 3.0, it could be even faster. So the jump in performance was fair to me, and I even save on electricity. But this was five generations later, and the HD 5770 was not a high-end GPU to begin with, so waiting for 3rd-or-later-generation cards if you have a mid-range GPU is the best course of action, IMO.
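For the record, that score jump works out to about 2x:

```python
# Speedup implied by the 3DMark Sky Diver scores quoted above.
old_score, new_score = 9938, 20000   # CrossFire HD 5770s vs single R9 270X
ratio = new_score / old_score
print(f"~{ratio:.2f}x the score (~{(ratio - 1) * 100:.0f}% faster)")
# ~2.01x the score (~101% faster)
```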


Do you have similar graphs for the AMD video cards? I imagine the lines will be fairly similar, though I am curious about the power increases through my 3870, 5850, and 7870.

Intel Sandy Bridge i7 2600K @ 3.4 GHz stock clocks
8 GB Kingston 1600 MHz CL8 XMP RAM at stock frequency
Sapphire Radeon HD 7870 2GB FLeX GHz Edition @ stock @ 1920x1080


