NVIDIA brings Maxwell to the high end

Discussion in 'Off Topic' started by iMacmatician, Sep 19, 2014.


What type of product do you most want to see GM204 in?

  1. A Steambox for Candles. (http://forums.empiresmod.com/showthread.php?t=18756)

    85.7%
  2. A nice and slim gaming laptop. (http://forums.empiresmod.com/showthread.php?t=18259)

    14.3%
  3. The next computer I decide to build.

    57.1%
  4. All-in-one desktops like the rumored Retina iMac.

    14.3%
  5. Due to the GM204's performance/W, I'm not upset at the lack of 20 nm... at least for now.

    0 vote(s)
    0.0%
  6. I camped out to get an iPhone 6/6+ on launch day because I want a 20 nm GPU now!

    0 vote(s)
    0.0%
Multiple votes are allowed.
  1. iMacmatician

    iMacmatician Member

    Messages:
    310
    Likes Received:
    0
    Trophy Points:
    0
    We saw NVIDIA's high-performance-per-watt Maxwell architecture land at the low end with the 750 and 750 Ti earlier this year. Now NVIDIA has scaled it up with the GM204. While the "4" indicates a normal performance chip as opposed to a "big" compute-oriented chip, its positioning in the GeForce lineup is at the high end (for now), just as with the GK104 over two years ago.

    [IMG]

    Unlike the GK104, which had four times the SMX count of the GK107, the GM204 is not quite four times the GM107 in SMM count: the GM107 has 5 SMMs in its single GPC, while the GM204 has 4 SMMs in each of its 4 GPCs, for 16 total. Due to various improvements in 2nd-generation Maxwell, the GM204 is even more efficient than the GM107, which was already quite a boost over Kepler. And if anyone is wondering: yes, the GM204 is still 28 nm; currently 20 nm is mostly for phone SoCs. Last year I remember discussing the prospect of a 3rd year of 28 nm with ImSpartacus, and here we are, soon to start the 4th (we can safely assume the GM204 won't be retired for at least most of a year).
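
    To put numbers on that scaling, here's a quick sketch (the per-SM core counts are NVIDIA's published figures; the rest is arithmetic):

    Code:
    # Unit-count scaling: Kepler low end/high end vs. Maxwell.
    # 192 CUDA cores per Kepler SMX, 128 per Maxwell SMM (published specs).
    chips = {
        "GK107": (2, 192),   # Kepler low end
        "GK104": (8, 192),   # Kepler, launched at the high end
        "GM107": (5, 128),   # 1st-gen Maxwell (750/750 Ti)
        "GM204": (16, 128),  # 2nd-gen Maxwell (970/980)
    }
    for name, (sms, cores_per_sm) in chips.items():
        print(f"{name}: {sms} SMs, {sms * cores_per_sm} CUDA cores")

    print("GK104/GK107 SM ratio:", 8 / 2)    # 4.0x
    print("GM204/GM107 SM ratio:", 16 / 5)   # 3.2x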

    Notice that the memory bandwidth of the 980 (224 GB/s) is less than that of the 780 Ti and the 780, and equal to that of the 770. That's not bad news in and of itself, since the GFLOPS/bandwidth ratio of GPUs has generally increased over time.

    Code:
    NVIDIA
                       8800 GTX    GTX 280    GTX 480    GTX 680    GTX 980
    Architecture          Tesla      Tesla      Fermi     Kepler    Maxwell
    SP GFLOPS             518        933       1345       3090       4612
    Bandwidth (GB/s)       86.4      141.7      177.4      192.0      224.0
    Ratio                   6.0        6.6        7.6       16.1       20.6
    
    AMD
                        2900 XT       4870       5870       6970       7970       290X
    Architecture           R600       R700  Evergreen       N.I.       S.I.       C.I.
    SP GFLOPS              476       1200       2720       2703       3789       5632
    Bandwidth (GB/s)      128.0      115.2      153.6      176.0      264.0      320.0
    Ratio                   3.7       10.4       17.7       15.4       14.4       17.6
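
    (For anyone who wants to recheck those ratios, it's just cores × 2 FLOPS/clock × clock, divided by bandwidth; the Tesla-era cards additionally counted the extra MUL, hence 3 ops/clock for those two. A minimal sketch with the 980's reference specs:)

    Code:
    # GFLOPS/bandwidth ratio from first principles: GTX 980 reference specs.
    cores = 2048              # CUDA cores
    boost_clock_ghz = 1.126   # reference boost clock
    flops_per_clock = 2       # one FMA = 2 ops per core per clock
    bandwidth_gbs = 224.0     # 256-bit GDDR5 @ 7 Gbps

    gflops = cores * flops_per_clock * boost_clock_ghz
    print(f"SP GFLOPS: {gflops:.0f}")                        # ~4612
    print(f"GFLOPS per GB/s: {gflops / bandwidth_gbs:.1f}")  # ~20.6
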
    NVIDIA made another change with GM204, though: they doubled the number of ROPs, whose job is to form the final pixels that are sent to the display. ROPs use a considerable amount of bandwidth, so it doesn't necessarily make sense to increase their count without an associated memory bandwidth increase. But GDDR5 is maxed out, high-bandwidth memory isn't here yet, and a wider memory interface costs more in both dollars and power than a narrower one. NVIDIA's solution was to improve color compression, which looks for identical pixels or certain patterns among pixels to shrink color data and thus save bandwidth. With 2nd-generation Maxwell, the number of recognized patterns has increased, resulting in more bandwidth savings.

    [IMG]

    Thus while the GTX 770 and GTX 980 both have 224 GB/s of raw bandwidth, the 980's compression gives it an effective average of ~297 GB/s.
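
    As a rough sanity check on that figure, the launch reviews quote roughly a 25% average reduction in bytes transferred from the new compression, and the effective number falls out directly (the exact savings fraction below is my assumption, chosen to match the quoted ~297 GB/s):

    Code:
    # Effective bandwidth from delta color compression (rough sketch).
    # The savings fraction is an assumption chosen to match the quoted figure.
    raw_bandwidth = 224.0   # GB/s, GTX 980 (256-bit GDDR5 @ 7 Gbps)
    savings = 0.245         # ~25% of color traffic eliminated (assumed)
    effective = raw_bandwidth / (1 - savings)
    print(f"Effective bandwidth: ~{effective:.0f} GB/s")  # ~297 GB/s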

    The power consumption of the GTX 980 and 970 is impressive but unsurprising after the GTX 750 and 750 Ti. Under load, the 980 draws about as much power as a 680, and significantly less than any GK110 or Hawaii card. Its performance per watt is beaten only by GM107-based cards.

    [IMG]

    As for performance, the 970 fits right in with the previous generation's top cards, and the 980 has a small but notable lead. Those who didn't find GK110- and Hawaii-based cards a compelling upgrade over what they already had probably won't see GM204 as one either.

    [IMG]

    One place I hope to see more Maxwell GPUs is in systems such as Steamboxes. Supposedly GM204 is coming to mobile with a large performance increase over GK104- and Pitcairn-based mobile GPUs. While mobile GM204 will likely be 75-100+ W, similar to mobile GK104, at least one gets a lot more performance for it. There is reportedly also a GM206 coming soon; if it ends up halfway between the GM107 and GM204, it'll have around GK104 performance for around GK106/Pitcairn power consumption.
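
    Napkin math on that guess (the halfway configuration and the clock are purely my assumptions for illustration):

    Code:
    # Hypothetical GM206 sized halfway between GM107 and GM204 in SMMs.
    # Everything here is speculation for illustration, not a leak.
    gm107_smm, gm204_smm = 5, 16
    gm206_smm = (gm107_smm + gm204_smm) // 2    # ~10 SMMs (assumed)
    cores = gm206_smm * 128                     # 128 CUDA cores per SMM
    gflops = cores * 2 * 1.1                    # ~1.1 GHz clock (assumed)
    print(f"Hypothetical GM206: {gm206_smm} SMMs, {cores} cores, ~{gflops:.0f} SP GFLOPS")
    # That raw figure is a bit under a stock GTX 680's 3090 GFLOPS, but with
    # Maxwell's better per-core utilization it lands around GK104 performance.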

    Sources: AnandTech, TPU.
     
    Last edited: Sep 8, 2015
  2. Lawliet

    Lawliet Member

    Messages:
    869
    Likes Received:
    19
    Trophy Points:
    0
    I may purchase a 970 in the near future considering the power consumption is perfect for either of my rigs.
     
  3. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    I'm glad they can squeeze some more performance out of 28 nm silicon. But really, 4 years on the same process is just silly. I hope we don't hit the same kind of performance plateau as with CPUs. 4K gaming will be awesome; I'm just waiting for a good IPS 4K display.
     
  4. Z100000M

    Z100000M Vithered Weteran

    Messages:
    9,120
    Likes Received:
    70
    Trophy Points:
    0
    Will 4K actually be usable in any form? That's a crapton of pixels to compute, and with the supposed "advancements" in graphics, won't it just lead to 10 fps even with top-tier cards? I'm not feeling it either, because the "LOL 12/10 NEXT GEN, GOTYAY" games are pretty crap. Can't really enjoy movies in it either, as they'll probably stick to 720p and 1080p for like the next 10 years.
     
    Last edited: Sep 19, 2014
  5. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    I was thinking about making this thread. I would've called it "Trickster so mad," because Trickster loves extracting as much efficiency from his GPUs as possible, and the new highest-performance Nvidia GPU is probably its most efficient GPU ever.

    Lel, we already have. We're living in a post-Sandy Bridge world.

    It's all about power consumption now. The same core in your ~4.5 W tablet is also used in your 80+ W desktop. We know that CPU architectures can generally only be designed for efficient use within about an order of magnitude of TDPs, and the range I just described is quite a bit wider than that. Something has to give: either the 50-80 W range or the 4.5-8 W range. Which range do you think Intel cares about more? Fun fact: this is also why today's overclocking is shitty.

    We're already getting ~30 average fps on 4K Crysis 3 @ high. GPU manufacturers kinda saw this coming.

    [IMG]
     
    Last edited: Sep 19, 2014
  6. Lazybum

    Lazybum :D Staff Member Moderator

    Messages:
    4,827
    Likes Received:
    190
    Trophy Points:
    0
    This may sound silly, but I swear I heard somewhere (like 10 years ago or something) that the higher the resolution, the less need for anti-aliasing. Is that a silly statement? I only mention it because, if it's true, then anti-aliasing can be toned down and still look just as good if not better than what you get on a smaller display.

    Oh, that reminds me: isn't FXAA the cheapest anti-aliasing? I wonder what the benchmarks would look like if they used MSAA or even SSAA.
     
  7. Candles

    Candles CAPTAIN CANDLES, DUN DUN DUN, DUN DUN DUN DUN.

    Messages:
    4,251
    Likes Received:
    10
    Trophy Points:
    0
    That's true assuming a constant screen size. Or, rather, it's true for increasing PPI but not necessarily for increasing resolution. Aliasing occurs because pixels are discrete, not continuous. Once you hit a certain pixel density, the difference between a discrete color display and a continuous one becomes blurred, both metaphorically and physically.
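
    Back-of-the-envelope, for anyone curious (the screen sizes here are picked arbitrarily for illustration):

    Code:
    # Pixel density (PPI) for the same resolution at different screen sizes.
    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    for diag in (24, 28, 32):
        print(f'4K (3840x2160) at {diag}": {ppi(3840, 2160, diag):.0f} PPI')
    print(f'1080p at 24": {ppi(1920, 1080, 24):.0f} PPI')
    # The same 4K resolution spans ~184 PPI at 24" down to ~138 PPI at 32",
    # so how much AA you can drop depends on density, not resolution alone.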
     
  8. Z100000M

    Z100000M Vithered Weteran

    Messages:
    9,120
    Likes Received:
    70
    Trophy Points:
    0
    But that's terrible. Crysis only on high? And, from personal experience, it's a pretty nicely optimised game by now. Newer titles will be even more demanding. It seems like it's at least two years before 4K is anywhere near usable.

    I also love how it's called "4K". Gotta love marketing, seeing as the last "new big thing" should've been called "2K".
     
  9. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    4K was inherited from photography & film. TV marketers use UHD, which is consistent with their previous HD marketing.
     
  10. iMacmatician

    iMacmatician Member

    Messages:
    310
    Likes Received:
    0
    Trophy Points:
    0
    Spartacus, I think you'll like this:

    Gigabyte Mini-ITX GTX 970
    • 1076 MHz base, 1216 MHz boost (which are 2% and 3% higher respectively than the reference clocks)
    • 4 GB VRAM
    • 1x 8-pin
    • 12 cm long
    [IMG]

    Source: Guru3D
     
  11. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    Oh, I saw it on tech report and had an instagasm. I want one.
     
  12. Candles

    Candles CAPTAIN CANDLES, DUN DUN DUN, DUN DUN DUN DUN.

    Messages:
    4,251
    Likes Received:
    10
    Trophy Points:
    0
    The real problem with small form factors is that I can't shove a D-14 into one.
     
  13. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    We've been over this...

    It can be done. The last time fooshi asked, I provided pics.

    And where it can't, why the fuck aren't you on a clc anyway? I'd think you people were peasants.
     
    Last edited: Oct 21, 2014
  14. Trickster

    Trickster Retired Developer

    Messages:
    16,576
    Likes Received:
    46
    Trophy Points:
    0
    When are they going to stop wasting time with this efficiency bullshit and actually bring out a card capable of giving good FPS on a 4K monitor? Seriously, what the fuck is this shit. I want a 4K monitor, but I'm not gaming in shit graphics or at anything less than glorious 60 fps.
     
  15. D.D.D. Destroyer

    D.D.D. Destroyer Member Staff Member Moderator

    Messages:
    9,509
    Likes Received:
    111
    Trophy Points:
    0
    I bet they can't, and are just focusing on power saving until someone in the tech department has a breakthrough.
     
  16. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    Never. This architecture is in tablets now. I'm sure Nvidia wouldn't mind going into phones in a few years.
     
  17. w00kie

    w00kie Mustachioed Mexican

    Messages:
    3,863
    Likes Received:
    13
    Trophy Points:
    0
    use two cards blergh blergh blergh.


    (yes, SLI and DualGPU cards are shit)
     
  18. Space_Oddity

    Space_Oddity The Shitstorm

    Messages:
    2,958
    Likes Received:
    43
    Trophy Points:
    0
    People still buy nvidia products?
     
  19. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    I bet you're fun at parties.
     
  20. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    With the 970, Nvidia has the best mainstream card out there, and with dual 980s you get into territory where 4K@60 FPS is possible. Also, G-Sync works fine and only with Nvidia cards, so that's another good reason to go with Nvidia.
     
