The NVIDIA GTX 960 and the future of 1080p

Discussion in 'Off Topic' started by iMacmatician, Jan 22, 2015.


How does 1080p relate to the primary monitor you use for gaming?

  1. I have moved on to higher resolutions!

    36.4%
  2. I am on 1080p but plan to move to a higher resolution in the next 0-2 years.

    18.2%
  3. I am on 1080p but plan to move to a higher resolution in the next 3-4 years.

    18.2%
  4. I am on 1080p and plan to stay at 1080p for a long time.

    36.4%
  5. I am a peasant who uses a resolution lower than 1080p.

    18.2%
  6. I have skipped or plan to skip over 1080p entirely.

    9.1%
  7. I use a multi-monitor setup for gaming.

    18.2%
  8. I use Dynamic/Virtual Super Resolution to run a game at 4K on my 1080p display.

    9.1%
Multiple votes are allowed.
  1. iMacmatician

    iMacmatician Member

    Messages:
    310
    Likes Received:
    0
    Trophy Points:
    0
    In many previous years, new GPU architectures rolled out quickly; AMD often introduced three or four cards in the space of a few months. Lately the situation seems different: since the first half of 2012, we haven't had more than two GPUs from a single architecture released in the same half year. Eleven months after the introduction of the first Maxwell GPU, NVIDIA is finally filling out the midrange with the second-generation Maxwell chip, GM206. The first desktop card using the GM206 is the GTX 960, which, like other x60 cards before it, is aimed at 1080p.

    [IMG]

    Notice anything odd about this picture? What about the "effective memory speed"? As you may remember, NVIDIA's second-generation Maxwell GPUs feature improved color compression over previous NVIDIA GPUs. NVIDIA claims that the improvements deliver the same performance from 75% of the bandwidth, so the GM206's 7.0 Gbps physical memory data rate is equivalent to 7.0 Gbps / 0.75 ≈ 9.3 Gbps on a previous GPU. Sure, these improvements have been described in earlier articles, and even my GTX 980 thread here mentions effective memory bandwidth, but this is the first time I remember seeing a GPU marketing slide quote "effective" clock/data rates. NVIDIA also says that second-generation Maxwell has a 40% per-core performance increase over Kepler, so why not also quote an "effective" core clock of 1127 MHz x 1.4 ≈ 1578 MHz?
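
    To make the arithmetic explicit, here's a quick back-of-the-envelope sketch (mine, not NVIDIA's math; the 128-bit bus width is the GTX 960's published spec, everything else comes from the slide):

    Code:
    # Back-of-the-envelope version of NVIDIA's "effective" numbers (rough sketch).
    physical_rate_gbps = 7.0      # GDDR5 per-pin data rate from the slide
    bus_width_bits     = 128      # GTX 960 memory bus width (published spec)
    compression_factor = 0.75     # claim: same performance from 75% of the bandwidth

    physical_bw_gbs  = physical_rate_gbps * bus_width_bits / 8   # 112 GB/s
    effective_rate   = physical_rate_gbps / compression_factor   # ~9.3 "effective" Gbps
    effective_bw_gbs = physical_bw_gbs / compression_factor      # ~149 "effective" GB/s

    print(f"{physical_bw_gbs:.0f} GB/s physical -> {effective_bw_gbs:.0f} GB/s "
          f"({effective_rate:.1f} Gbps effective)")

    # Applying the same logic to the 40% per-core claim gives the "effective"
    # core clock that NVIDIA doesn't advertise:
    core_clock_mhz = 1127
    perf_per_core  = 1.40
    print(f"{core_clock_mhz * perf_per_core:.0f} MHz effective core clock")  # ~1578 MHz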

    [IMG]

    You can see that the specs of the GTX 960 are almost exactly half of the GTX 980's, so it isn't surprising that the 960 delivers about 56% of the 980's performance at 1080p. Note that this percentage gets closer to 50% as the resolution increases. Compared to other cards, the 960 has a small (10%) average performance advantage over the GTX 760, and it falls short of the GTX 770, R9 285, and R9 280X; the latter two are currently priced similarly to or slightly above the 960. In the TechPowerUp 1080p benchmarks, the 960 beat the 760 in 16 out of 20 games, while it always stayed ahead of the 660. The latter comparison is more relevant to NVIDIA's target audience for this card, since the GTX 660 is almost a year older than the 760. At least you get lower power consumption than any of those parts.
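
    For reference, here's a minimal sketch of what "almost exactly half" means, using the commonly listed reference unit counts (the numbers are the published specs; the comparison script is just mine):

    Code:
    # Reference unit counts for both cards (published specs); rough sketch only.
    gtx_980 = {"CUDA cores": 2048, "TMUs": 128, "ROPs": 64, "bus width (bits)": 256}
    gtx_960 = {"CUDA cores": 1024, "TMUs":  64, "ROPs": 32, "bus width (bits)": 128}

    for spec, value_980 in gtx_980.items():
        ratio = gtx_960[spec] / value_980
        print(f"{spec:>16}: {ratio:.0%} of the GTX 980")

    # Every unit count is exactly 50%, and the reference clocks are nearly the
    # same, which lines up with the ~56% figure at 1080p.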

    [IMG] [IMG]

    If you’re disappointed, note that NVIDIA’s xx6 chip has had more than half the specs of the corresponding xx4 chip in the past two generations. Both the Fermi GF106/GF116 and the Kepler GK106 have 192-bit buses compared to the 256-bit buses of the Fermi GF104/GF114 and the Kepler GK104. Also, the GK106 has more than half (960) the cores of the GK104 (1536), and the highest-end cards based on GF106 and GF116 enjoyed notable clock speed advantages over the highest-end GF104/GF114 cards.

    In addition, let's look at the performance of various single-GPU NVIDIA cards near the $200 (GTX 960) and $550 (GTX 980) price points. The prices are taken from the AnandTech review of the corresponding ~$200 card (as far as that was possible), since that card launched later each time. The performance values are the overall performance percentages at 1920x1080 or 1920x1200 from the TechPowerUp review of the corresponding ~$200 card (where possible).

    Code:
                                        GTX 480         GTX 580         GTX 680         GTX 780         GTX 980
                                  Chip  GF100           GF110           GK104           GK110           GM204
                                 Price  $500            $480            $500            $650            $550

                                        GTX 460 768 MB  GTX 560         GTX 660         GTX 660         GTX 960
                                  Chip  GF104           GF114           GK106           GK106           GM206
                                 Price  $200            $200            $230            $210            $200
    Performance relative to chip above  61%             66%             74%             60%             56%
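
    As a quick way to read that table, here's a small sketch (mine, using only the numbers above) of how much performance per dollar each ~$200 card delivered relative to its contemporary flagship:

    Code:
    # Performance per dollar relative to the flagship, from the table above.
    # Format: (flagship price, ~$200-card price, performance relative to flagship)
    generations = {
        "GTX 460 768 MB vs GTX 480": (500, 200, 0.61),
        "GTX 560 vs GTX 580":        (480, 200, 0.66),
        "GTX 660 vs GTX 680":        (500, 230, 0.74),
        "GTX 660 vs GTX 780":        (650, 210, 0.60),
        "GTX 960 vs GTX 980":        (550, 200, 0.56),
    }

    for pair, (flagship_price, mid_price, rel_perf) in generations.items():
        # (perf / price) of the ~$200 card divided by (perf / price) of the flagship
        value_ratio = rel_perf * flagship_price / mid_price
        print(f"{pair}: {value_ratio:.2f}x the flagship's performance per dollar")

    # The GTX 960 lands around 1.54x, about the same as the GTX 460 768 MB and
    # below both GTX 660 entries.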
    
    The good news is that you'll see a lot of pre-overclocked cards, some of which don't even carry a price premium over the $199 base price, so you might be getting more bang for the buck than you'd otherwise expect. Of course, you can also overclock the card yourself, and some really high core clocks are possible, but the actual performance gains seem to be limited by memory bandwidth (even with the improved color compression).

    Increased resolution gives a growing performance advantage to the AMD cards above, plus the R9 280 (essentially a rebranded HD 7950 Boost), but the same doesn't happen with nearby NVIDIA cards. In NVIDIA's favor, 1080p is likely to remain a common gaming resolution for the next year or two, which means there is still a lot of room for 1080p-focused cards such as the GTX 960.
    Code:
              900p  1080p  1440p  2160p
    GTX 760    90%    91%    91%    90%
    HD 7950    87%    91%    97%   101%
    GTX 960   100%   100%   100%   100%
     R9 285   100%   103%   107%   111%
    R9 280X   110%   115%   123%   129%
    Sources: TechPowerUp, AnandTech - 960, AnandTech - 980, Hardware Canucks, Legit Reviews.
     
    Last edited: Jan 22, 2015
  2. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    People still play on 1080p? What is this, 2005?

    I'm waiting for AMD's answer to Maxwell. If they don't have one, they might as well shut down, considering how fast they're burning money.
     
  3. D.D.D. Destroyer

    D.D.D. Destroyer Member Staff Member Moderator

    Messages:
    9,509
    Likes Received:
    111
    Trophy Points:
    0
    I play in 720p. Hater.
     
  4. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    Do you even fiji?
     
  5. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    Planned for summer and will consume 400 W; that's not the future.
     
  6. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    TDP?

    Meh, it's a desktop. I'll take as much heat as the GPU manufacturers are willing to put out as long as the cooler matches it. I think fiji is rumored to be getting a CLC, so there shouldn't be issues with cooling, albeit with a small hit to reliability.

    So I see no reason to think that AMD is out of the GPU fight.
     
  7. CyberKiller

    CyberKiller Nyooks!

    Messages:
    1,107
    Likes Received:
    8
    Trophy Points:
    0
    Why would I want a 2GB graphics card now? Even 4GB seems kind of midrange.
     
  8. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    If you're only gaming at 1080p or below, like >95% of Steam users polled, then it wouldn't matter as much to you.

    Nvidia's GTX *60 GPU is a perennial sales powerhouse, so I trust that they did their homework. Though I wouldn't be surprised if we get a GTX 965 with a 192-bit bus (and therefore 3GB of VRAM). The gap between the $320+ 970 and the $200 960 is a wide one at the moment.
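
    For reference, the 3GB figure just falls out of the bus width if you assume the usual layout of one 4 Gbit (512 MB) GDDR5 chip per 32-bit channel:

    Code:
    # VRAM implied by bus width, assuming one 4 Gbit (512 MB) GDDR5 chip per
    # 32-bit channel (the typical configuration; the GTX 965 is hypothetical).
    MB_PER_CHIP = 512

    for card, bus_bits in [("GTX 960", 128), ("GTX 965 (hypothetical)", 192), ("GTX 980", 256)]:
        channels = bus_bits // 32
        print(f"{card}: {channels * MB_PER_CHIP // 1024} GB on a {bus_bits}-bit bus")

    # GTX 960: 2 GB, GTX 965 (hypothetical): 3 GB, GTX 980: 4 GB.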
     
  9. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
  10. Lawliet

    Lawliet Member

    Messages:
    869
    Likes Received:
    19
    Trophy Points:
    0
    I heard this is like 15% better than the 760 but I haven't really looked at benchmarks atm. I'm still crying over my 970 vram issue. :(
     
  11. CyberKiller

    CyberKiller Nyooks!

    Messages:
    1,107
    Likes Received:
    8
    Trophy Points:
    0
    Well, I reckon that the VRAM requirements/usage of games are going to increase.
    Same with RAM usage too.
     
  12. Z100000M

    Z100000M Vithered Weteran

    Messages:
    9,120
    Likes Received:
    70
    Trophy Points:
    0
    I still wonder when it's going to be a good idea to upgrade from my 770. It felt like a value purchase, but it seems games aren't really content to stay at console-level power at all. I kinda worry when I see it being recommended (i.e. forget about it looking good) for The Witcher 3.
     
  13. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    You gaming at 1440p or greater? If not, then that 770 is perfect for you.
     
  14. complete_

    complete_ lamer

    Messages:
    6,438
    Likes Received:
    144
    Trophy Points:
    0
    Since this is a resolution topic, how does 1080p fare on 27-28 inch monitors? I've heard conflicting opinions.
     
  15. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    Big pixels r bad, mkay?

    Save the cash and just get a 24" monitor if you're chill with 1080p. Those 1080p 27" monitors are literally just there to trick people into paying too much. There's no advantage aside from bigger pixels. You're supposed to want small pixels, not big ones.
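
    If you want actual numbers, here's the quick pixel density math (standard 16:9 diagonal arithmetic):

    Code:
    # Pixel density (PPI) of a 1920x1080 panel at a few common sizes.
    import math

    def ppi(width_px, height_px, diagonal_inches):
        return math.hypot(width_px, height_px) / diagonal_inches

    for size in (24, 27, 28):
        print(f'{size}" 1080p: {ppi(1920, 1080, size):.0f} PPI')

    # 24" -> ~92 PPI, 27" -> ~82 PPI, 28" -> ~79 PPI.
    # For comparison, a 27" 1440p panel is ~109 PPI.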
     
  16. complete_

    complete_ lamer

    Messages:
    6,438
    Likes Received:
    144
    Trophy Points:
    0
    Already have one. Thinking about the future and using it as both a TV from a distance and a monitor close up.
     
