Skylake - let's wait and pray for it together

Discussion in 'Off Topic' started by McGyver, Apr 24, 2015.

Poll: Emma!

Poll closed Jul 10, 2015.
  1. [IMG]http://i.imgur.com/8oIQnY1.jpg[/IMG]

    3 vote(s)
    50.0%
  2. [IMG]http://i.imgur.com/OwlKTLF.jpg[/IMG]

    0 vote(s)
    0.0%
  3. [IMG]http://i.imgur.com/mPARTDl.jpg[/IMG]

    1 vote(s)
    16.7%
  4. [IMG]http://i.imgur.com/d92ngxZ.jpg[/IMG]

    2 vote(s)
    33.3%
  1. w00kie

    w00kie Mustachioed Mexican

    Messages:
    3,863
    Likes Received:
    13
    Trophy Points:
    0
    You can't?? I can (but it downsamples).
     
  2. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    Nope, my Q6600 is too weak and my AMD card doesn't support hardware-accelerated decoding on the fly.
     
  3. w00kie

    w00kie Mustachioed Mexican

    Messages:
    3,863
    Likes Received:
    13
    Trophy Points:
    0
    And that's "only" a 20% performance difference (Q6600 vs. Q9550).
    Now think about all the bottlenecks you currently run into.


    EDIT:

    LOL, Intel dropped support for EHCI at boot, and with it USB input devices get fucked during OS installation.

    better have a PS2 keyboard ready :rolleyes:
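
    A minimal sketch of how to check what a board actually exposes before an install (my own illustration, not from this thread): it assumes Linux and parses lspci output, so the string matching is only a loose heuristic.

    [CODE]
    import subprocess

    # List USB host controllers so you know whether the board still has an
    # EHCI (USB 2.0) controller or is xHCI-only. Linux-only sketch; it
    # relies on lspci's textual output, so the matching is a loose heuristic.
    out = subprocess.run(["lspci"], capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        if "USB controller" in line:
            if "xHCI" in line:
                kind = "xHCI"
            elif "EHCI" in line or "Enhanced Host" in line:
                kind = "EHCI"
            else:
                kind = "other"
            print(f"[{kind}] {line}")
    [/CODE]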
     
    Last edited: Aug 6, 2015
  4. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    The whole idea of a benchmark is that you test something that represents a user's use case.

    Skylake does great in Cinebench, but that's not a good benchmark of our use case, so we don't care. Likewise, gaming at 720p isn't part of our use case, so we still don't care.

    Our performance-limited use case is gaming at 1080p to 4K and beyond. That's where we care.

    If it turns out that cpu performance isn't super useful in that area that we care about, then the logical conclusion would be to not spend undue money on the cpu and, instead, spend that money on your gpu or something that actually improves performance in the area that we care about.

    Similarly, even if Skylake's relevant performance becomes meaningful in the presence of speedier ram, it still wouldn't be a good idea, because now you're spending too much on ram as well as on your cpu & brand new mobo. At some point, you look at your build and go, "well shit, I could've lost <5% relevant performance moving to a cheaper cpu, mobo and ram and used the savings to get 10-20+% more relevant performance with a better gpu." This effect only gets magnified as our monitors push out more and more pixels.
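
    A back-of-the-envelope sketch of that trade-off; every price and percentage below is invented for illustration:

    [CODE]
    # Hypothetical budget split: does the money saved on cpu/mobo/ram buy
    # more relevant (gpu-bound) performance than it costs? All numbers invented.

    expensive_build = {"cpu+mobo+ram": 650, "gpu": 350}
    cheap_build     = {"cpu+mobo+ram": 450, "gpu": 550}  # savings moved to the gpu
    assert sum(expensive_build.values()) == sum(cheap_build.values())  # same budget

    cpu_perf_lost   = 0.05  # "<5% relevant performance" lost on the cheaper platform
    gpu_perf_gained = 0.15  # "10-20+%" gained from the beefier gpu

    net = (1 - cpu_perf_lost) * (1 + gpu_perf_gained) - 1
    print(f"same total ${sum(cheap_build.values())}, "
          f"net relevant performance: {net:+.1%}")  # roughly +9%
    [/CODE]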

    The return of not-shit overclocking is nice, but it requires a pricier vrm-packing mobo and much more expensive cooling to deal with the denser 14nm chip. Again, both of those cause the cost to go up and that's money you aren't spending on your gpu.
     
  5. flasche

    flasche Member Staff Member Moderator

    Messages:
    13,299
    Likes Received:
    168
    Trophy Points:
    0
    or you know
    [image]
    costs a few euros.
     
  6. Beerdude26

    Beerdude26 OnThink(){ IsDownYet(); }

    Messages:
    7,243
    Likes Received:
    13
    Trophy Points:
    0
    I never switched. #ps2masterrace
     
  7. iMacmatician

    iMacmatician Member

    Messages:
    310
    Likes Received:
    0
    Trophy Points:
    0
    The Hardware Canucks 1080p gaming benchmarks show a 0%-4% increase over the 4790K.
     
  8. w00kie

    w00kie Mustachioed Mexican

    Messages:
    3,863
    Likes Received:
    13
    Trophy Points:
    0
    Ahem, many people will reuse their 1150/1155/1156 cooling solutions, which were mostly overkill when they bought them. I don't expect cooling problems like with AMD CPUs. Heck, I even reuse my socket 775 cooler.
     
    Last edited: Aug 7, 2015
  9. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    Nobody needs much CPU power for single-player games; even GTA V runs somewhat decently on my seven-year-old CPU. People interested in unlocked CPUs are probably looking into them because they want more multiplayer performance, and it's this use case, from WoW to Battlefield 4, that all the tech blogs avoid like cholera.

    That is why 720p tests are the most important for deciding which processor to get: only in this setting does the GPU have no (or little) influence on the result, which allows extrapolation to all the CPU-limited (multiplayer) games I like.

    You could always test CPUs at 12K display resolution and come to the conclusion that there is no difference between a Q6600 and a 6700K. Or, with some statistical deviation, get a result where the Q6600 is actually 2% faster. That's pretty much what Anandtech did.
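
    A toy frame-time model of why 720p isolates the CPU, a simplification where per-frame CPU cost is resolution-independent, GPU cost scales with pixel count, and the slower side caps the fps; all millisecond figures are made up:

    [CODE]
    # Toy model: the slower of CPU and GPU per-frame cost sets the fps.
    # All millisecond figures are made up for illustration.

    def fps(cpu_ms: float, gpu_ms_per_mpix: float, megapixels: float) -> float:
        frame_ms = max(cpu_ms, gpu_ms_per_mpix * megapixels)
        return 1000.0 / frame_ms

    RES = {"720p": 0.92, "1080p": 2.07, "4K": 8.29}  # megapixels

    for name, mpix in RES.items():
        slow = fps(cpu_ms=12.0, gpu_ms_per_mpix=5.0, megapixels=mpix)
        fast = fps(cpu_ms=8.0,  gpu_ms_per_mpix=5.0, megapixels=mpix)
        print(f"{name:>5}: slow CPU {slow:5.1f} fps, fast CPU {fast:5.1f} fps")
    # 720p shows the full 50% CPU gap (83.3 vs 125.0 fps); at 4K both
    # CPUs land on the same ~24 fps because the GPU is the bottleneck.
    [/CODE]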

    Anyways, I'm probably going to order the same stuff as Wookie did, a Skylake i5 + ASUS motherboard; I'm just not sure whether I should invest in faster RAM, and whether a better sound chip on the board is worth the money.

    The performance jump will be glorious.
     
  10. w00kie

    w00kie Mustachioed Mexican

    Messages:
    3,863
    Likes Received:
    13
    Trophy Points:
    0
    From the DDR4 reviews so far, I hope to overclock that 2133 RAM to 2666 without much hassle. But that's just wishful thinking right now. Parts will be here tomorrow. OC options won't be touched until everything is up and running and I've had my share of stock gametime.
     
  11. flasche

    flasche Member Staff Member Moderator

    Messages:
    13,299
    Likes Received:
    168
    Trophy Points:
    0
    i see ...
     
  12. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    What do you see?
     
  13. w00kie

    w00kie Mustachioed Mexican

    Messages:
    3,863
    Likes Received:
    13
    Trophy Points:
    0
    He's just jealous of your hairy old cock.
     
  14. ImSpartacus

    ImSpartacus nerf spec plz

    Messages:
    8,598
    Likes Received:
    7
    Trophy Points:
    0
    That's a good point. I admit that I didn't look past the linked page of 720p benchmarks. As is to be expected, there's still a small benefit as you double the number of pixels to 1080p. I wish they would've reviewed results at 1440p to see what happens when you double the pixel count using their testing methodology. I'm surprised that they only tested at 1080p and lower in 2015, with 4K adoption steadily ramping up. I haven't read every page, but I can't find an explanation for why they tested like that.

    I've heard that kind of speculation from laymen, but I can't recall reading that from any notable tech reviewer. Where did you hear about it?

    Don't get me wrong, it's plausible, but I've never seen any evidence that suggests that multiplayer games demand a significant cpu load (especially from modern multicore cpus). I tend to be pretty biased in the limited sample of review sites that I frequent, so I might be missing something.

    Furthermore, if multiplayer games are more cpu heavy, then I haven't seen any testing that suggests that low res offline play is a good benchmark for high res online play. It could be, but I can't recall reading about that.

    Based on a snippet from Hardware Canucks, it doesn't look like they meant to approach game testing in a unique way, so I don't think they did low-res offline gaming benchmarks to emulate high-res online gaming. They seem to have pretty much the same conclusion as other sites.

    I feel like if online gaming needed a special benchmarking approach, then these reviewers would at least mention it. It's just too popular compared to offline gaming.
     
    Last edited: Aug 6, 2015
  15. iMacmatician

    iMacmatician Member

    Messages:
    310
    Likes Received:
    0
    Trophy Points:
    0
    That increase probably isn't enough to change your earlier conclusion though.

    I wonder when 5K benchmarks will become commonplace in GPU reviews.
     
  16. flasche

    flasche Member Staff Member Moderator

    Messages:
    13,299
    Likes Received:
    168
    Trophy Points:
    0
  17. McGyver

    McGyver Experimental Pedagogue

    Messages:
    6,533
    Likes Received:
    31
    Trophy Points:
    0
    Let's be honest, the reviewers writing for tech blogs aren't exactly scientists, and they usually don't have much time to get their reviews out. That's the reason you don't hear them talking about online gaming performance: they simply can't/don't want to bench it, since you can't create identical settings over several runs. I'm sure there is a way to create benchmarks from online play by just doing a lot of runs so the volatility cancels out, but it would be a gargantuan task to get results for 10 different CPUs with a deviation of, let's say, ±5%.
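
    For scale, a quick sketch of the "lots of runs" math, assuming run-to-run fps noise is roughly normal; the percentages are hypothetical:

    [CODE]
    import math

    # How many runs of a noisy online benchmark are needed to pin down the
    # mean fps to a given margin? Assumes roughly normal run-to-run noise;
    # the percentages below are hypothetical.

    def runs_needed(rel_std_dev: float, rel_margin: float, z: float = 1.96) -> int:
        """Sample size for a 95% confidence interval of +/- rel_margin."""
        return math.ceil((z * rel_std_dev / rel_margin) ** 2)

    print(runs_needed(0.05, 0.05))  # ~5% noise, +-5% margin: 4 runs per CPU
    print(runs_needed(0.05, 0.01))  # tighten to +-1%: 97 runs per CPU
    [/CODE]

    And since the CPU differences in question are only a few percent, you'd really need the ±1% margin: roughly 97 runs × 10 CPUs, around a thousand benchmark passes of unrepeatable online matches.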

    Anyways, multiplayer usually means fighting a lot of other players at the same time. That means a lot of detailed models representing those players on screen at once, which means a lot of polygons to draw and texture-map, which means a lot of draw calls and memory access, and that's work the CPU does. At least that's the case for Battlefield, World of Warcraft and Guild Wars 2.
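
    A minimal sketch of that scaling, with invented per-draw-call costs just to show the shape:

    [CODE]
    # Made-up illustration: CPU time per frame grows linearly with visible
    # players, because each one adds draw calls the CPU must submit.
    BASE_MS          = 4.0    # world, UI, game logic (hypothetical)
    CALLS_PER_PLAYER = 50     # hypothetical draw calls per player model
    MS_PER_CALL      = 0.002  # hypothetical CPU cost per draw call

    for players in (0, 16, 64):
        cpu_ms = BASE_MS + players * CALLS_PER_PLAYER * MS_PER_CALL
        print(f"{players:2d} players: {cpu_ms:4.1f} ms CPU -> {1000/cpu_ms:5.1f} fps cap")
    [/CODE]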

    Here are some good 1080p benchmarks comparing the Skylake i5 and the Haswell i5; both run at the same clock, so they do show the IPC progress:

    http://pclab.pl/art65154-17.html
     
  18. D.D.D. Destroyer

    D.D.D. Destroyer Member Staff Member Moderator

    Messages:
    9,509
    Likes Received:
    111
    Trophy Points:
    0
    There are games that feature benchmarks within them. ArmA 2 springs to mind.

    Whatever setup I used, I got 9 fps on that night-time benchmark.
     
  19. flasche

    flasche Member Staff Member Moderator

    Messages:
    13,299
    Likes Received:
    168
    Trophy Points:
    0
    in arma 2?
     
  20. D.D.D. Destroyer

    D.D.D. Destroyer Member Staff Member Moderator

    Messages:
    9,509
    Likes Received:
    111
    Trophy Points:
    0
    Yeah, that benchmark was the shit. I don't even know what caused the bottleneck, you couldn't see too much.
     
