Benchmarks released for the new Nvidia Graphics Cards: RTX 2080 and RTX 2080ti: First thoughts?

I was poised to pick up an RTX 2080 as I just got a 1440p 144Hz G-Sync monitor, but I think I'll look for a GTX 1080 once the prices drop!

Price-to-performance-wise, the step up from Pascal to Turing isn't as overwhelming as we all anticipated (I blame AMD's lackluster Vega line for this!)

Will anyone be picking up a Turing card this Christmas?

Cheers!

Comments

  • What are the benchmarks?

  • +1

    Two years for a ~30% increase in raw performance, and no real games with a ray tracing implementation, is shameful

  • For now the RTX cards aren't really worth the money. Waiting for the 1070/1070 Ti/1080 to drop in price so I can finally upgrade my card.

    • +1

      I'm with you!

    • -1

      Waiting? The prices are already incredible. I bought a 1080 Ti at launch, thankfully before the cryptomining price hike, and if I didn't think the last year and a half of gaming was worth the price difference compared to now, I would be in tears!

  • +1

    Seems like the 1080 Ti is faster than the 2080 in some applications. Overall, I'd put them on equal standing.
    We were expecting a ~20% performance gap between the two, and another ~40% to the 2080 Ti. That was hopeless wishful thinking.

    Nvidia is much smarter than people give them credit for.
    Instead of pushing for more and more cores on smaller and smaller nodes… they're looking into specialising.
    That's where they can make performance gains without spending money on silicon, so it's the more profitable choice.
    Not to mention, it makes them harder to compete with.

    I suspect AMD is going to be eager to push Navi and 7nm late next year, but it's going to cost them a lot of money, which in turn is going to make it expensive.
    That means Nvidia has a clear two-year head start to get game developers to start optimising for Turing, DLSS, and ray tracing.
    So a potential "Navi 64" might only be ~30% faster than the 2080 Ti but cost the same… at which point Nvidia can take their sweet time, let AMD absorb some of the silicon cost, then bring out the successor (3080?) with minor improvements on 7nm silicon to steal the top spot back from AMD while doubling down on their proprietary technologies (i.e. Mixed Realtime Ray Tracing v2.0, i.e. properly working RTX). They'd probably only stretch the top-spot performance by another ~20%, whilst increasing the cost by ~30%.

    It's genius.

    • Intel's revamped Larrabee ray-tracing GPU is coming in a year+ or so

    • Seems like the 1080 Ti is faster than the 2080 in some applications. Overall, I'd put them on equal standing.
      We were expecting a ~20% performance gap between the two, and another ~40% to the 2080 Ti. That was hopeless wishful thinking.

      In games, the 2080 Ti performed quite a bit better than the 1080 Ti, ~30% in the 4 games listed in the Ars Technica link.

      • I think you misunderstood me…

        I was just saying that Nvidia tried to imply/pitch it like:
        2080 Ti: 160%
        2080: 135%
        1080Ti: 100%

        Yet we were expecting:
        2080 Ti: 150%
        2080: 120%
        1080Ti: 100%

        But in reality, it's worse:
        RTX 2080 Ti: 130% (290W)($1,300+)
        RTX 2080: 104% (240W)($999)
        1080Ti: 100% (260W)($850)

        …seems like the best-value card at the high end is the tried 'n' true GDDR5-specced GTX 1070 Ti (180W).
        The GTX 1070 Ti delivers around ~70% of the performance of the 1080 Ti but costs only ~60% ($500/$850) of the price.
        Compared to the 2080, it's ~65% of the performance for ~50% ($500/$999) of the price.
        Compared to the 2080 Ti, it's ~50% of the performance for ~40% ($500/$1,300) of the price.

        And yes, I did round the figures (there's a rough sanity-check of the maths at the end of this comment). And I know you can get better dollars-to-frames with an RX 580, GTX 1060, RX 470 or GTX 1050 Ti, but those aren't actually high-end. Or get even better value from used cards like the R9 390X, GTX 980, R9 290.
        I think there's very little refinement going from 16nm to 12nm wafers, and also very little refinement going from Pascal to Turing. And whatever gains were made were all spent on making larger, hotter, more power-hungry cards… all for the sake of adding new cores for DLSS and ray tracing.

        Nvidia wouldn't have been able to pull off this "sidegrade" (and the price-jacking) if the Vega 56/Vega 64 had performed 15-35% better, cost around 10-30% less, and had regular levels of availability. That was definitely doable, and the reason it didn't happen is that AMD pulled engineers and funding away from Vega to work on other projects, which seems to have impacted the development of the Xbox Scorpio too. In that alternate universe, Nvidia would have had to push ray tracing and DLSS further into the future to make sure they were more competitive today, which also means not skimping on the memory choices.
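
        For anyone who wants to sanity-check the value claims, here's a rough sketch of the maths, just plugging in the approximate relative-performance figures and prices quoted above (these are my rounded numbers, not official specs):

          # Rough perf-per-dollar comparison using the approximate figures above
          # (performance relative to a 1080 Ti = 100, rough USD prices, board power).
          cards = {
              "GTX 1070 Ti": {"perf": 70,  "price": 500,  "watts": 180},
              "GTX 1080 Ti": {"perf": 100, "price": 850,  "watts": 260},
              "RTX 2080":    {"perf": 104, "price": 999,  "watts": 240},
              "RTX 2080 Ti": {"perf": 130, "price": 1300, "watts": 290},
          }

          for name, c in cards.items():
              value = c["perf"] / c["price"]
              print(f"{name}: {value:.3f} perf/$ ({c['perf']}% perf, ${c['price']}, {c['watts']}W)")

        On those numbers the 1070 Ti works out to ~0.140 perf/$, versus ~0.118 for the 1080 Ti, ~0.104 for the 2080 and ~0.100 for the 2080 Ti, which is the whole "best value at the high end" argument in one loop.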

    • Maybe.

      Personally, I see what they're doing as stupid. It allows AMD to just go for a big bump in raw performance while ray tracing is in its worst stages, then work on it for their next cards. Better yet, if AMD really wanted the top spot they could probably get away with selling some big cards for approximately the same cost as the 10-series cards, so their cards would be cheaper while offering as-good-or-better raw performance than the 20-series cards. Yeah, they'd be missing ray tracing. But realistically that'll just be like 4K has been recently: some will care, but more people will care about the actual performance.

      At least at the start. But AMD would have a while before they'd "need" effective ray tracing, and by that point it'll be a worthwhile selling point.

      • True.

        With that said, an R9 290 is faster than a GTX 1080 Ti… theoretically.
        There is a huge disconnect between the hardware's potential and what it delivers in the real world; just look at Linux gaming. Both Nvidia and AMD can make their biggest performance gains by focusing on the software and making it more efficient.

        And that's what Nvidia's going for. And that is why they will not only survive, but thrive and dominate. AMD will fail if they stick to your suggestions. The market is not healthy, competition is non-existent, and mindshare is a big part of that.
        The folks at Team Red can throw all the silicon and money they can muster, but it will not succeed because their software is not as optimised as Nvidia's (basically due to Nvidia's large staff of developers, accounted for in their R&D), and this will always hobble them enough for Nvidia to keep their fat profits (cheaper silicon). And if the Radeon division bleeds revenue, business is business, and there will be layoffs, or the division will be cancelled or sold off to another company. See Sony's VAIO division.

        What I foresee is this:
        - Nvidia holds their performance lead, and AMD yields them this
        - Nvidia doubles-down on their Software (and AI) divisions
        - Marketing division for Nvidia also grows
        - Classes of gaming become 1080p > 4K > 4K ray tracing, as RT becomes mainstream in 2022
        - AMD focuses mostly on Ryzen, and shrinks the Radeon division
        - AMD's GPUs find most success in the PS5/Xbox V, some success as iGPU in Ryzen v3
        - AMD will release midrange GPUs (eg RX 550/470) at high enough prices to be profitable and competitive to keep the Radeon division going
        - AMD will release BIG Block GPUs (eg Vega64) in limited supply just to say "hey, we still exist" but they won't be cheap, nor market-defining
        - Radeon and AMD become more profitable, which will sustain them long enough to hopefully compete around 2030
        - Overall, we the consumers lose, and it's our own fault for not supporting the market (examples: the 4870/6850/HD 7970/R9 290 were all faster AND cheaper than the Nvidia cards of the time, but they didn't sell much, and profits were low or in some cases losses)

  • +1

    how long until AMD release their next cards?

    When I bought a GTX 980, it had HairWorks and smoke. It became natural to disable HairWorks to get a better framerate. I saw the smoke once in Batman: Arkham Knight, from the Batmobile; the experience was more exciting in the YouTube video.

    If the price hits $400 USD, 1080s will be a great stocking filler

    • +1

      how long until AMD release their next cards?

      Next gen of cards, or a gen of cards with equivalent ray-tracing tech? The former is AMD's Navi architecture cards which are due out end-2019 to early-2020. As for ray-tracing… yeah I don't think even AMD knows right now.

    • HairWorks, what a joke. In Final Fantasy XV you can barely notice it, and even then it's not worth the FPS hit

  • Looks good of course. Not skipping a generation is just silly for video card buyers, so compared to the 9-series, yeah, I can't be disappointed at all. But at the same time, I was floating the idea of getting one and giving the 1080 Ti to my husband; I think we'll just get another 1080 Ti instead, since in real-world use on non-4K screens the value just isn't there price-wise.

  • First thoughts?

    Don't have any yet.

    Its "killer" feature and upgrade over current-gen cards is ray-tracing functionality. Ray-tracing will be the next big thing in gaming graphics, there's no doubt about it (as big a jump as 8-bit sprites -> polygons, or 2D -> 3D), but it's anyone's guess if that'll start showing up in games soon enough for the RTX20XX series to make use of it.

    And that's the biggest issue, because you're paying a premium over current GTX10XX series cards for it, and paying a large premium if you're buying now. If games with ray-tracing start showing up next year? Congrats. If they show up only in 2020 or later though, it means you could've really waited another 2 years to buy after the price has dropped a few hundred from its release.

    So……. yeah. As always though, early-adopters will take the hit for being early adopters, but that's nothing new.

    • +1

      Nope.
      I typed out three paragraphs on why the stuff you wrote is exaggerated/wrong, then the tab closed.
      Now I don't care enough to re-type it all.

      The TL;DR version was: "Standard Rasterisation Graphics that is assisted with "Hardware Accelerated Ray-Tracing" is going to be an improvement more along the lines of Tessellation (if you're old enough to remember), and only in 2022 will it be useful/mainstream. So you're better off getting an RTX 4080 instead, which means sticking with a standard card (e.g. GTX 1070 Ti) until then."

      Here are some must-see videos:
      https://www.youtube.com/watch?v=dLjQR0UFUd0
      https://www.youtube.com/watch?v=jyRnPgZO09k

      • +1

        Hahahaha my condolences. I hate when that happens.

        "Standard Rasterisation Graphics that is assisted with "Hardware Accelerated Ray-Tracing"

        Ah, so not actual full-on ray-tracing graphics, and not that great of an improvement? Yeah - I was wondering how ray-tracing became so accessible when it was (and now I guess still is) reserved for big studios and their pre-renders.

        And yeah - I can see a situation where it won't really spread to the mainstream for a few years, so the current gen of ray-tracing cards won't be that useful. It's a bit of a chicken-and-egg problem though, and I'm glad Nvidia is taking the initiative: game developers won't code ray tracing into games until there's hardware that can take advantage of it, but it's a risk to make hardware, as Nvidia is doing now, for technologies that aren't yet widespread.

        • +2

          It's a risk, but a minimal risk for Nvidia, since they haven't seen good competition since the GTX 780 Ti/R9 290 era.

          And Nvidia's much smarter than people give them credit for.
          They will invest in ray tracing today while they have the chance, and in the future, when competition is even weaker, they will have a differentiating factor, which will artificially inflate the value of their products.

          I see AMD copying Nvidia and making their cards compatible with ray tracing, but not for Navi/2019. Probably for the series after, which will probably come another three years later, so realistically we're talking 2022 for AMD. At that point Nvidia would likely be far ahead, and ray tracing would be becoming a mainstream technology. The 2020 PS5 and Xbox V are out of the question for ray tracing, since neither is willing to pay for Nvidia hardware in their consoles… apparently even Nintendo are getting creamed by paying stupidly for the obsolete Tegra X1.

          Although I suspect more competitors will pop up in those years, created by the vacuum of AMD's absence, from the likes of ARM and Samsung scaling their small cores up to larger devices. So who knows what to expect.

  • I'm waiting until they bring out the next one on 7nm, the 2180 or whatever it'll be called. I already have a 1080 and it's perfect for 1440p. Hopefully Cyberpunk 2077 will be out at the same time!

    The 2180 should have more performance, as they can cram in ~40% more transistors per mm² compared to the 12nm process, which is really the older 16nm process refined, like Intel's 14nm++ (quick back-of-envelope at the end of this comment).

    Ray tracing is interesting, but it needs to run in games at 1440p or above; 1080p is old hat now. At least without ray tracing, 60fps 4K is viable now with the 2080 Ti.
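
    To put that ~40% figure in perspective, here's a purely illustrative back-of-envelope (it just takes the density claim above at face value; none of these are confirmed specs):

      # Back-of-envelope for the claimed ~40% density gain of 7nm over 12nm.
      density_gain = 1.40  # ~40% more transistors per mm^2 (claimed, not confirmed)

      # Same die area -> ~1.4x the transistor budget for shaders / RT / tensor cores.
      transistors_at_same_area = 1.0 * density_gain

      # Same transistor count -> a roughly 29% smaller (cheaper) die.
      area_at_same_count = 1.0 / density_gain

      print(f"{transistors_at_same_area:.2f}x transistors at equal die area")
      print(f"{area_at_same_count:.2f}x die area at equal transistor count")

    So a hypothetical "2180" could either spend the extra budget on more cores at the same die size, or ship the same design on a noticeably smaller, cheaper die; how much of that turns into actual frames is anyone's guess.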
