Buying A GPU To Last

So I built myself a new system about a month ago, and knowing the RTX line was around the corner, I held out on buying a GPU and have just been getting by with an old HD 7850 2GB OC from a 2013 build. I want my next GPU to last me the next 6 years, so I'm not going for a budget card. I'm pretty easily pleased (playing intense games like AC Origins with my old GPU has been a little laggy, but other than that it's been fine), so I don't expect to have Ultra settings on with this GPU in 2025 or anything, but I want to be able to still enjoy new games (even on lowish settings, which, to be honest, will probably be incredible by that point).

What I'm not sure of is whether going for the RTX line is the smarter long-term decision, given the potential for ray tracing to expand in games. Should I just get a 1080 when they get cheaper (I'd be happy at $500), or get an RTX and hope that the ray-tracing technology takes off in the gaming world? I just don't know how much impact it will actually have on developers down the line, or whether it will hold relevance beyond a few years.

What are your thoughts on my best course of action? I realise my CPU will probably slightly bottleneck whatever GPU I get, but fingers crossed multi-core becomes a bigger focus for game developers, aha. Specs below:

Ryzen 5 2600
16GB DDR4-3000 Corsair Vengeance Pro
MSI Mortar Arctic B350m
700W EVGA Supernova G2

Poll Options

  • 6
    Buy a GTX 1070/1080
  • 25
    Buy an RTX 2070
  • 1
    Buy an AMD GPU
  • 2
    Other

Comments

  • +1

    What monitor resolution / Hz are you running at the moment?

    • I'm only at 1080p currently, but given we're talking about 5-6 years from now, there's a distinct possibility of either 1440p 144Hz or 4K 60Hz.

      • A 1080 Ti is all you need.

        • "1080ti is all you need." You say that like it's hardware and not a peripheral, lol.

  • RTX 2080 Ti

    IF ray tracing takes off like nVidia hopes it will (because that's what the new-gen RTX cards are best at), that'll probably last you 10 years, playing on moderate settings near the tail end of that timeframe.

    But you're right, we can't really know. If ray tracing doesn't get bigger, the RTX line might just be a flash in the pan for nVidia and they'll lose billions. It is the next big thing, though; it's just a matter of timing.

    • +4

      that'll probably last you 10 years

      As long as you don't play games for the last 5 years.

      • I'm not sure. GTX 470s can play modern AAA games on medium settings (I'm serious, not even potato quality, check YouTube), and that came out about, what, 8 years ago? That was a flagship-tier card for its generation, so a current-gen flagship card, the RTX 2080 Ti, should be able to last around that timeframe too.

      • I've been gaming on mine for 6 years and can still get High on Shadow of Mordor at ~40fps. Not that that's particularly impressive, but for a 2012 GPU it doesn't seem too bad? Besides, I won't be particularly sad if all I can manage is "Medium" in 2025, which will probably be almost photorealistic anyway?!

  • Nah, wait for the Intel GPU release.

    Blue team will stomp both the Red & Green teams.

    • +5

      God, I hope it's not a 3rd take on graphics processing. AMD is sticking with traditional rasterization. nVidia is betting on ray tracing. That's bad enough, because the gaming market isn't a monolith and the technologies are so different that games might well bifurcate into either "Good on AMD cards, sucks on nVidia" or "Good on nVidia cards, sucks on AMD" to such a degree that they become akin to console exclusives. If Intel comes in with a 3rd angle? Sheesh. Might as well abandon PC gaming altogether for a generation.

      • Don't agree with this. Intel's entrance will be good for general advancement and will stop nVidia stagnating while AMD can't keep up. It's a good thing to keep pushing forward. Games optimised for one don't run poorly on the other; that's down to drivers, which AMD are notorious for being crap at. There are some elements that are exclusive to each, but those don't affect overall performance.

        • and will stop nVidia stagnating

          You could've said this about 12 months ago and I would've whole-heartedly agreed. But with AMD coming out with their RX 5XX series (and Vega), and nVidia announcing not just a whole new architecture but one prioritizing a whole new graphics-processing technology tree (ray tracing)… that statement is no longer valid.

          It's a good thing to keep pushing forward. Games optimised for one don't run poorly on the other; that's down to drivers, which AMD are notorious for being crap at.

          Agreed, but games already run differently on AMD vs nVidia cards while using effectively the same technology, with only differences in drivers. With nVidia's new cards, yes, it represents a huge jump in technology, but it'll also split the market. A game that has a lot of ray tracing will run VERY differently on RTX 2XXX cards than on AMD cards.

          It'll effectively create two captive markets, one for AMD / older-gen tech cards and one for the newer ray-tracing technology, and currently we have no idea if one will be so much better than the other that the whole market will transition immediately. If it doesn't, the two markets will mean less competition in each, because you'll only be able to run ray-tracing games well on nVidia cards, and vice versa.

          People will then buy cards depending on the games, which is the same situation we have for consoles and console exclusives.

        • @HighAndDry:

          AMD coming out

          We've said this every single launch, every single year. AMD simply cannot keep up with nVidia's performance increases. Therefore, nVidia has no reason to push boundaries beyond minor bumps.

          The RTX 2070 will end up being a GTX 1080 with ray tracing. It's not a huge leap forward; on balance it's the same as every new-generation update. Whether it'll be the next big thing or a gimmick that dies away remains to be seen.

          A game with ray tracing will not run differently on a system that doesn't support it; the effect will just be disabled, like PhysX used to be, like TressFX, etc.

        • @Hybroid:

          The RTX 2070 will end up being a GTX 1080 with ray tracing.

          While this is effectively true, I think you're underplaying the potential of ray-tracing technology. It's not PhysX (physics, not really graphics fidelity) or TressFX (a hair-rendering library). It's almost as big a jump from current-gen rasterized graphics, if not bigger, as hardware 3D was from the generation before it.

          It might never reach that potential, but considering ray tracing is already used in Hollywood SFX and in pre-rendered graphics in a lot of applications (because of its higher quality), if game devs can bring it into real-time rendering, I see no reason why they wouldn't (there's a toy sketch of the idea below).

          I'm not a fan of the Verge, and they tend to be a little hyperbolic, but they're at least in the right ballpark when they say:

          ‘Ray tracing’ could bring the biggest graphics jump in a decade
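
          To make the rasterization vs ray-tracing distinction concrete, here's a toy sketch in Python. Everything in it (the scene, the shading, the sizes) is made up for illustration, and it's nowhere near what a real renderer or the RTX hardware does: instead of projecting triangles onto the screen like a rasterizer, you fire a ray out of the camera for every pixel, find what it hits, and shade from the hit point.

          ```python
          # Toy "ray tracer": one sphere, one light, ASCII output.
          # Purely illustrative -- real engines shoot many rays per pixel,
          # per bounce, against millions of triangles, in real time.
          import math

          CENTER, RADIUS = (0.0, 0.0, 4.0), 1.5   # hypothetical scene: one sphere
          LIGHT = (5.0, 5.0, -10.0)               # one point light

          def ray_sphere(origin, direction, center, radius):
              """Distance along a unit-length ray to the sphere, or None on a miss."""
              oc = [o - c for o, c in zip(origin, center)]
              b = 2.0 * sum(d * o for d, o in zip(direction, oc))
              c = sum(o * o for o in oc) - radius * radius
              disc = b * b - 4.0 * c               # a == 1 for a unit direction
              if disc < 0:
                  return None
              t = (-b - math.sqrt(disc)) / 2.0
              return t if t > 0 else None

          for y in range(20):
              row = ""
              for x in range(40):
                  # Fire a ray from the camera (at the origin) through this "pixel".
                  d = ((x - 20) / 20.0, (10 - y) / 10.0, 1.0)
                  norm = math.sqrt(sum(v * v for v in d))
                  d = tuple(v / norm for v in d)
                  t = ray_sphere((0.0, 0.0, 0.0), d, CENTER, RADIUS)
                  if t is None:
                      row += " "                   # missed: background
                      continue
                  # Shade the hit point: brightness = surface normal . light direction.
                  hit = tuple(t * v for v in d)
                  n = tuple((h - c) / RADIUS for h, c in zip(hit, CENTER))
                  to_l = tuple(l - h for l, h in zip(LIGHT, hit))
                  mag = math.sqrt(sum(v * v for v in to_l))
                  bright = max(0.0, sum(a * b for a, b in zip(n, to_l)) / mag)
                  row += ".:-=+*#%@"[min(8, int(bright * 9))]
              print(row)
          ```

          Shadows, reflections and soft lighting come from firing yet more rays from each hit point; doing that for millions of rays per frame is the part that needs dedicated hardware.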

        • @HighAndDry: Exactly, which is why I'm a little unsure of my initial plan to just get a 1080 after prices drop! I'd like to at least have ray tracing as an option, but if it ends up like PhysX I'm not going to fork out the extra few hundred for it.

        • @jakem742: Unfortunately this is a really bad time to buy a future-proof card, because of nVidia's announcement. I guess you can make a more informed decision, but without being able to tell the future, it's still an uncertain one either way.

          If you're comfortable installing and replacing video cards though, you could go for a lower-end current-gen card (GTX 1060 or 1070), and then after 4-5 years see whether an RTX 20XX-series card is worth it.

        • @HighAndDry: Not a bad plan, to be honest! It'd probably end up costing about the same, and although the first five years would be sliiightly less powerful, it would give me a bit better specs down the line! I just like the thought of not having to think about it until I start my next build, aha.

      • +1

        It has to be ray tracing, as Intel had the tech over a decade ago on CPU; they even went so far as to later implement it on their axed Larrabee GPGPU project:

        https://www.youtube.com/watch?v=um-1fAVU1OQ

        • +1

          Then (again, if ray tracing takes off) say goodbye to at least a generation of competition in graphics cards. AMD is going to have to pivot, and pivot hard, and unless they've kept their own ray-tracing development going in some ultra-secret lab, they're going to forfeit the market to nVidia and Intel for another generation or more of graphics card technology.

          And while Intel is a behemoth in the CPU industry, unless they partner up with AMD (which would be… interesting), they're going to be starting out in the consumer graphics market at worse-than-David-vs-Goliath odds. More like ant-vs-Goliath.

  • I bought an old GTX 460 not too long ago. It's from 2010, I think? Anyway, it was cheap and I'm playing new games on it without issues. I don't care about 200fps, 4K res or any of that shit though, and I'm getting decent fps at 1080p or so. The CPU is a first-gen i3 and the whole build was under $200 including the monitor, probably way shittier than the build you just replaced, so this info probably isn't very useful to you.

  • -4

    The next consoles are due to be versions of the AMD Navi architecture, which means most games are going to be written targeting that architecture (not nVidia's ray-tracing effort). Plus, the nVidia chips are just way overpriced.

    So soldier on with your existing card, then get a cheap Navi card when they come out, and expect to upgrade around 2022.

    • Anyone who looks at PC and console gaming right now knows that's just something you imagined.

      • What is it with people not doing their research and then confidently declaring others wrong?

        The PlayStation 5 is strongly rumoured to have a Navi/Zen chipset: https://www.theinquirer.net/inquirer/news/3034273/ps5-will-l… Microsoft are playing things closer to the chest, but they too have been talking AMD (which they currently use).

    • OK, given the extra details now available, and responding to the idiot downvotes.

      The new nVidia cards are too expensive and don't offer much in the meat of GPU rendering. Everything has been thrown at ray tracing, but because these cards are so overpriced and unlikely to be significantly supported in the consoles, ray tracing is going to be unimportant for a long while yet. Maybe in 4 years it will gain some support, but maybe it will just die.

      The reason is that games companies are pushing towards streaming as a way of getting even more money, shifting the emphasis to different areas of graphics rendering than ray tracing on cards. The PlayStation 5 might include a little ray tracing as a sop, and it's that low level of support that will get put into games, not 'all the rays'.

      AMD are working to do Ryzen-type wonders with their graphics capabilities (my guess is multiple dies rather than one monolithic chip), but it'll take till at least next year for that to pay off. Expect to see Navi derivatives in both main consoles, and therefore games optimised towards them.

      So, to the OP: I'd wait till the 20x0 nVidia cards are really in the market, and get a discounted 10x0 card, which will do fine in games for a few years. Then see where the market is; you'll probably end up with an AMD-based card if they don't screw up. Oh, and 7nm is coming along to make all these cards obsolete next year.

      Ignore those telling you to buy a 2080 Ti. It smells like another Titan: overpriced, small sales, and therefore nobody coding for it.

  • -1

    NO SUCH THING

    • As what? I don't get the obsession with 100+ fps on Ultra? It's nice and all, but what's to stop me getting Medium at 60fps in 10 yrs?? I'm currently on a $200 GPU from 6 years ago and I can still get by, so why would an $800 GPU not be far better than that in terms of longevity?? My main concern, if anything, is whether to get RTX capabilities or not. I could get better value on a current-gen 1080 than an RTX 2070, but will I be giving up a whole realm of graphical niceties?

      • You've asked a great but difficult question.

        Back in 2011, people were saying that Core 2 Duos were enough and no one needed the latest Core i5. People also said a system with 4GB of DDR3 RAM and 1GB of VRAM was plenty, and said the same thing about the GTX 570 and the (BIOS-flashed) HD 6950. Those people were wrong. Such a system would've been cheap back in the day, but it struggles and fails to meet minimum requirements today, especially since those people missed out on the great leaps in innovation of Sandy Bridge and 28nm GPUs.

        The smart people got the Core i5-2500K, 8GB of DDR3 RAM, and the 4GB GTX 680, especially during the fire sales/discounts around Xmas 2012/13. And really, such a system is practically equal to an i5-7600, 8GB DDR4 RAM, and GTX 960… which is almost equal to what you get out of the PS4 Pro/XBX. That means those people got a great system that aged well for 5-7 years. So it can be done.*

        The best way to answer your question is to plot typical system specs across the different eras, look for patterns, and extrapolate from there:
        2012-era
        Windows 7 Home
        4.0GHz, 4core/4thread
        8GB DDR3-1600
        2GB GDDR5-6000
        GTX 680
        60GB Sata SSD Bootdrive + 1TB 5400rpm HDD

        2015-era
        Windows 10
        4.5GHz, 4core/8thread
        16GB DDR4-2400
        4GB GDDR5-7000
        GTX 980
        256GB m.2 SSD Maindrive + 2TB 7200rpm HDD

        2018-era
        Windows 10 Pro
        4.2GHz 6core/12thread
        16GB DDR4-3200
        8GB GDDR5-9000
        GTX 1070 Ti
        512GB nVme SSD Maindrive + 4TB 7200rpm HDD

        2021-era
        Windows 11 (minus spyware)
        5.0GHz 8core/16thread
        32GB DDR5-4800
        12GB GDDR6-15,000
        GTX 3070 (equal to a 2080, which equals a 1080 Ti)
        1TB nVme SSD Maindrive + 2TB Sata SSD

        The reasoning behind this is that the previous standard was 1080p/Med/60fps gaming, we are currently in a weird midpoint of 1440p/High/60fps gaming, and so the most likely next standard is 2160p/Very High/60fps gaming. Yes, the in-between segments of 1440p, ultrawide, 144Hz, and even VR gaming will remain "dead". Simply because of that high 4K resolution we should expect large VRAM, meaning 12GB instead of 8GB for midrange. As for system memory, 8GB isn't enough currently (it bottlenecks), and the only reason we are stuck with 16GB is the absurd prices. Those should normalise in mid-2019, and we should see a very fast move to 32GB for most people as OS/apps/games become more bloated.
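
        To put rough numbers behind the resolution jump (a back-of-envelope sketch; the bytes-per-pixel and buffer counts below are loose assumptions, and real VRAM usage is dominated by textures rather than render targets):

        ```python
        # Back-of-envelope: how pixel counts scale with resolution.
        # The 8 bytes/pixel and 3 buffers are loose assumptions;
        # real VRAM usage is dominated by textures, not render targets.
        RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}

        for name, (w, h) in RESOLUTIONS.items():
            pixels = w * h
            render_targets_mb = pixels * 8 * 3 / 1024 ** 2
            print(f"{name}: {pixels / 1e6:.1f} MPix "
                  f"({pixels / (1920 * 1080):.1f}x 1080p), "
                  f"~{render_targets_mb:.0f} MB of render targets")
        ```

        2160p is 4x the pixels of 1080p, which is the crude reason to expect VRAM requirements to keep climbing with it.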

        Now, even if what I've posted turns out to be 100% accurate, I'm not suggesting you rush out and get:
        r7-2700X, 4x 8GB DDR4-4300 Corsair Vengeance RAM, RTX 2080 Ti Founders, 1TB Samsung 970 PRO, and 2TB WD Blue 3D SATA SSD.

        *I've found (and so did most people) that future-proofing isn't worth it. It's not worth it from a price perspective, and it's not worth it from a performance perspective, which by default means that from a value perspective it is definitely not worth it. What you want to do instead is look at "bang for buck" at the time of purchase, and make upgrades that seem meaningful.

        So get the expensive X470 motherboard with a used Ryzen v1 chip and upgrade it to an 8-core version once the IPC increases (Ryzen v3). Also get the 2x 8GB RAM kit now instead of the cheaper 4x 4GB RAM, as this allows an easy upgrade in the future. Pick a decent SATA SSD like the new WD Blue 3D with a decent amount of storage (1TB), which should tide you over for the next few years… and relegate that drive to a storage drive later, once you pick up a faster nVme main drive. As for GPUs, they're overpriced now, especially the new generation… I would stick to a GTX 1070, as that's more than capable of handling all 4K games now at customised graphical settings. It should take another 6 months for prices to drop. A better card for cheaper will come by sooner or later, and you'll save quite a bit of money along the way.

        So what you should do now instead is:
        r5-1600, 2x 8GB DDR4-3200 Corsair RAM, GTX 1070, 1TB WD Blue 3D.
        …then 2 years later you could:
        +r7-3800, +RTX 2080, +another 2x 8GB DDR4-3200 Corsair RAM, +1TB Samsung 980 Pro, −r5-1600, −GTX 1070
        = overall, better performance than today, and cheaper too!
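
        Rough arithmetic on why the staged plan can come out ahead (every price here is a made-up placeholder to show the shape of the comparison, not a quote):

        ```python
        # Hypothetical numbers only -- substitute real prices at purchase time.
        buy_flagship_now = 1200      # e.g. a top-end RTX card today

        midrange_now = 500           # e.g. a discounted GTX 1070
        upgrade_in_2_years = 650     # a next-gen card once prices settle
        resale_of_midrange = 200     # selling the first card on

        staged_total = midrange_now + upgrade_in_2_years - resale_of_midrange
        print(f"Flagship once: ${buy_flagship_now}")
        print(f"Staged plan:   ${staged_total} (and the second card is newer tech)")
        ```

        Whether it actually works out depends entirely on real prices and resale values at the time, which is the point of checking "bang for buck" at each purchase.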

        • Mmm, I see your point about swapping things out, and it's not a bad idea. I guess I was just thinking I could whack some money into something decent now and then I'm set, but you're probably right about swapping out down the line! You're right that I got in at the right time with my previous build: an old 8GB DDR3 + i5-3570K + 2GB HD 7850, which I'm upgrading from. I can get ~30fps on Low/Medium settings in AC Origins, but that was really the first time it's felt so slow; I think playing that is what made me decide it's time for an upgrade, aha. I'll probably try to sell it for ~$500 and put that towards the new build.

          Cheers for the advice though! I like the 1070 + 2080 suggestion as a long term plan! :)

  • +3

    Why 6 years? Pay half now, get something good, upgrade in 3 years.

  • I'd go with the 1080, then a few years later replace it with the 1080-equivalent of the day, with matured ray-tracing technology, at a similarly cheap price. Personally I refuse to play a PC game at anything less than Ultra at 1440p (soon to be ultrawide), though I don't care about going over 60fps. You only have 1080p to worry about for now, which is nothing really for a modern card, and new technology in games takes a long time to become the norm; I'm still pleasantly surprised to see tessellation in a game, and that's quite old.

    My last two builds had two cards in SLI, and they both lasted 4-5 years playing new games on Ultra (the second build actually drove 3x 1440p screens in surround). Now I have just the one card, because they're so much more powerful now, and graphical fidelity seems to be slowing in its progress, partly because of the long console generations these days. If you're happy with medium settings etc., I don't think you'll have any problems no matter what you choose, even if you upgrade your screen later on.

    • +1

      It sounds like you have a lot of money to burn, which kinda defeats the purpose of this whole plan, aha. SLI / replacing with high-end cards every 2 years is not my preferred plan of attack. Ultra settings are overrated… ;)

  • Here's a list of 21 games planning on implementing ray tracing:
    https://www.pcgamer.com/21-games-will-support-nvidias-real-t…

    Here's an article about how there are no benchmarks available yet:
    https://www.polygon.com/2018/8/20/17760448/nvidia-rtx-prices…

    Historically, whenever a new DirectX feature is introduced, the only GPUs capable of acceptable frame rates are the fastest flagship cards. If you buy the most affordable supporting card, you'll almost always get unplayable frame rates. The RTX 2070 is the slowest card on sale that supports ray tracing, so I don't like your chances. Hedge your bet with the flagship RTX 2080 Ti, as it would be serious egg on Nvidia's face if their fastest card couldn't run the new games.

    • Good call about avoiding the lowest RTX. That said, chances are decent they'll come out with an RTX 2060, surely? I might follow other advice and get a 1070 now and a 1080 Ti in a few years once prices drop.

  • Ray tracing is a marketing point. Wait for benchmarks and see the performance per dollar, but just get the 1080 if a $500 bargain appears.

    • Is that all it is, though? Looking at the real-time videos of it in-game, it looks pretty incredible! I probably will just go with a 10-series card and see how it all pans out. Safer, and it potentially gets me a better-specced card down the line. :)

  • If I were you, I would wait until real benchmarks come out for these RTX GPUs and see the real difference versus old GPUs in handling the RTX workloads. If the new cards are head and shoulders ahead, try to get the 2080, as this is uncharted territory at the moment.

    Either way, the 1080 and Ti cards will come down in price when the RTX line releases anyway.

  • +1

    Good question, and hard to say.

    As others mentioned above, we need to see real-world benchmarks for the 20- and 10-series cards, look at the percentage performance increase for your intended use, and then compare that to the price differential.

    Then we need to make bets about whether ray tracing will be widely implemented in games.

    I'm still sitting here with my 970 wondering what the future will hold.

  • Does this mean current prices on the GTX 1080 series will drop soon?
