• expired

Gigabyte Radeon RX 7800 XT Gaming OC 16GB GDDR6 Graphics Card $749 + Shipping ($0 to Metro Areas/ VIC, NSW, SA C&C) @ Centre Com

260
FAF40

$40 off a 7800 XT with three fans, making it as cheap as the dual-fan models but with better performance. Best-value 1440p card at the moment, roughly equivalent to a 4070 performance-wise.


Surcharges: 0% for bank deposit, Afterpay & Zip Money. 1.2% for VISA / MasterCard & PayPal. 2.2% for AmEx.

Free shipping excludes WA, NT & remote areas.

Related Stores

Centre Com

Comments (closed)

  • +1

    Price in title pls 👍

    • +1

      fixed now sorry

  • +3

    This is worth about $500 today.

    • +2

      The GPU market is dismal right now. From what I've researched, this is the best-value card for 1440p at the moment.

    • +2

      It performs maybe 10-15% slower than a vanilla 9070 (~$1,000), so $749 isn't too bad IMO. I'd buy at $600-650, maybe.

      • And then the value of the 9070 series goes out the window once you compare it against last year's 7900 GRE/XTX performance and prices.

        • +1

          Can I use your time machine please?
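The price-to-performance claim a few comments up can be sanity-checked with quick arithmetic. A minimal sketch: the ~$1,000 9070 price, the $749 deal price, and the 10-15% performance gap come from the comment; the 0.875 midpoint and the normalised performance scores are illustrative assumptions, not benchmarks.

```python
# Back-of-the-envelope price-to-performance comparison.
# Assumed figures (from the comment thread, not measured):
#   - 9070 street price ~$1000, normalised performance = 1.00
#   - 7800 XT deal price $749, 10-15% slower (0.875 = midpoint)

def perf_per_dollar(perf: float, price: float) -> float:
    """Relative performance bought per dollar spent."""
    return perf / price

r9070 = perf_per_dollar(1.0, 1000)
r7800xt = perf_per_dollar(0.875, 749)

# Ratio > 1 means the 7800 XT delivers more performance per dollar.
advantage = r7800xt / r9070
print(f"7800 XT perf/$ advantage: {advantage:.2f}x")  # roughly 1.17x
```

On these assumed numbers the 7800 XT comes out ~17% ahead on performance per dollar, which is consistent with the "not too bad" verdict above.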

    • +4

      All GPUs on the market are 'worth' less than their current asking prices, I am sure.
      I judge this to be fair value, considering the pricing of the other options.

    • +1

      Second-hand is worth $500, so I don't know if paying 50% more for a new card is worth it… but such is the current state of the market.

    • +4

      As someone who came from a potato laptop, the extra VRAM is more important than better ray-tracing performance for the future.

        • +17

          Jesus you are hard on the NVidia bandwagon.
          I agree that DLSS3/4 or FSR4 are worth the extra money but he is right that the extra RAM is much more critical here. No point having any ray tracing in your weak card if you cannot even make the game move at decent FPS without it. This tier of cards is already too weak for any ray tracing, including the 5070.

        • +2

          You know all of these can only be utilised somewhat OK with enough VRAM to spare… right? Plenty of discussion and testing has been done by numerous tech-savvy people online.

          • -4

            @CloudySteps: And they show you edge cases where visual quality increases of specific settings can be barely perceptible for massive VRAM increases, and they skip over a lot of the new tech coming through.

            12GB should prove fine for path tracing at 1440p due to neural textures, neural assets like neural materials, and neural radiance caching.

            8GB is where things are more of a struggle.

            tech savvy

            Enthusiasts with cameras.

        • +2

          Path tracing on a 5070 rofl.. you have to be taking the piss.

          • -3

            @Grish: At 1440p with DLSS Balanced? 60-90 FPS should be achievable with some elbow grease in the settings, which lays the groundwork for frame gen.

            Not going to be eSports ready, but no GPU is.

    • +12

      What makes up modern rendering? Do you mean ray tracing? Cause you won’t be able to do it with the 5070. You’ll run out of VRAM.

      • -2

        Upscaling.

      • -2

        The key ones are ray tracing, then upscaling; ray caching and denoising rolled into ML models to infer detail; enhanced data compression through inference; other compression techniques; and refining the pipeline to layer all of these workloads.

        Do you think compression needs more VRAM if the compressed asset is then run through an inference model in real time?

        • +3

          There is only so much you can compress before quality is lost. Also, those models need VRAM.

          The final nail in the coffin is that you don't need any of those AI workarounds at 1440p for the 7800 XT, and the same would be true of the 5070/4070 if only they had 16GB of VRAM.

          You're also saying all this while expecting NV to roll out new features to older generations, when they famously have a history of doing exactly the opposite and locking them down to new generations only.

          • -1

            @jpeg-jpg:

            There is only so much you can compress before quality is lost. Also, those models need VRAM.

            The final nail in the coffin is that you don't need any of those AI workarounds at 1440p for the 7800 XT, and the same would be true of the 5070/4070 if only they had 16GB of VRAM.

            Yes eventually a given compression method crumbles, but based on your username, I'm sure you're somewhat aware of how far we've come in terms of squeezing perceptual quality out of every last bit.

            In this case, you're looking at about 6-7x additional compression when you aim for similar visual quality, so the model brings a real saving. You can see examples of this in action; the model is trained specifically for the game as well. This will eventually become a broad standard, but most people will instead extend beyond current quality levels while saving disk space and VRAM.

            You’re also saying all this, expecting NV to roll out new, future features to older generations, which they famously have a history of doing exactly the opposite and locking them down to new generations only.

            The SDKs are in the wild. The limitations are requiring the necessary hardware features to do inference-on-sample, instead of just unloading it in memory. Certainly in the case of textures, all RTX owners get the bandwidth and disk savings, but only 40 and 50 series get the VRAM saving.

            Some features cannot be backported for hardware reasons, which is why DLSS 4 upscaling backports all the way to the RTX 20 series, but DLSS 4 frame generation does not (current lack of sufficient processing power from the tensor cores), and DLSS 4 multi-frame generation does not (lack of correct pacing when generating 3 frames).

            How far back does FSR4 ML upscaling backport? If the hardware isn't physically present to provide the correct processing and execution in the targeted time frame (latency), it's not going to work well, so they won't enable it.

            That's why GSync was its own thing: they implemented a methodology for driving display responsiveness across a variable refresh rate range, which at the time required an expensive FPGA to speed up the process. They helped build out the standard, and they wanted to be paid for the work, so they sold it as an exclusive then revisited it once the display industry caught up.
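To put the claimed 6-7x figure in perspective, here is a rough sketch of the VRAM saving. The compression ratios come from the comment above; the 4.0 GB resident texture-pool size is an invented illustrative number, not a measurement from any game.

```python
# Illustrative VRAM-saving arithmetic for inference-based texture compression.
# The 6-7x additional compression ratio is the figure quoted in the thread;
# the 4.0 GB texture-pool size is a made-up example for scale only.

def compressed_size_gb(pool_gb: float, ratio: float) -> float:
    """Resident size of a texture pool after additional compression."""
    return pool_gb / ratio

pool = 4.0  # hypothetical resident texture pool, in GB
for ratio in (6.0, 7.0):
    resident = compressed_size_gb(pool, ratio)
    print(f"{ratio:.0f}x: {resident:.2f} GB resident, "
          f"{pool - resident:.2f} GB saved")
```

Under these assumptions, a multi-gigabyte texture pool shrinks to well under 1 GB resident, which is the kind of headroom the comment argues makes 12GB workable.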

  • +1

    Would it be better to buy this for $750 or a used RX 6800 for $450ish off Marketplace?

    • +2

      Assuming you're not talking about a 6800 XT (which I would go for if so), I think the 7800 XT is worth the price over a 6800 if you have the budget.

    • +1

      Don't pay $450 for a 6800, that's 6800 XT money.

      And don't pay more than $500 for a 7800 XT (second hand).

      • No current used GPUs are in that price range on eBay (you do pay a premium there for purchase protection), and Marketplace isn't an option where I live, unfortunately. The cheapest recent 7800 XT I've seen was $550.

    • +1

      This one, then you have a warranty. Who knows what a used 6800 has been put through.

  • +1

    Just wait 2 weeks: 9060 XT 16GB, $599.

    • +2

      about 10 of us will get it at $599, then it'll be $799 for the next ~3 months

    • Is it actually better than the 7800 XT though? I was getting the impression that the 7800 XT would still be the better buy?

      I'm looking for something for 1080p gaming that's <$1000 (anything over seems overkill for 1080p), but want to be able to play Indiana Jones and get some longevity out of it - still very confused which to get, so waiting to see how the 9060 XT stacks up when it's out.

      • The 7800 XT will be better; the 9060 XT will have FSR4, though.
