• expired

XFX Radeon RX 6900 XT Speedster MERC 319 Black 16GB GDDR6 $1999 + Delivery @ PLE Computers

1110

Decent price for a 6900 XT with stock available. I was gonna get this, but thought I should wait for a good deal on a 6800 XT.

Related Stores

PLE Computers

Comments (closed)

  • +13

    Waiting for a 3080 deal, I've seen one for 1.9k and need one this week - might have to reconsider and get this instead! Will do more research, but thank you!

    • +2

      This one might require a higher-wattage PSU, is that correct?

      • Minimum 850W I'd say.

      • +3

        Yeah, it's recommended you use an 850w+ PSU,
        but any high-quality 750w should cut it.
        I ran my reference 6900 XT with an EVGA 750w GQ.
        It ran fine, but don't take my word for it; better to buy an 850w PSU as this card has very erratic spikes in peak power consumption, especially when overclocked.
        I went with water-cooling though, so I changed over to a 1000w Corsair unit.

        • +8

          My 5950x and 3090 build only uses 550w max

          • +12

            @Daangaz: Why save $200 with the potential risk of damaging $4000 worth of kit?

            • +9

              @johnttt: I have a 1200w power supply, I'm not risking anything. Just saying it's only using 550w of it.

            • @johnttt: Ummm, if your PSU is not powerful enough to power the system, it will just cut off like it is designed to do? Wondering how damage is possible?!

              • @frugal investigator: Not an electrician, but I imagine when a PSU is running on the verge of its absolute limit for some time, it may short out and generate a surge of current before it dies, which could absolutely destroy your motherboard or graphics card. I think the damage would be random; not saying it will definitely cause damage when it dies.

                • @johnttt: Most power supplies these days have OVP (Over Voltage Protection) which shuts down the PSU if excessive voltage is detected.

            • +1

              @johnttt: That's like saying you need to buy a big car to haul your family, otherwise the motor in the small car will work too hard, blow up and toast your family

              • +1

                @eddyah: headgasket??

              • @eddyah: …it might go out with a loud bang when it stalls, which could be considered "blowing up"? And toasting your family is possible if the motor catches fire in the engine bay and it spreads into the cabin…

          • @Daangaz: I don't think this is true. My 5950x goes up to 198w just using PBO with no manual overclock. The 3090 uses more than 400w at full load according to reviews, and the EVGA 3080 Ti goes up to 450w.

            And then the rest of the system uses power too: VRM efficiency losses, fans, RAM.

            A 5950x and 3090 will easily consume over 700w at full load. Check this using the OCCT power test (rough worked numbers in a sketch a few replies below).

            • @InternetExplorer: My Zotac 3090 maxes at 358w and the 5950X hasn't gone over 130w; the rest of the system uses roughly 80w. This figure comes from the wall outlet, so I doubt it would be wrong.

      • -1

        My ASRock 6900 XT requires a 900w+ PSU.
        If you've got the money for these cards, a 1000w+ PSU is a worthwhile investment.

        • How 900w? It's a 320w reference design. What else are you running?

          • +1

            @BargainKen: ASRock RX6900XT PGD 16GO - the website recommends a 900w power supply.
            It has 3 x 8-pin power connectors.

            • @Forth: Nah, I have the same card; it's power-limited to 320w anyway despite having 3 x 8-pin connectors.

              • +1

                @Mister Popo: Do you reckon they just list a massive power supply as a requirement to get people excited about the card?

        • Recommended does not equal required. Manufacturers are extremely conservative in their estimates and assume you have your system fully loaded up with a crap-quality PSU. A minimal system with a good PSU could get away with as little as a 600w unit.

          • @gromit: It's always better to go with the biggest, highest-efficiency unit you can get; you get more headroom on the efficiency curve.
            Why would anyone drop 2k on a GPU just to save $200 by choosing a 600w PSU over a 1000w PSU?
            It defies logic. But people can do what they like, in the face of the facts.

              • @Forth: It's called right-sizing. I would not actually recommend 600w, just saying it is possible. You can also go too big with a PSU, as you lose efficiency outside the optimal part of the load curve.

                • @gromit: You have a shallower efficiency curve and hit the efficiency drop-off faster with a lower-end, low-quality PSU, but hey, you do what you want, sir.
                  This is the last thing I'll say on this, goodbye.

                • @Forth: Why would you ever use a lower-quality PSU? The point is that the manufacturer's recommendation is based on a low-quality PSU, which you should never use, especially in a machine with expensive parts.

      • This particular model only needs a 650w PSU.

      • +1

        It's also worth mentioning that power supplies degrade over time.
        So what starts as a 750w PSU may well be closer to a 650w one after X number of years.
        So if your PSU is borderline "enough" today, it won't be adequate for the same task after a certain amount of time.

        • +4

          But it's also worth mentioning that PSUs from good manufacturers are over-engineered - i.e. a Corsair 750w can often deliver more than that, which means it might not be borderline to begin with.
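
          To put rough numbers on the back-and-forth above, here's a minimal power-budget sketch in Python. Every wattage, the spike margin and the aging derate are illustrative assumptions, not measurements or manufacturer figures.

              # Rough PSU sizing sketch; every figure is an assumption for illustration.
              def recommend_psu_watts(component_watts, spike_margin=1.35, aging_derate=0.90):
                  """Suggest a PSU label rating from sustained component draw.

                  spike_margin  - allowance for brief transient spikes (big RDNA2/Ampere
                                  cards can briefly pull well above their average draw).
                  aging_derate  - assume an older PSU only delivers ~90% of its label,
                                  per the degradation point above.
                  """
                  sustained = sum(component_watts.values())
                  peak = sustained * spike_margin
                  # The label needs to cover peak draw even after the unit has aged.
                  return peak / aging_derate

              build = {
                  "cpu_5950x": 145,              # assumed heavy-load draw, not a measured figure
                  "gpu_6900xt": 320,             # reference board power limit quoted in the thread
                  "board_ram_fans_drives": 80,   # assumed rest-of-system draw
              }

              print(f"Sustained draw: {sum(build.values())}w")
              print(f"Suggested PSU label: {recommend_psu_watts(build):.0f}w")

          With those assumed numbers it lands around 820w, i.e. the 850w class quoted above; drop the aging derate or shrink the spike margin and a quality 750w looks fine, which is roughly how both sides of this argument can be right.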

    • I got the 3070ti instead because I can't justify spending $500 more for (maybe) +20% more performance.

      • +4

        Why not the RX 6700 XT? It is 10% weaker than the 3070 Ti but $300 cheaper at $1200.

    • +1

      For anyone curious: I bit the bullet on the 3080 for 1.9k. It'll probably drop by a few hundred, but I needed one soon so was happy to pay the premium.

      The main selling point for me was the 3080's video editing performance compared to the 6800/6900, so that was the dealbreaker against the Radeons.

      • Nothing wrong with buying when you need it. It's only when you're not in need of one that it's best to hold out.

  • +5

    Great GPU, decent price. PLE coming in clutch once again.

    • Not really, I got the same one last month (June 23rd) for $1699 from PLE! Was afraid the price might still go down, RIP! The price went back up instead - it's $300 more than what I paid!

      • +7

        Not likely unless you got a refurbished card. Far more likely you got the 6800 XT.

      • +1

        The AMD reference 6900 XT cost $1599-1699 at launch.
        No way an AIB card is as cheap as a reference model.

  • How do the XFX cards compare to the others, say Sapphire, ASRock, ASUS etc.?

    • They used to be one of the best Nvidia card manufacturers, but then something broke between them and Jensen.

      On the team red side I don't think they are on par with Sapphire and PowerColor, but they're not too far away either.

    • I own their 6800 and this 6900. Both coolers work very well and quietly. Ensure you have enough space in your case.

      I have owned XFX cards going back to the 7970 and never experienced any issues with them.

    • I own their 6700xt and it runs cool.

    • +3

      XFX are one of the best AMD GPU OEMs, similar tier to Sapphire and PowerColor.

        • +7

          Lol, EVGA isn't an AMD OEM. Gigabyte are cheapskates with their VRM padding, MSI are bare-minimum okayish, and ASUS overcharge just for their name.

        • +4

          Here is some advice since you seem to think Gigabyte and MSI are so great… Most popular ≠ the best

          Have you ever owned an AMD card in the first place? Sapphire and XFX are the best AMD AIBs.

          EVGA are not an AMD AIB; it would be great if they were, but they only work with NVIDIA as an AIB.

          Gigabyte have had a really high rate of major thermal oversights recently.

          MSI is pretty much the definition of second tier recently for both AMD and NVIDIA cards; even their X570 motherboards have terrible VRMs compared to similarly priced ASUS motherboards. If I remember correctly, their 2080 Tis had thermal issues as well.

          ASUS are good, but are usually incredibly overpriced when it comes to GPUs (PLE has their TUF 6900 XT at $3199 vs $1999 for the XFX 6900 XT).

          You claiming XFX and Sapphire are second tier is hilarious; you are very clearly ignorant on the subject.

          • @Vinodra: Gigabyte AMD cooling is plain $hit on everything (bar their Master models) I've touched since the RX 580.

            They seem to always favor smaller size over better cooling, making their coolers less efficient & noisier than the competition.

        • +2

          EVGA is practically owned by Nvidia and will never make AMD cards. They were only just recently allowed to make AMD mobos following what was likely years of begging.

          The ASUS 5700 XT coolers didn't even make proper contact with the GPU die; ASUS denied it was an issue for months and then blamed AMD for their mistake. Probably the worst GPU SKU ever released, coupled with the worst company response.

          MSI are hit and miss, but the best out of the three you've listed that actually make AMD GPUs.

          Gigabyte quality is terrible, for both Nvidia and AMD.

          The best AMD GPU AIBs are Sapphire and Powercolor, with XFX just below them. For the 6000 series Powercolor seem to have done the best job with their coolers.

  • I think this is a decent price at the moment ($1842):

    https://www.ebay.com.au/itm/GIGABYTE-GeForce-RTX-3080-EAGLE-…

    • +4

      You'd recommend the 3080 over the 6900XT?

      I've been doing a bit of research and on paper it seems like 6900 XT > 3080, but Nvidia's features (DLSS and ray tracing) seem to outweigh that?

      • +2

        That's $1805 with eBay Plus (there's a 5% coupon at the moment, or at least there is for me).

        I would go 3080 over 6900 XT for DLSS; I don't care about ray tracing though (although Nvidia cards are better at that too).

        • Must be an account specific deal then, don't have anything on my end.

      • +12

        Different strengths for each.

        3080: more mature RT, so you take less of a hit to your FPS to enable RT, relative to the FPS you had before. DLSS support (helps even more with RT titles). Stronger at 4K resolution. 'CUDA' support, which takes many forms, but if you can/do leverage that, it's a massive tick for the 3080. NVENC encoder for streaming (I stream from my 3080-equipped PC to a Shield Pro in the lounge, this feature is awesome).

        6900XT: an absolute beast at rasterisation, and it can dabble in RT if that interests you, but it takes a bigger relative hit to FPS to turn it on. 16gb VRAM over 10gb, so a win if that VRAM is useful to you now, or you plan to keep and use the card for quite a few years. A bit stronger at 1440p and even more so at 1080p. Radeons also perform better if you're CPU limited, like smashing 80-90%+ of your CPU usage in games, as GPU scheduling is done in hardware, whereas Nvidia cards need some CPU resources for scheduling.

        • It has 16GB VRAM but the 3080 has much higher memory bandwidth for its 10GB

          • @twister292: Indeed it does, plus texture quality can be tweaked as time goes on to keep the 3080 running just fine.

            Generally speaking, there is very little visual difference between 'high' and 'ultra' textures, and in fact quite a few games will just allocate as much VRAM as they can get their hands on, with no visual difference, and essentially no performance difference for not having as much VRAM as the game would like to fill. So take in-game VRAM 'usage' with a pinch of salt; it could just be 'allocation' and not a concern at all.

          • @twister292: It also runs very hot and consumes significantly more power, even when idling.

        • Is the 6900XT better for rendering than 3080 then due to the extra VRAM?

          • +2

            @jrjr: Depends on the application. But if the application is optimised for CUDA, then obviously Nvidia has an advantage. You need to do the research for your use case.

          • +1

            @jrjr: For Blender, Nvidia is superior.

      • Pretty much it. The 6900 XT will beat the 3080 in most games quite easily, but the 3080 closes the gap at 4K. If you turn on ray tracing then the 3080 will beat the 6900 XT quite easily; however, both cards suffer a severe framerate penalty. DLSS is Nvidia-exclusive, but FSR is coming to more games, so while DLSS and FSR aren't directly comparable from a quality/implementation perspective, they both have the same goal of upscaling to gain back performance.

        • An important note there would be that with the 3080 you can use both, but on the 6900XT, only FSR.

          • @foxpants: In theory, should Nvidia cave to demand and implement it on their cards.

            • +1

              @Timboj: It already works on Nvidia. It's mostly about game developers implementing it now.
              For that to change, Nvidia would have to actively update their drivers to make their cards not play nice with FSR, which would not seem like a very good marketing strategy, but they did it with hashrates, so who knows I guess.

            • @Timboj: FSR will run on almost any card. I tried it on an old GTX 970 with Terminator Resistance and GTA 5 - they are some of the few games with it at the moment. It's basically free frames.

              • @Forth: I misspoke; to get the most out of the tech, Nvidia would need to optimise their drivers for the implementation (not switch it on, per se). Their willingness to support it remains to be seen, but I expect that will come with market pressure.

        • the 6800XT also beats the 3080 in a lot of ways

  • +14

    P.L.E ! P.L.E ! P.L.E ! P.L.E ! P.L.E ! P.L.E ! P.L.E !

  • -2

    lol, 2000 bucks, no thank you

  • +3

    I don't think I want to upgrade from a 2080 Ti to a 3080 - not sure if the jump is big enough.

    • I have a 2080TI too, have been looking at all the new cards but not sure…

      • +8

        I'm waiting for the 40xx series. The 2080 Ti will last us a while yet. Plus there's nothing much out that's worth upgrading for.

        • +1

          2080 Ti is still a beast at 1440p high refresh and 4k60. I'd personally wait another generation too. Rumours suggest the 40XX and RDNA3 series will be another huge jump in performance.

            • @Yuri Lowell: Only worth upgrading to the 3000 series if you need HDMI 2.1, which the 2000 series does not have. To be honest, it's not worth the scalped prices for the small performance upgrade unless it's below RRP.

              • @Dogfight: Yeah, I kind of do need HDMI 2.1 since I have an LG C9.

                But the Cable Matters DP 1.4 to HDMI 2.1 adapter will have to suffice I think, I just can't get VRR working :*(

                It is a decent price for the 3080 from TechFast though - I paid $1649 for the 2080 Ti 2 years ago, so that $1849 price doesn't seem so bad.

                But I also have my 2080 Ti on an EK waterblock, so it would require a new EK block.

              $1849 is ok, but too much hassle and money for not enough gain for me personally.

          • @Yuri Lowell: Yeah, it's not bad even at 4K 120fps, which is what I'm playing Days Gone at - getting about 90-110 fps without overclocking the card.

            But yeah, any ray tracing would pretty much bring it to its knees, with 30-60 fps.

      • +8

        That's just plainly not true. Why would you just come on the internet and lie like that?

        • -1

          It makes so little sense. If going from a 5700 XT to a 3080 was underwhelming, going from a 2080 Ti would be even more underwhelming. Plus there is no such power delivery 'nerf'. But I can see that if you already had a 2080 Ti, the leap probably wasn't worth it, depending heavily on personal preferences and performance requirements.

        • -8

          Sorry buddy, if that's the level of effort you put into stuff, it's no wonder you haven't got a 3080 yet.
          I do understand that you can only be angry at those who got to live your shitty dream & found it to be just that.
          I didn't mean to burst your bubble, but I wasn't actually seeking your readership or ridiculous opinion (you can tell by the way I was replying to someone else, whose thread you jumped in on).
          Simply tryna help a fellow OzB community member save a buck by relaying the facts, garnered through my first-hand experience. No truth or lies required.
          Why do you come on the internet & lie to yourself?
          Actually, I already covered that. Apologies.
          Carry on.

          • @Unsafe: Not sure if you're addressing me, or dsiritz, but I have a 3080, got it at launch so … jeez about 10 months worth now. I guess my bubble is burst? Sorry to say it's fantastic, no power nerf, and a good chunk faster than a 2080Ti, facts.

            • -1

              @foxpants: I was addressing dsritz, guess u didn't read or understand the context of the question I asked in my reply to him.
              No worries though, using this site as intended, would be my advice to you, as I did.
              Someone asked a very specific question. I answered, using my very specific 1st hand experience, that's more than likely quite rare, to save them dollars. These are facts.
              If you have 1st hand experience, or anything other than argumentative opinion (with which, I guess, you aim to bolster your purchase decision, to yourself), state it & make sure it's relevant to the question asked. Else you are helping nobody but yourself.

              • @Unsafe: I am so very happy your series of terrible replies is now a matter of public record.

                Enjoy your 2080 Ti buddy; after all, it's not that far behind a 3080, right?

    • +1

      1070ti here. Although it is still capable for daily use, it’s very slow compared to a 3070/80 or literally anything modern.

    • I've been trying to nab a 2080ti because I see them pop up on gumtree for sub-$1000 every now and again. Absolute beast of a card, best value for money second hand card out there (especially when people want $700 for a 2070 or $1500 for a 3070).

      Which is why I wouldn't recommend selling it, the secondhand market is crap for them. I've seen a few sit on gumtree/facebook for a while at $1.1k when 3070s were $2k new, doesn't make any sense. But then people aren't sensible.

    • Unless the rest of your setup is already godly and the money is burning a hole in your pocket, skip a generation. I'm in the same boat and it makes no sense to upgrade GPU at these prices IMHO.

      If I could get a 3080 for 1.9k and sell my 2080 Ti for 1.1k or so, that'd be 800 bucks for a 10% performance increase. I reckon for most builds there are better places to spend $800. For me, with a 9600K, it'd cost way less than $800 to get 10% better performance via the CPU upgrade route. I'd also much rather get a 4K monitor so that I actually put the 2080 Ti to work. Etc. There's just no angle where I see my GPU being the place I want to spend $800 at the moment.
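
      As a back-of-envelope way to see that, here's a minimal sketch in Python. The GPU-route figures are the 1.9k, 1.1k and ~10% quoted above; the CPU-route prices and uplift are placeholder assumptions, not quotes.

          # Quick upgrade-value comparison; all figures are rough or assumed.
          def dollars_per_percent(new_cost, resale_value, uplift_percent):
              """Net out-of-pocket cost per 1% of performance gained."""
              return (new_cost - resale_value) / uplift_percent

          # 2080 Ti -> 3080, using the figures quoted above.
          gpu_route = dollars_per_percent(new_cost=1900, resale_value=1100, uplift_percent=10)

          # CPU upgrade route, e.g. replacing a 9600K; prices and uplift are assumptions.
          cpu_route = dollars_per_percent(new_cost=450, resale_value=150, uplift_percent=10)

          print(f"GPU route: ${gpu_route:.0f} per 1% gained")   # ~$80 per 1%
          print(f"CPU route: ${cpu_route:.0f} per 1% gained")   # ~$30 per 1%

      Crude, but it makes the "better places to spend $800" point concrete.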

  • +13

    The budget for my last PC build, back in 2013, was $2000.

  • +8

    I have a 6800 XT MERC and it is a great card, but a warning:

    This is a thicc card.

    2.8 slots, 340mm length.

    Check your case for compatibility before buying.

  • I have the 6700 XT version of this card, and it is good but HUGE.
    It didn't fit in my Node 804 at all; fortunately it fits snugly in my SG11 SFF case.

    Be mindful that this card is a beast.

  • They have discounts on the 6700 and 6900, why not the 6800? Come on PLE, let me pull the trigger.

  • +1

    I will wait until these fall to around $1500… won't be long I hope.

    • For some reason the 6700 XT and greater variants seem to be holding relatively steady price-wise.

    • +7

      Paid $1549 for my 6900 ($1049 for the 6800) in December, so price should fall below that at some point.

      • reference or AIB?

        • Both XFX cards, bought before the crazy price spike.

          • @xuqi: It's hard to believe an AIB-design 6900 XT at $1549, even pre-COVID, unless it was a reference XFX.

            • @wxwsf: Models:
              RX-68XLALBD9
              RX-69TMATFD8
