• expired

Gigabyte Radeon RX 9070 XT GAMING OC 16GB GDDR6 GPU $1296.25 ($1225 with eBay Plus) Delivered (Excl. NT) @ smarthomestoreau eBay

740
FYMY15 / FYMY20

Original Coupon Deal

Excludes Northern Territory, WA Remote.

Shopping Express is blessing us with a humongous $20 discount.
Stay tuned for the overpriced 9060 XT 8GB/16GB announcement tomorrow!

Edit: 5060 8GB just dropped: $560, what do we think?

Related Stores

eBay Australia
Shopping Express

Comments

    • +10

Which part specifically? If you mean the background noise cancelling in your audio, then yes, they have that.

• Been using this since switching to an RX 6800 from a 2060 Super with no issues. One thing I miss about GeForce Experience is that instant replay is more seamless than AMD Adrenalin: if I save a 90s clip and then save another 15s later on GeForce, I could join the two clips and it would be seamless, whereas there would be a 1-2s gap on Adrenalin.

• I personally use it for the background blur (when not using Zoom).

• All of it, not just background noise cancelling. Also, the fact that Adrenalin still doesn't have a hotkey for toggling instant replay on/off while ShadowPlay/the Nvidia App does is kinda annoying.

    • -6

nah, I think I'll stick with my RTX 5070 Ti. As much as I love AMD cards, I'm tired of the software issues I get with them.

      • What software issues do you experience?

        • -3

Mostly new drivers. Sometimes they screw up a driver update — not every update, just some.

          • +11

            @kungfuman: Google “nvidia 5000 series driver issues”

Basically every update lately has been screwed up by Nvidia. Even my 3080 has had issues appear out of nowhere and I had to roll back.

            • -3

@[Deactivated]: never had a single issue. I feel those people that have problems don't do clean driver installs.

              • +6

                @kungfuman: The good old sample size of one and some feelings.

                I could link to nvidia driver update notes or even developers telling gamers to roll back drivers, but your feelings obviously trump that.

                • -1

                  @[Deactivated]: well I haven't had an issue so…… who has been having issues?

                  • +1

@kungfuman: I've had issues with the 50 series — upgraded to a 9070, couldn't be happier.

• @kungfuman: anything past 566 will randomly give me a long flicker to black and back until I restart the PC. Disconnecting and reconnecting the DP cable doesn't fix it, and googling shows I ain't the only one. This is with a 4070; no issue when I roll back to 566 though.

      • +1

        Lol AMD drivers have been fine for years, wth you on about.

    • Never heard of it, tried the studio voice thing just now, sounds like absolute ass, like an AI voice changer

  • +8

    5060 8GB, let it rot to sub $400

    • +21

      I cannot support NVIDIA as they keep acting in bad faith

      Evidence from gamers nexus and jay2c

      They also wouldn't let reviewers test 5060 before release

      • +1

        Good for you.
Unfortunately, gamers don’t matter to Nvidia right now, and probably won’t for the foreseeable future as long as AI is around — or even beyond that, considering they are looking at robotics as their backup in case the AI bubble pops (which likely won’t happen, or won't be that bad, considering AI is actually more useful than gambling on cryptocurrencies).
Seriously, all Nvidia gets from gamers is measly patronage from sweaty basement-dwellers compared to the men in suits that will pay 300x what gamers will pay.
So unless other GPU designers can actually compete with Nvidia, or CUDA ends up being open-sourced by a higher power, Nvidia can disregard the sweaty gamers.

        • +3

Exactly. Nvidia could just pull out of the gaming GPU market and it won't even matter to them.

          People just need to accept that Nvidia is not a gaming GPU company anymore.

And if AMD had the traction in the AI space (they are working hard towards it), they would also abandon/care less about the gaming GPU segment.

They may once in a while release some gaming GPUs, just like how Toyota occasionally makes enthusiast cars like the Yaris/Corolla ZRs, but their main business is stock Camrys, Corollas and RAV4s…

          • @Jaduqimon: On the bright side, they allow their engineers to cook once in a while to make those enthusiast cars, at least to sell the brand, because the car market is still fragmented.
            But with Nvidia, they’re in a market that isn’t fractured at all, and where they’re the top dog anyway, so what we’re getting is mainly a bunch of scraps.
            Also, apparently a bunch of their employees are already retiring or quiet quitting because of the millions they made from their stock options when they were first employed.
            Of course Nvidia can’t pull out of the gaming market yet, but that’s why they’re building up robotics to replace gaming.
            Once that happens, they can put GeForce on the backburner and wait until either a competitor outcompetes them or a higher power upends their AI and robotics businesses.

        • +2

          Robotics isn't a backup, it's one of the goals.

          AI computational speedups have their value in simulation and 3D graphics, so that retains value for GeForce. Enterprise is the priority, but the benefits flow through to gaming.

The current impact is a combination of software development costs (the current NVIDIA R&D spike, poor VRAM optimisation by developers due to the time it takes to implement) and ongoing hardware costs (TSMC 4nm, 2GB VRAM chips vs 4GB while 3GB remains low-volume and expensive).

This in no way justifies NVIDIA's behaviour towards media and non-media partners, but there are solutions to these hardware problems that will emerge over the next 12 to 18 months (affordable 3GB chips, TSMC 3nm & Samsung 2nm, AI compression, perhaps nanoimprint lithography as an alternative to EUV photolithography).

          Over time, we may also see a technique emerge to combine interpolated frames (frame generation) with extrapolated frames (Reflex 2 in the case of NVIDIA) to produce a lower latency experience than native frames with OG Reflex, but that's probably 2-3 years from any announcement. The hope would be that such a technique would be backwards compatible to the 50 series in some form.

• @jasswolf: Robotics isn’t a backup right now, but they are indeed investing in it as part of their goals, and eventually their robotics efforts will surpass gaming (as in ‘strictly’ gaming), which will lead to GeForce being deprioritised further.
  If AMD can cross Nvidia’s moat in the PC-based content creation market (Macs are different), they can displace Nvidia in the entire consumer section, but Nvidia will still likely have workstation and enterprise customers for content creation, as well as datacenter and robotics.

            • @FujinShu: We already have real-time path tracing, and AI models for things like fluid dynamics, subsurface scattering, texture and asset compression, and frame rate acceleration technologies, so there isn't exactly a huge amount of work to do in pushing real-time graphics forward.

              In 5-7 years we're hitting a pretty big plateau, the rest will just be networking up shared experiences, maybe offering ultra-low latency cloud gaming.

              • @jasswolf: I do foresee Nvidia pivoting their GeForce brand to primarily cloud gaming and single-unit devices, while also selling their GPUs and other computing devices at a high markup through their enterprise branch.
                The innovations that Nvidia made for game graphics can be translated to animated media such as films and TV, as well as digital sets, VFX and more advanced CGI that can be done faster for “live-action” media.
                Hardware-accelerated path-tracing will cut render times down further, allowing for more iterations or a faster time to market, while neural rendering allows for more detailed objects to be used in production.
                Eventually, AMD will be the undisputed winner of the home console and x86-based “gamer” market, while the others will find different niches that can’t be touched by Nvidia as easily.
                Perhaps the next indicator of promising technology will be which hardware designer Nintendo partners up with for their next system.

                • @FujinShu: I don't necessarily think that they'll stop selling chips for consumer use, but I can see a scenario where some games require cloud access, and that may incur a charge (though likely through a game subscription or third-party cloud service).

                  MMOs and other online open world games stand out for that, various metaverse platforms will also rely on such things for large scale environment simulation and rendering.

AMD have the console market, but haven't really made any inroads into mobile, nor shown any results in cloud. NVIDIA are still primed to dominate; they'll just be accepting thinner margins in a bidding war against AMD, while maintaining the ability to price higher due to their software stack and support.

NVIDIA will continue to dominate the dGPU market so long as they keep leading the push they've been making; it's just that the dGPU market is going to continue to stagnate.

                  • +2

@jasswolf: NVIDIA has made their VRAM so expensive that stacking Mac minis becomes a more economical choice for a local LLM setup. Couldn’t ever have imagined that.

                    Wish some anti-trust stuff could hit NVIDIA hard for even just once.

                    • @DevilsInDetails: In what sense? You've got the 16GB 5060 Ti if that's the road you want to go at the moment, but DGX Spark is working its way down, starting at a 5070-equivalent with 128 GB, but that's probably in the range of $3000 USD.

                      There's a MediaTek SoC coming that might have the option of being mounted in a mini PC with a variant of GB206.

                      • +1

                        @jasswolf: Oh my bad. Didn’t notice 5060 Ti was released.

                        Would be interested to see the MediaTek variant.

                    • @DevilsInDetails: Unfortunately, American companies are protected right now from anti-trust.
                      The FTC and DOJ are completely against anti-trust, and if any other country files an anti-trust lawsuit, then America will just cripple that country.
                      Right now, the best hope I have is in China (or Europe) suing Nvidia for anti-trust, and because they have a military and all the manufacturing power of the world, hopefully America will actually listen.

                      • @FujinShu: China already did (and to Google, too), and ended with happily nothing. Europe could give us more hope, but I don’t think NVIDIA would really give a f**k to it.

                        • @DevilsInDetails: Nvidia would, especially when China effectively banned H20 imports due to environmental regulations.
                          Nvidia was arguing with the President about allowing H20 exports from the US to China, but then China tried to rip the rug out from Nvidia anyway once they felt their domestic AI chips were good enough to stop relying on US tech.

        • +3

This is the answer. Nvidia made 7% of their revenue from gamers last quarter; 92% was from AI, and "other" made up the remaining percent.

AMD and Intel keep making unforced errors, so Nvidia remains the best, if expensive, choice for most.

  • -1

    to hodl or not

  • +7

I have this card. Draws about 330W stock, 360W with the max power limit. I believe it’s the only “base” model card to do this. The cooler can keep up with the higher PL but will be quite noisy due to the 90mm fans (or quiet but toasty; still within safe margins).

    • Did you do the undervolt on it as well? What sort of perf increase were you able to achieve?

      • +1

Mine was unfortunately a dud in terms of UV; it only managed to be stable at -75mV. In the games I play, this translates to about 3050-3150MHz in-game clock speed. Haven’t tested the memory yet.

For reference, the card will easily do 2900+ just with the PL bump.

        • What power consumption does that correlate to?

          • +2

@Wicko: That was in MH Wilds. Power consumption does not change with a UV; it stays the same. So at stock it will always be around 330W, and at 110% it will always be 360W, regardless of the UV value.

            I just did a quick re-test using Horizon Forbidden West as everyone seemed to be getting better clocks and I was curious. This is a quick and dirty 5 minute test, standing still in Plainsong and staring at the same spot. This is the same area that HWUB uses I believe.

Everything stock (~330W), I was getting low-mid 3000s. Bumping the power limit up to 110% (~360W), I was getting low 3100MHz. UV’d at -75mV with power draw at ~360W, I get 3250-3300MHz. It’s getting better clocks than MH Wilds.
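The power-limit figures above imply a simple linear mapping from the slider percentage to board power. A minimal sketch of that arithmetic (the 330W stock figure comes from the comment; the linear percent-to-watts mapping is an assumption, and real draw varies with load):

```python
# Sketch of the power-limit arithmetic described in the comment above.
# STOCK_TBP_W is the poster's reported stock board power; the linear
# percent-to-watts mapping is an assumption for illustration only.

STOCK_TBP_W = 330.0  # reported stock board power for this card, in watts

def power_target(limit_percent: float, stock_w: float = STOCK_TBP_W) -> float:
    """Board power target for a given power-limit slider setting."""
    return stock_w * limit_percent / 100.0

print(power_target(100))  # 330.0 (stock)
print(power_target(110))  # 363.0, close to the ~360W the poster measured
```

This matches the observation that an undervolt doesn't change the power target, only the clocks achievable within it.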

    • Are you happy with the card? Any whine?

      • +1

        I've got this card. No discernible whine. Only a minor undervolt on mine too.

        Now we wait patiently for more FSR4 support…

        • Are you able to override FSR3.1 games to FSR4 using adrenaline or some other way?

          • +1

@zachhambo: I haven't bothered trying to enable FSR4, as the games that I play run natively at 4K, although I’d say the brief experience I’ve had with FSR3 at 4K is pretty good.

• @zachhambo: Tbh I've only really been playing Cyberpunk since I got the card. Need to use OptiScaler to enable FSR4, but I'm pretty blown away by how well it works/looks for a third-party solution.

            I had a quick look at Space Marine 2 but that had native FSR 4 and it looked phenomenal.

• Mine has quite a noticeable whine. It’ll differ from card to card. That said, the fans get louder than the whine, so imo the whine is not an issue.

  • +17

    I asked my Mrs if I can buy this and she said no. 🤨

    • +2

You are doing it wrong… buy it first and then deal with the consequences later. Maybe in your case the benefits outweigh the consequences.

One tip… if you're saving for a kitchen or bathroom renovation, don't buy anything without approval.

      • +2

Saving for a Euro trip — does that matter more, or a new GPU?

        • +8

Euro trip… you never know if you'll ever be able to do it again.

          • @jlogic: Especially with these rotten prices, don't buy unless you have to.

        • +3

          Euro trip all the way, gaming is fun but travel is better

        • and when in Europe, she points at the LV or the Hermes, it is going to be a NO from you too?

        • +2

          As a middle aged man, let me tell you from experience, that travel feels hollow if your PC doesn't have a great video card in it.

        • +2

          Save up for a 5090 and see the sights of Europe by playing Euro Truck Simulator in glorious 4k 144hz Max settings

      • +1

        "It's easier to ask for forgiveness than for permission"

    • Man up.

      • Teach me your ways master and I'll be in your debt.

    • +7

      I asked her again and she said yes

      • +3

        Wow! How dare she say no to me!

        • You shouldn't have asked

• Do you work? Can you afford to purchase it without struggling financially? Do you need it in the sense that you will get use out of it?

  If you've answered yes to all 3, then purchase it. I still don't understand why you need approval to buy things with your own money.

      • +3

        You clearly didn't get the joke.

        I have a 6750XT that does the job and more and I was just being silly.

    • +2

      Just say your finger slipped. Works in many scenarios

      • haha I like the way you think. Will report back.

  • This card looks like a perfect candidate for SFF build.

    • +5

The PowerColor Reaper is smaller and only requires 2 x 8-pin power connectors… it's also the only 9070 XT that would fit in my ~8L ITX case

      • +2

According to PCPartPicker, this Gigabyte is 1mm shorter than the Reaper…

        • +9

          And the Gigabyte is 5mm wider and 14mm taller…

  • +5

    https://www.gigabyte.com/Press/News/2272

I think Gigabyte cards have thermal paste issues. Just FYI.

    • +4

That page literally explains that there are no issues, just mess from excess paste; it was probably written up after they received some complaints.

  • +1

    Come on, let’s get that Sapphire down as well.

I’ve got a 5070 Ti on order but I’m pretty tempted to go for a 9070/9070 XT…

    • +1

      After going from a Sapphire 570 to a Gigabyte 6700XT, I won't be sticking with Gigabyte. So much coil whine, and the fans are shite.

  • +1

    2x DP / 2 x HDMI FYI.

    • +2

      Edit: 5060 8GB just dropped: $560, what do we think?

First of all, it's $569 for the single-fan model and most 2-fan models start from $599. I can't believe a low-end 8GB VRAM GPU is priced like this ($400 is a fairer price for 1080p gaming), and you don't even need to wait for reviews to know it's trash at the price it's asking. You have been warned.

      • Haha I know, the 2 fan one is $565 on eBay though, still terrible.

        • Oh you mean this one?
          https://www.ebay.com.au/itm/127123142170?

Yeah, my bad. I didn't know it was listed on eBay already and was looking at PCCaseGear prices. But it's still trash for that price.

I also don't think the 9060 XT will be much better. Even budget gamers are screwed.

        • +1

          $100 more than a B580, will likely offer similar performance at 1440p, while falling apart completely in cases where 8GB isn't enough.

          And depressingly it will probably outsell the B580 100:1.

• @JBark: https://www.pcgamer.com/hardware/live/news/nvidia-rtx-5060-r… seems to be 40% better than the B580 at 1440p upscaled, 25% better than the 4060.

            • @zachhambo:

              seems to be 40% better than b580 at 1440 upscaled, 25% better than 4060.

I wouldn't trust PCGamer, who are skewed towards showing better performance.

Hardware Unboxed reviewed it and it's barely better than the B580 @1440p,
although similar to a 4060 Ti @1080p. Why would you pay $700 RRP to play @1080p and be gimped at 1440p? Hell nah.
              https://m.youtube.com/watch?v=2e1a2-VxxvQ&start=692

• @Bang1: Wow, and they had to waste time reviewing this waste of sand while at Computex in a motel

• @zachhambo: Edit: sorry, my bad — RRP is $599. But thank god for the reviewers showing the misinformed to stay away from terrible products.

• @Bang1: Ha, nailed it. Average framerate across 18 games at 1440p: the 5060 is only 3fps higher than the B580, but 1% lows are 3fps worse. All for a card that's roughly 20% more expensive, though I suspect 5060 prices will fall rapidly.

  • +1

    Literally just paid $1349 for this an hour ago… refunding online order now…

    • +1

      You should have "hold"

  • +1

Haven't been following PC hardware for some time… when did Radeon change their product numbering scheme to copy Nvidia's lineup?

Shouldn't this be the RX 9700 XT? And Nvidia would have an RTX 9070 some time later?

  • +1

    Really hoping for one of the XFX cards to drop in price

  • +1

Sapphire Pulse is $1249 from the same seller and the Nitro+ model is $1375. Honestly, I’m tempted to pay extra for the Nitro model; I have the Nitro 7900 XTX and it’s been faultless and super quiet.

    • +4

I imagine the Pulse would be as well?

At $1375 I think the 5070 Ti is a compelling alternative…

That said, at these prices the 9070 is much better value. Even though I play at 4K, I’m wondering if spending less on something like that is the go.
