• expired

ASUS TUF Gaming OC Radeon RX 7900 XTX 24GB GDDR6 Graphics Card $1479 Delivered ($0 MEL C&C) @ BPC Tech

470

Deal updated
Free shipping on ASUS graphics cards until 30th Jun 2023
NVMe bundle has finished and been replaced with a straight lower price on the card
All text below is the original deal


Panic stations at ASUS based on the flash X670 deal from Amazon, and now this bundle + free shipping on all ASUS GPUs

Drive is $79 and has been posted before, so the effective card price is $1520

Note: BPC are also offering free shipping on all ASUS GPUs, as seen here

Free Shipping ALL Asus Graphics Cards! Limited one per customer!
From: 17th May 2023 To: 31st May 2023

However, the free shipping and even standard $5 metro shipping have not been activated on this bundle yet, so recheck tomorrow if you are after delivery

Related Stores

BPC Technology

Comments (closed)

  • +10

    It's panic stations at ASUS because of their f'up with the X3D chips and drivers. In case people don't know… https://m.youtube.com/watch?v=cbGfc-JBxlY
    A LOT of fallout has happened since then: losing sponsor deals, fans, loyal customers. So yeah… they should be panicking. They're in damage control right now.

    • +6

      That video was more about their motherboards failing to cut voltage to shorted pins and causing CPU meltdowns, from memory. Haven't seen anything about their GPUs or 3D or anything.

      In any case really bad time for them to be releasing a portable, I'd have thought.

      • +10

        Yes, it was about the mobos and AMD chips (X3D in particular) having voltage problems. However, it doesn't matter what product it is, the brand name is in the headlines. People hear about ASUS doing blah blah blah, and some might not trust the brand in the near future. Have they fixed it? Apparently yes. But the brand took a hit… especially saying "download our beta driver, it'll fix the problem BUT you'll void your warranty on your brand new setup costing thousands".
        They have since changed the wording, but still, pretty big hit imo

        • +1

          The only people who know about that story already follow hardware news, are subscribed to GN, Linus, HUB, etc. Those same people are also savvy enough to appreciate that these are two entirely different product categories.

          While it'll be nice for us to get discounts and permanent consumer-friendly policies, people will forget about this in a week. Asus has backtracked, and Gamers Nexus has already moved onto the new outrage that is 4060 pricing.

        • +4

          The warranty voiding was the scummiest BS I've seen from any company in recent memory.
          That, and just deleting the supported CPUs from their troubled BIOS versions without any kind of note about the change; that was also f**king scummy.

      • +1

        It happened on their HIGH END motherboards ALSO and they gave the customers a special F#CK OFF WARRANTY

    • +1

      Unlimited power

      • BLAST THE VOLTAGES!

    • "panic stations at ASUS"
      Motherboards yes and warranties, not the video cards sir. lol

  • +36

    How's the warranty on these?

    Just kidding.

    • +1

      Lmao!! Nice one 😂😂😂

    • Good question

  • +2

    I might get negged for this, but would the RTX 4080 be the better option?
    I am currently on the fence between the 4080 and the 7900 XTX, as both their prices have come down to around the 15xx-17xx mark.

    The TUF 4080 is 200 bucks more, but it comes with Diablo IV (which I will be playing), and in general the 4080 performs better.
    What's the general consensus on these 2 cards?

    • Ray tracing performance and DLSS quality are considered to be better on Nvidia than on AMD.

      Comes down to price and which features you want.

      If you were going to buy Diablo IV anyway, then that reduces the price difference even more,
      so it might not be as hard to choose Nvidia.

      Generally I like Nvidia (cos they have better features and come out with the newer tech first, while AMD follows or doesn't have it at all)

      However, their prices are currently a rip off / highly inflated,
      and the cards they're releasing are not good value for money vs what you can get with AMD.

      e.g.
      https://www.youtube.com/watch?v=ocAi9y4n1UQ

      So you need to decide for yourself.

      • The closer question between the Nvidia and AMD high-end card offerings would be for video editing and 3D rendering, weighing cost and performance.
        I doubt anybody will be buying a 7900 XTX or RTX 4090 just to play Diablo. lol

    • +9

      From what I've seen the 7900XTX performs solidly better in raster than the 4080 but worse in RT, though with less of an RT gap than early era RT in games like Cyberpunk 2077 would suggest. Also the XTX has more VRAM though both have "enough" for the probable lifetime of the card - might be a factor for AI, which some buyers (hilariously) claim they will be using the card for.

      DLSS is the wildcard as usual since AMD's FSR has a lot of catching up to do in quality. Also, Lovelace's AV1 encoding is a factor for some people.

      • -2

        If by "solidly better" in raster you mean about +3.5% at 4K, then sure, I guess (source: TechPowerUp average FPS.)

        DLSS and RT make the 4080 a far better buy for only a little more cash.

      • yeah the XTX has 24GB of VRAM, that's the upside vs a 4080 which only has 16GB.
        I read some games now are hogging about 20GB of VRAM, not sure how future proof a 4080 will be.

        But with the 4080, DLSS seems to provide a higher frame rate than an XTX with FSR.

        I just kinda want to play D4 at 4K with 100FPS, not sure if the XTX can reach that.

        • -1

          Some games allocate as much VRAM as they can. No game actually needs 20GB of VRAM today.

        • Testing seems to show that FSR and DLSS deliver about the same performance increase - the issue is that DLSS consistently delivers a superior image (source: Hardware Unboxed https://youtu.be/LW6BeCnmx6c, https://youtu.be/1WM_w7TBbj0).

          Also, Diablo IV is hardly a graphical powerhouse. I was able to get smooth performance on my 2070 Super at 1440p during the server slam so I'd expect even a 3080 would be able to deliver close to 100fps at 4K. Either the 4080 or the XTX would be overkill.

          • @dudleydoright: IIRC I was getting about 100-110 using a 3080, but wasn't running 4k. 3840*1600 (75% of 4k res).
            Pretty sure a 4080 or XTX will be fine for 4k100+fps maxed out.

            • +1

              @DeToxin: They'd probably get closer to 200fps. But really who the hell would get either card because they wanted to play Diablo IV in all its glory?

              • +1

                @dudleydoright: Well I agree buying a 4080 for d4 is overkill but they will probably play plenty of other games too.

                • @DeToxin: I am currently using an RTX 2060 and used that to play the D4 beta.
                  I will be sinking a lot of time into D4 and it's mostly the reason why I want a new GPU.

                  It's probably overkill getting an XTX or 4080 just for it, but I want to experience a triple-A game running at 4K 100+ FPS.

            • @DeToxin: are you using an AW3821DW?
              If so, I have the same monitor, but I only have an RTX 2060 so I was running at a weird res (2845*1600); it was not doing 100 fps and it sounds like a jet engine.

              Hence I am thinking of upgrading my GPU. I will be happy if it can do 3840*1600 at 120fps. Do you think an XTX or even an XT can do that?

              • +1

                @STHD: I have the same screen.

                I played both betas with a 6800 XT. On medium textures it would be ~120fps but there were definitely frame drops in certain areas; in town it dropped to like 60fps and other busy areas would see spikes up and down. Definitely playable. Bumping to Ultra textures was a no-go: it ran out of VRAM and was jittery.

                Upgraded to a 7900 XTX for D4 mainly to have breathing room for the Ultra texture stuff… Will wait and see, but from other sources it was buttery.

      • +1

        "worse in RT, though with less of an RT gap than early era RT in games like Cyberpunk 2077"

        No it's getting worse as ray traced workloads become more complex, with the ultimate example being pathtracing, now in Cyberpunk with the RT Overdrive mode. For that, RDNA 3 is dead in the water. RDNA 4 might be a different story, but it's also 18+ months away.

        RDNA 3 is fine with very light raytracing workloads, where it will perform either below or on par with the RTX 30 series, but when you get to features like RTGI it will fall away hard due to all the ray bounces.

        Given how quickly this is ramping and how far RDNA 3 performance fell short of internal expectations, this generation is a bust for AMD unfortunately. If RDNA 4 isn't hugely focused on bumping lower latency cache hits, RT performance, and ML accelerated upscaling and frame generation, it's going to be another bust too and Intel are going to start moving into the #2 spot.

        I get that some people don't think these technologies are super important, but games are on the precipice of RT and PT being the basis of their lighting engines, rather than something tacked on later. Fortnite is the first, and clearly any new UE 5.1 game from now on will have it built in, but everyone will follow as it makes lighting work in games so much easier once it's bedded into the engine.

    • +4

      AMD cards offer more performance per dollar than NVIDIA, and on top of that NVIDIA seems to think a weaker card with less VRAM but extra tech justifies the extra cost.

      • +1

        While I agree, Nvidia has some advantages on the hardware side with their tensor cores.

        Also, software wise Nvidia has a bit more to offer when considering DLSS, their video upscaler, NVENC, and Nvidia Broadcast.

    • +4

      want raw 3d gaming power? go AMD

      otherwise you can choose NVIDIA if you're fine with less VRAM, fake frames and fake resolution powered by AI

      also, ray tracing, apparently? had a laptop with a 3070Ti. turned RT on.. turned RT off.. never touched it again

      • +1

        RT has basically been in beta since the 20 series.
        Turning it on tanks performance, and DLSS doesn't look anywhere near as good as native. I turned it on a couple of times with a 3080 but haven't used it since.

    • The RTX 4080 is generally going to be a better buy than the 7900 XTX assuming close(ish) pricing. Raster performance is very similar, but it smashes the 7900XTX in RT, and has DLSS support. The only real downside of the 4080 (besides perhaps a small additional cost) is less VRAM, but I am pretty sure 16GB will be fine for a good while.

      I paid $1679 for my 4080, so only $80 more than this deal (notwithstanding the SSD). I bought it before the D4 deal started, but at a similar price it would be an even better buy with D4.

      Nvidia aren't doing themselves any PR favours this gen, that's for sure, and some of the products they are releasing in the lower/mid range are rubbish for the price. The 4080 is a genuinely good product that at local street pricing is only fractionally more expensive than the 7900XTX and performs similarly in raster (and much better at RT, DLSS and AI.)

      • did you play the D4 beta at 4K?

        how did the 4080 perform, did it lag in certain scenarios?
        Do you think the 16GB of VRAM is going to be an issue with D4 (mobs in open areas)?

        • +1

          Nah I didn’t play sadly. I missed the first test weekend and was overseas last weekend when the second one ran.

          16GB VRAM won’t be an issue with D4. There will likely be plenty of performance bugs, but it’s safe to say none will be due to 16GB of VRAM being insufficient.

      • -1

        Very similar raster performance? You mean the 7900XTX is better in everything besides gimmicks like RT and DLSS, right? You can be biased because you bought the overpriced 4080, but benchmarks don't lie. Productivity is a different story of course, but most of us are gamers here, I assume.

        • Childish attitude.

          Raytracing is a gimmick? You do realise that rasterisation at the highest levels gives poorer performance than ray tracing. It is the current and future method of rendering many, and soon most, aspects of a video game. DLSS offers fantastic image reconstruction while giving performance back, and at its highest quality it can sometimes offer better-than-native image quality.

          • +3

            @FabMan: let's face it, RT is a gimmick no one asked for, and very few people care for it now that it's been here a while.

            RT cores are present in every 3000 series GPU, but it's really only usable on a 3080 (barely so) and up. It's there just to mock consumers and force them to buy more powerful GPUs they don't need.

            Same with VRAM: the 3070 / Ti would have been a great card overall, but 8GB of VRAM murders it. The 10GB 3080 variant isn't far off either.

            NVIDIA took a few wrong turns in the past couple of years, aimed at screwing with us! Even though I can afford to shell out $$$ for, say, a 3090 and be done with it, they won't get any of my money, period. They can game on their 3090 in 4K with RT on if they like, I don't care.

            • +1

              @shabaka: "let's face it, RT is a gimmick…"
              Nope, currently there are 169 PC games that either have support or have announced support, so publishers and developers are happy to use it, plus many gamers are happy to utilise it in their games. It has already exceeded the total number of Kinect games :)

              At a certain point, classic rasterisation techniques become less efficient than just switching to raytracing. So in future instead of wasting performance on trying to improve visuals with rasterisation, making the switch to raytracing will offer better visuals and performance.

              I 100% agree that many NVIDIA RTX GPUs cannot utilise raytracing effectively, but Cyberpunk 2077 RT Overdrive works on a 3070Ti at 1440p with an average of ~40fps with DLSS performance. I think that combo of visuals and performance on a 3070Ti with only 8GB VRAM is pretty impressive.

              I have, and am happy with, my 6700XT; that doesn't mean I can't see the future of gaming.

              • +2

                @FabMan: I agree that maybe, in some distant or not-so-distant future, we'll be playing with ray tracing exclusively, but I am not going to pay $$$ to beta test this technology for Nvidia and game developers. When, instead of being the future of gaming, it becomes the very much accessible present of gaming, yeah, it will be a different landscape.

                At the moment it's not, and RT is just a trick NVIDIA pushes to sell more expensive GPUs (often crippled with not enough VRAM).

                Somehow I didn't make it into that number of "many gamers who are happy to utilise it [RT] in their games" and don't know anyone who does. I prefer smooth fps as opposed to realistic lighting and reflection handling. If I wanted that, I would just turn my PC off and head out for a walk. Wow! Unlimited polygons and realistic reflections, how impressive. I kinda like smooth visuals in my games, that's what high refresh monitors are for.

                But I guess that RT-saturated population does exist, otherwise team green wouldn't have a case for it right now, when it's just a distant future of gaming.

                • -1

                  @shabaka: If I wanted smooth visuals I would just turn my PC off and head out for a walk. wow! Everything updates at a very smooth rate. Do we need to carry on with such childish comments, or are you done?

                  Companies have been making the decision for people for years and that is the reality of business. Were people crying out for video games before video games existed? Nope, a company thought they could sell them and they did. When the first PlayStation T-Rex demo was shown, people were shocked and impressed, people had no idea a console could deliver such visuals and people were not expecting or demanding it. People weren't asking for the Wii or the Switch type gaming consoles, but they sold like gangbusters and people loved them.

                  There are many examples in gaming and outside of gaming of companies offering products people didn't demand but bought. Raytracing is next, whether you want it or not, as seen by GPU roadmaps for AMD, NVIDIA, and Intel.

                  • +1

                    @FabMan: so what? take a trip to the museum of failures, each of those products was on a roadmap but for whatever reason didn't take off.

                    at any rate I am not saying it's not the future of gaming, on account of not caring. When it becomes the present of gaming, then we'll revisit this conversation. Till then this is just speculation. Companies make decisions, it's their game. Mine is to support them or not, and that's what I do.

                    • @shabaka: "each of those products was on a roadmap but for whatever reason didn't take off."

                      You serious? The PlayStation was a failure? The Nintendo Wii was a failure? The Nintendo Switch is a failure? The only one mentioned that was a failure was the Magnavox Odyssey and that spawned the home video game industry.

                      I could look at NVIDIA, Intel, and AMD all having raytracing in their GPU roadmaps, the many studios that have or are implementing raytracing, or I could believe you, you who thinks the Nintendo Switch is in a museum of failures. I'll go with the experts thanks. Time for that trip outside to see all the polygons and realistic reflections you are talking about.

                      • +1

                        @FabMan: XD good luck. not even reading what I wrote, lol

                        • @shabaka: Read it, but I guess I went full thick head and misinterpreted.

                      • -1

                        @FabMan: If you'd take a second to hop off RT's weewee, you'd realise that PC is the best gaming platform, not for these cruddy AAA new releases, but for small indie games with small teams of developers that actually care about the game. Don't need RT for those, because they likely don't even have it. Additionally, most of the best franchises were not released now, but years ago, and are continuously supported with patches and expansions (think Hearts of Iron, Cities: Skylines, and multiplayer games with huge playerbases, like War Thunder). AAA games have been absolutely rubbish; look at what has become of franchises like Battlefield, CoD, etc… down the drain. Half the games that benchmarkers, like Hardware Unboxed, use in their benchmarks for RT… no one even plays or cares about them. The best games are not to come; they're already here, or they're ones that won't even use RT. With that said, I would recommend people invest in actually useful hardware… like hard drives, over spending 1.5-2k on a GPU.

                        • @Shootinputin89: Right, that was a near-useless comment.

                          As stated, I have a 6700XT, not powerful enough to play RT on any games, so I just enjoy old school PC games and new ones without RT. I was clearly pointing out Raytracing isn't a gimmick, it is the future of gaming, nothing about all the history of rasterisation being bad.

                          "no one even plays or cares about them"

                          Again, NVIDIA, Intel, and AMD all have it on their roadmaps. 169 games have raytracing. Epic's Unreal, Unity, and Godot have all added raytracing support; Unreal 5 uses Lumen, which has path tracing technology. Literally the major players in gaming tech, plus many game publishers and developers, are supporting raytracing. So it is their roadmaps and announcements vs you and a few in your echo chamber. Sorry, but you don't count for much in this discussion.

                          How are AMD GPU sales compared to NVIDIA? Might give an indication as to what people want.

                          • @FabMan: AMD GPU sales are pretty damn fine, considering their GPUs are shipped in every console walking out the door. But yours is a near-useless comment in itself, as a) GPU sales are down right now across the board (Nvidia has just dumped the 4070 because sales are so bad, and both upcoming 4060s are DOA), with the only truly fantastic current-gen card being the 4090 (when its connectors are not melting), b) people buy GPUs for more than just gaming, and Nvidia has the market cornered for productivity-driven tasks, c) AMD still suffers from Nvidia's marketing that AMD has driver issues. If you're trying to correlate that Nvidia's sales are somehow linked to their better RT performance… it just isn't so. People are more likely to buy current-gen cards for AV1 encoding/decoding than they are for RT.

                            • @Shootinputin89: What I'm trying to correlate is that you have no idea what you are talking about. However, AMD, Intel, NVIDIA, Epic, Unity, CD Projekt Red, Valve, Ubisoft, Microsoft, Sony, and many others all have raytracing as part of their future. Or I could listen to you about this, a nobody.

                              • @FabMan: lol, righto, chief. Time to face the reality of what gamers want. Who cares who is on board with RT if gamers are more excited over smaller indie games than whatever trash AAA game is coming next, and even if they do play the AAA game, they'll probably do it without RT. RT may be the future, but no average gamer cares about it. You keep talking about the future, that's great… in the future. If they can stop making bad AAA games. Look at you trying to act all superior though, it's pretty cringe. Also funny you say your card is not powerful enough for RT, with it being the equivalent of what's in the Xbox Series X and a little better than the PS5. Gamers don't care about RT, I'll say it again. 4K gamers who want RT and all that are a minority. Even then, it will be used in single-player games, an even smaller minority (as those who care about MP care about frames, not RT).

                                • @Shootinputin89: Acting all superior because I listen to experts over the echo chamber you haunt? Jesus, people like you are why we have problems with politics and the recent vaccines. You ignore experts, listen to the people in the forums or social media you visit, and think you've got it right.

                                  What is your point here then if you think RT might be the future? You're agreeing with me but still arguing? More useless comments from a nobody.

                                  • -1

                                    @FabMan: You're citing commercial, greedy companies like Nvidia as the experts. That is the issue. I am right. Keep harping on about the future when we're discussing GPUs of the now. You're the nobody. Imagine getting this worked up on a bargain website. Oh, and a word of advice: if you have to resort to personal attacks, you've already lost any debate or argument. I'll leave you with this, though: my card is much better at RT than yours, but I'll never use it.

                                    • @Shootinputin89: "I'll leave you with this, though, my card is much better at RT than yours"

                                      Haha, thanks for this. It tells me you have a child's mind.

                                      You are right that I do listen to greedy companies like AMD, as they make the GPUs, and RT is on their roadmap.

        • +1

          Techpowerup average FPS at 4k across their full game benchmark suite has the 7900xtx at +3.5%. So yup, very similar raster performance.

          Neither DLSS nor RT are gimmicks.

    • +6

      I've had the XTX more or less since launch, and while it's been a great card, even after 6 months of driver updates I still often get crashes/permanent freezes/greenscreens, on some games a lot more than others, and it forces you to restart the entire PC to get it responding again.

      There are one or two settings you can change to reduce the frequency of them, but it still doesn't fix it 100%, and depending on the game you might still be crashing a couple of times a week. After spending that much on a graphics card, it really sucks when my MacBook can run something like StarCraft 2 more reliably (albeit less prettily) than my $3k+ gaming PC.

      I’d get the nvidia just for reliability if you can get the two at a similar price

      • man that sucks to hear, I read about high power consumption at idle with multiple monitors on the 7900 series.
        But didn't know about the crashes.

        Did you play the D4 beta? Did it have any issues while you were playing?

        I was hoping that I could save 200 bucks by getting the XTX and get D4, which in the end would still save me 50 bucks or so.

        • -1

          My 4080 sipping 13.6W at idle right now with dual monitor 4K/60 & 1440p/144. Glorious.

        • +1

          nah, haven't played Diablo 4 sorry. Although they're both by Blizzard, I think SC2 multiplayer is a fair bit more reliable than the singleplayer I was playing around with before (and crashing a crap ton in), and D4 is a newer game, so I should think support would be decent and it shouldn't be too bad. For graphically-intense(ish) games that aren't specifically bugged I might get ~1 crash a week, which isn't as bad.

          Yeah, the high power consumption is definitely annoying as well, though I think some driver updates have made that slightly less bad now than it was before. I think the crashes are also a more common issue with multiple monitors as well, so if you're just running the 1 (I have two, 4k144 and 4k60) it might be more alright?

      • Which macbook can run StarCraft 2?

        • modern M1s certainly, just that you have to run low/medium graphics, which is a pain

          • @toxicpuffle: Thanks. I will wait for the M2 Air to go on sale.

    • +1

      I would never base a buying decision on what game comes with it.

      • +1

        ok, if that's the case.

        Which card would you get if the 4080 is 100-200 bucks more expensive than the 7900 XTX?

        • +2

          Honestly, even with my above comment about the 7900XTX beating the 4080 in raster, and me probably not using RT and other technologies, I'd probably play it safe and get the 4080 - it's just a safer bet stability, power consumption, and productivity wise. But I would be happy with both. Both are safe from all this fearmongering about 8GB/12GB cards having issues in the future, and both support stuff like AV1 encoding/decoding.

    • Moar VRAM for me plz

    • I bought a 4080 also because i play diablo :) generally been an nvidia guy given my experience with AMD hasn’t been great

    • I recently went through this question, ended up with a new 4080 last week. Was going to be a 4070Ti but then some discounts landed and the price gap between them shrunk enough.

    • I was going to get the 7900 XTX as I prefer AMD as a company, and don't care about AI-generated frames or ray tracing, BUT there's been an issue with these cards since release where they get stuck at max memory speed, using 10 times as much power as they should (100W+) at idle if you have screens with different refresh rates connected.

      It's been in every single release note of their drivers since, as a known issue, with zero comms about any ETA for a fix. Don't think one's coming. On top of that the 4080 is just a lot more power efficient overall, so if you care about power consumption or heat generation (I do, as I have a tiny ITX case, heat is bad) that's another reason to go 4080. On top of those two things (the things I care about) you ALSO get better ray tracing and DLSS and all the other stuff, at the cost of some extra VRAM headroom. I am of the view that by the time 16GB of VRAM becomes a bottleneck, the 7900 XTX as a whole will probably be a bottleneck.

  • ASUS 🤮

  • I have a 2060S and don't care much about RT. What card should I look at as an upgrade that will give me the best bang for buck? I was looking at 2nd-hand RTX 3070 Tis, is there anything good in the AMD camp for the same price? 1440p UW

    • +1

      RT aside, you also have to consider DLSS, although FSR isn't too bad and works with any GPU. AMD will give you the best bang for buck in raster; especially last gen, a 6800 16GB is a better option than an 8GB 3070 longevity-wise. I personally wouldn't get a card under 12GB of VRAM as I think the trend of higher VRAM needs is going to continue.

      • sold off my 2060, and got a 6700 (non-XT) at first, but the 10GB of VRAM was bugging me, not much headroom left playing The Last of Us it came with. Sold off the 6700 and got a used 6800 16GB (non-XT) for $500. Mind = blown. On a 4K monitor there's nothing that I cannot comfortably play native, forget about FSR

    • +1

      Right now, in that tier and price range, I'd imagine waiting a month or two for new cards to hit the market would be the best move…but otherwise if you can get your hands on a good 2nd hand 6800XT deal, probably a better buy.
      AMD have always, for as long as I can remember, been better bang for buck. In the past it's been drivers and software that's been their bane, but since…Polaris? they've been working very hard on rectifying this.

      • thanks - i'll watch out for a 6800XT deal

    • 6800 xt will do you fine for around $800

  • -2

    ASUS wrecked

    MSI hacked

    What’s next?

    The power of Asia is waning

    • +3

      EVGA making a come back?

      • -1

        EVGA is overrated

    • +2

      Voodoo 3Dfx making their move

    • Someone will discover Jensen's jacket is actually pleather, then the whole empire will come tumbling down.

      You'd think defective products, denial and clumsy PR were something new to the computer parts world, the way these YouTube drama llamas are harping on about it.

    • Kyro III GPU

  • +1

    Is 24GB enough?

  • +1

    What's with these cards being over 3.5 slots?

  • +3

    Why are people only pissed at them now? Why not when their motherboards went from $300 to $600-$700?

    I would've drawn the line then, the current situation seems extremely trivial in comparison. I could buy a 5800x for $350 anyway.

  • This or $1000 on a used 3090?

  • Waiting for 7900 xtx sapphire to go 1400ish, although after seeing some of the wattage it uses idk anymore

  • +1

    Go away Asus

    • +1

      I'm not one to stand behind a brand for whatever reason, but I don't see why so much hate? Why are you folks going so hard on them when other companies have done similar if not worse things before? Is it just because the ASUS issue is current and y'all saw it on the news, so you directed all your anger towards them?

      If your aim is to buy brands that 'didn't ever mess over consumers', I doubt you would be able to find any alternatives for most electronics or really most consumer products out there, mate. All of them have been or are doing dodgy things one way or the other; they either haven't been found out yet or the news is already past season.

      This is a good deal for a 7900 XTX and that's that. Heck, if you want to direct your anger, Nvidia is the bigger fish you should go for, as they are the main reason a 7900 XTX for 1.6 grand is considered a good deal nowadays and not 1k where it should be.

      • +1, the warranty issues from HWUB/Nexus don't even apply in AU, we have way stricter consumer laws and I recently had my motherboard RMA'd without any issues. People blow things way out of context

        • +1

          It doesn't matter that our consumer laws protect us against such a problem; the point is a multi-billion $$ company should never treat their customers like that and should have much better QC over their products. Nothing should leave the factory broken, with bent pins, or even with bad drivers.
          You're spending thousands, you'd expect your s**t to work properly.

  • 4060 Ti 8GB and 7600 8GB benchmarks leaked

    The 7600, with a launch price of ~AU$479, struggles to beat the $340 6650 XT and won't match the $436 6700 non-XT

    • +4

      Both poor choices, low-end cards peddled as mid-tier.

    • Do we know the launch price of the 7600?

  • +1

    "Panic stations at ASUS based on the flash X670 deal from Amazon"

    Nope, the ASUS Prime X670-P WiFi-CSM is a low-tier board. Even at $200 it's not worth it.

  • +1

    Hopefully the Ally gets a price drop :)

  • Deal updated: NVMe bundle has finished and been replaced with a straight lower price on the card
