Back again, seems like GPU prices aren't going up after all.
Thanks PGrid for alert
Surcharges: 0% for bank deposit, Afterpay & Zip Money. 1.2% for VISA / MasterCard & PayPal. 2.2% for AmEx.
Free shipping excludes WA, NT & remote areas.
Will this be a decent upgrade from an RTX 3070? I have a Ryzen 3700X. Love my GPU but it only has 8GB of VRAM… ray tracing isn't being used as I play CS and AoE. Am still on AM4.
The 16GB of VRAM will help me with running AI models on my PC, so I'm looking for a new card.
Probably good for games, but it's a bit of a side grade. The 16GB of VRAM definitely helps in some games, though. I upgraded from a 3060 so I felt the difference.
It's 20-30% faster than a 3070, not really a game changer in that regard.
Look at whether the AI models you're running actually have good AMD support. A lot of the stuff I tried either had poor support or simply didn't support AMD at all. Even a workaround like ZLUDA is still very far from viability.
Yes, it will be great for 1440p gaming over the RTX 3070, especially with the 3070 getting choked in some games at even 1440p because of VRAM.
As the other comment says, yeah ZLUDA is there but is a pain to set up and use. AI stuff will work but you have to be patient; imo if it's AI-driven, save the hassle and go with Nvidia lol.
But yeah, for raster performance in games it's really good and has good UV/OC potential! :D
Your 3700X is also getting a bit old since you're playing CPU-heavy games. If you want to stay on AM4, try for a dirt-cheap 5600, which funnily enough should be similar in performance to the 3700X but imo performs a bit better in gaming situations. Tbh though, for the games you play I'd 100% try a cheap AliExpress AM5 motherboard with a 7500F and dirt-cheap 16GB of DDR5-5600 RAM from somewhere.
It's not a very good market at the moment except for GPU deals in Aussie, aye lol. xD
I wouldn't bother upgrading just to play those games. But if you want to dabble with AI, then it's a decent choice (and is why I bought one). It's a bit gimped/slow vs the 9070, but it's still decent, especially with the 16GB of VRAM (and thus fine for dabbling, where the lower power draw/heat output is also handy). The Intel Arc B580 is a cheaper option, but with "only" 12GB of VRAM it's not quite good enough IMO (if you have the money to spend).
One other thing you may not have considered is media hardware acceleration. In this regard, AV1 decode is good to have for future proofing - but it turns out the NVidia 3000 series has that, so you aren't gaining in that regard. The 9060 does have "newer" AV1 encode etc, but from my googling it seems AMD is usually worse than Nvidia and Intel [1], but they did apparently make good improvements in this generation. I plan to test it myself, but haven't got around to it yet (the 9060 has been sitting on my desk for 3 days now…)
[1]: https://rigaya.github.io/vq_results/ - which shows the rx7900xt having appreciably worse performance compared to the rest - and the rx7900xt is the newest AMD GPU in the data at this point in time.
If running AI models 100%, though, it's not always as simple with non-Nvidia cards (I don't run LLMs on my 7900XTX gaming rig; I use different machines with a 3090 / 4060 Ti / 5060 Ti for that).
For gaming, I wouldn't even consider it. I have a 9800X3D machine with a 3080 10GB that we won't be upgrading yet, as the performance difference, unless you're spending big money, just isn't worth it.
If running AI models 100% though its not always as simple with non-nvidia cards
That's a good point. I'm not up to date on the AMD side of things, but I've seen that with Intel GPUs: whilst they might have "official support" for certain models, it's often only via their specific fork of some other project on GitHub, and of course their fork is like 6 months behind in updates (but working with the model you might want).
But on a more philosophical level: if people like myself didn't buy FX-8350 CPUs, then you wouldn't have your Ryzen. So sometimes you need to do the needful.
Many deep learning models are CUDA-only. But for LLMs, Vulkan support in llama.cpp has come a long way.
You definitely still pay a penalty for using Vulkan over CUDA, but models like gpt-oss:20b are surprisingly viable on a 9060XT 16GB.
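For anyone wanting to try this themselves, here's a rough sketch of getting llama.cpp's Vulkan backend going; the model filename is a placeholder for whichever GGUF you actually download, and exact flags may differ by version:

```shell
# Build llama.cpp with the Vulkan backend enabled
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# Run with all layers offloaded to the GPU (-ngl 99).
# "model.gguf" is a placeholder, not a real filename.
./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
```

Vulkan works out of the box on the AMD drivers, which is the appeal here: no ROCm or CUDA-translation setup required.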
Thanks all. Great insights. Did some research on AMD cards and issues with running AI models, so will pause for now. Also need to think about whether I should get a 5600X or 5800X3D CPU first so I can be on the best AM4 chip, since they are relatively cheap. That way the new GPU won't be bottlenecked by the 3700X.
AM4 X3D cpu won't be cheap (unless AMD release some new ones).
Check out some 2024 era benchmark charts for the 9800X3D. The 3700X does 68 FPS playing Dragon’s Dogma 2 at 1080P high.
Consider the options:
* Spend a few hundred $$'s now on AM4, and then another few hundred $$'s in 2 years' time on anything.
* Spend a few hundred $$'s now on a GPU, and then another few hundred $$'s in 2 years' time on anything.
Thanks for this. So if I get you correctly, your advice is to upgrade GPU first? And direct leap to AM6 later?
It would definitely give more headroom. Your GPU would be running at 100% with memory use maxed pushing 1440p, as mine was.
Got a 4060 8gb. Should I upgrade? Do I need new PSU?
Partner runs this on a 550W. If you can beat that, you're golden, Ponyboy.
Upgraded from a GTX 1060 to an RX 9060 XT 16GB. Big difference; it's pairing quite nicely with my R7 5700X.
Yeah, big difference haha. I'd imagine it'd be like 3x more frames.
Yeah, about double in some games with significantly higher graphics settings. However, I'm not getting the most out of it… PCIe 3 mobo, 16GB of system RAM… so it could be better, but happy for now.
Is this old stock? How hasn't DRAM prices affected AMD yet?
Buying RAM is prohibitively expensive => fewer people are building PCs => fewer people are buying GPUs, so yes, this is likely old stock.
No CUDA = no AI malarkey pricing.
I can only think that it's GDDR6 vs GDDR7 Nvidia is using.
I read a good point in a post the other day, which was that our slightly-improved dollar might be enough to offset RAM related price creep - but I would only think that would be for "existing stock". And that might be part of the ploy with selling this stuff at this price, in that the sooner they get this "old stuff" off the shelf, the sooner the "overpriced" new-stock is "the stock" that they can use in their budgets/forecasts/etc, which will look better on the books.
I reckon the same thing might have happened in the week before black friday discounts, when DDR5 RAM was sold "at discount".
Thank you, grabbed myself one.
F me I bought it for 1020
That seems unlikely
Oops it was the 9070XT one I bought
Damn.. dropped $30 since I bought mine a couple of weeks ago. Great card for the price and surprisingly well performing at lower wattage and temps. I use this in my eGPU case to replace my old RX6600XT. It's much better performing..
Sames-ish, coz I bought one last week for $550 from PCCG. But I don't recall them having surcharge, so that's ~$5.50 saved - and there was a little bag of lollies in the box, so +$0.50 win. Thus total == close enough lol.
Doh.. you're right. I just remembered I bought my 9060XT (for my laptop as eGPU) from PCCG lol. I mixed up my 9070XT (for my desktop) from Centrecom. and yeah, I did get the small pack of mini Haribo gummy bears haha.
If only this had Crimson Desert
Awesome card. Paired with my 7700, it never hits over 60 degrees even when thrashing it doing vectorisation.
Finally took a dive. Was intently eyeing the recent 9070 xt deal but this one is just too good to pass up. Replacing my beloved 1660 ti after all these years.
Now I'm positively hoping I can see and play RE9 in all its glory.
Thanks OP.
Yeah prices are not going up on the shit cards.
$529 seems like a lot for an entry level GPU. Wish we could go back to the days when such cards were under $300.
Yep. I've got my own receipts.
* 280X for $319 in 2014 = $429 in 2025 money. (Died years ago, but did alright. It was an XFX GPU, and I think I got it in a bundle with a $99 XFX Gold PSU, and the PSU is still going!)
* RX570 for $319 in 2017 = $409 in 2025 money. (Still works; it was still my nephew's daily up until 18 months ago.)
* 2080 Ti for $1385 in 2019 = $1715 in 2025 money. (Is my nephew's current daily.)
* I bought an Intel Arc B570 for ~$299 a few months ago, and it's no doubt plenty good to play all the games worth playing.
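To sanity-check those conversions, here's a quick sketch; the multipliers are back-derived from the figures in the receipts above, not official CPI data:

```python
# Rough "2025 dollars" conversion, mirroring the receipts above.
# Multipliers are back-derived from the poster's own numbers,
# not official CPI figures.
TO_2025 = {2014: 1.345, 2017: 1.282, 2019: 1.238}

def in_2025_dollars(price: float, year: int) -> int:
    """Convert a historical AUD price to approximate 2025 dollars."""
    return round(price * TO_2025[year])

print(in_2025_dollars(319, 2014))   # 429
print(in_2025_dollars(1385, 2019))  # 1715
```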
for some of us, the good old days were a bit further back lmao
I wish I was doing PC Gaming back then seemed like a good time. 😢
@KARMAAA: It was. Things were moving so fast and yet it felt normal… every game felt like a big jump forward. Age of Empires (blocky sprites, limited) to the fast, detailed FPS Bioshock in 10 years, while the computers themselves halved in price. The internet still felt new, a retreat from the world rather than just another facet of it. I used to spend my nights after school playing Team Fortress (the first one) with Russians on a 300-ping US server.
Fun times.
I still have my dual 12MB 3dfx Voodoo2s. I didn't buy them new, but together they would have cost $1500 in 1998, or $3200 inflation adjusted.
Bought a GTX 660 for $220 back in late 2012.
That was a pretty good bargain back then. It was retailing at like $239 in the US.
@KARMAAA: I think the US dollar and AU dollar were about the same back then. Anyway that GTX660 lasted me for over 8 years, finally upgraded in 2021.
@MrZ: Late 2012. 1 USD was about 1.09 AUD. So yeah pretty similar. Still though you got a pretty good deal considering the AUD was trailing and you got a lower price than in the US.
Lol… a 9060XT isn't entry level. The 60-series from both AMD and Nvidia (RTX 5060) are classified as mid-range cards, with "entry level" being either the 30s-50s suffixes or cards from a previous generation or two (7600 or 6500 for AMD; 3050/1660/1030 for Nvidia; and there's still Intel's Arc B580/B570).
To be honest, the dies from both Nvidia and AMD in the recent 60-series are basically entry level. The mid-range has been so heavily watered down that the 5060 or 4060 is basically an entry-level chip disguised as mid-range.
The RTX 5050 for instance is basically what would have been a XX30 tier GPU a couple of years ago. For comparison, the GTX 950 was a 228 mm2 die, its predecessor the GTX 650 die was 221 mm2, however, the GB207 die in the RTX 5050 is 149 mm2. That's a big difference.
So what about the 60-series mid-range cards? The GTX 660 had a 221 mm2 die; the GTX 960 had a 228 mm2 die. Really, the real '5060' is GB205, which is used for the 5060 Ti; at around 263 mm2 it's a lot closer to the old 60-series mid-range chips. Remove the extra space for the ML-specific silicon and it about lines up with the old 60-series die sizes.
Ever since GTX 10 (Pascal), Nvidia has basically moved the stack down a tier, and nobody really cared because the 10 series was so performant. They did it again with the 40 series, moved everything down a tier, except people noticed this time because the performance just wasn't that great a leap unless you bought the top die like the 4090. AMD is a bit better; the 9060 XT is closer to a 60-series die size, trailing only slightly, and at least close to 200 mm2, unlike Nvidia. But in my eyes AMD isn't really that much better; they're just following whatever Nvidia does.
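Putting the die-size figures quoted in this thread side by side (the numbers are the ones given above; the shrink calculation is just arithmetic):

```python
# Die sizes (mm^2) as quoted in the thread, by marketing tier.
DIES_MM2 = {
    "GTX 650 (xx50)": 221,
    "GTX 950 (xx50)": 228,
    "RTX 5050 (GB207)": 149,
    "GTX 660 (xx60)": 221,
    "GTX 960 (xx60)": 228,
    "RTX 5060 Ti (GB205)": 263,
}

def shrink_pct(new_mm2: float, old_mm2: float) -> float:
    """Percent reduction in die area going from old to new."""
    return round(100 * (1 - new_mm2 / old_mm2), 1)

# The RTX 5050 die is roughly a third smaller than the GTX 950's,
# which is the "moved down a tier" point in a single number.
print(shrink_pct(DIES_MM2["RTX 5050 (GB207)"], DIES_MM2["GTX 950 (xx50)"]))  # 34.6
```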
I suppose I agree from a technical perspective. The only thing is that "mid range" is probably determined more by the price-to-performance ratio and where the sweet spot for the market is. I would agree with the previous guy that the "old" sweet spot was around $300, when GTX 1060 6GB cards (like the ultimate mid-range card) were around that price, but the mid-range market has moved up due to memory, manufacturing and inflation, as you say, to more around $500-600 now.
But I think we can all agree that the RX 9060 XT isn't exactly a "low end" GPU. I'm actually very impressed with it compared to my RX 9070 XT.
Remember when a 3D accelerator that could do 800×600 at 30 fps was considered high end?
Gamers these days don't know how good they have it.
Voodoo 2… running 16-bit colour. You needed two of them ($1200) running in SLI to get 60fps. lol.
Not to mention this one's the 16gig…
how does it compare with 5060 ti 16gb
Been reading and watching reviews saying the ray-tracing capability of this card is a bit iffy. The main reason for replacing my beloved 1660 Ti is to use and see such capability in its full glory. Oh well, see how it goes. Does Centre.com accept refunds if not satisfied with the card's performance?
i doubt anywhere would refund for change of mind.
Ray tracing has set the GPU industry back 10 years for nothing imo.
if you want max ray tracing you want a 5080/5090
Then full blown ray tracing is out of the question for me since it seems to be only for the rich and privileged individuals. Moving on now.
Centre.com accepts refunds if not satisfied with card's performance
Absolutely not lol. I don't think anywhere in the country (other than Amazon) allows change of mind returns. Even big retailers. Some may offer it with a restocking fee, but even that would be in the minority.
The margins are way too small in PC hardware and all of the smaller guys would go broke if they did change of mind returns. Hell, most of them don't even allow you to return unopened items.
This seems like a normal price. It's kind of shocking actually.
Is this worth upgrading from a 6700 XT? For context, I game at 1080p, paired with a 5800X3D.
No, performance difference is fairly negligible. Main advantage is lower power consumption - and for me, size, which is why I got one for a mini-ITX build.
https://technical.city/en/video/Radeon-RX-6700-XT-vs-Radeon-…
https://www.videocardbenchmark.net/compare/4369vs5957/Radeon…
Edit: And newer FSR support I guess.
Edit 2: 'Relative performance' section on Techpowerup's 9060 XT page says 6700XT is 80% of a 9060XT, but that's still only a 25% jump. Probably not worth it still.
https://www.techpowerup.com/gpu-specs/radeon-rx-9060-xt-16-g…
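A quick sketch of the arithmetic above, since relative-performance charts are easy to misread: if the old card scores 80% of the new one, the upgrade gain is 1/0.8 - 1 = 25%, not 20%.

```python
# Convert a relative-performance score (old card as a fraction of the
# new card) into the percentage gain you'd see from upgrading.
def upgrade_gain_pct(relative_score: float) -> float:
    return round((1 / relative_score - 1) * 100, 1)

print(upgrade_gain_pct(0.80))  # 25.0: a 6700 XT at 80% means a 25% jump
```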
Thank you my friend. I’ve actually upgraded my monitor to 1440p, so may look at a 9070xt upgrade instead then! There’s some good sales going around
Given the deals going around and the fact that my last upgrade was four years ago I was tempted to get a 9070XT but have been happy with the 9060 given it actually fits in this tiny case and it’s a noticeable jump from the 3050 it replaces. From all accounts the 9070XT is a smashing 4K card so you’ll be more than set for 1440p.
The 9060 XT is as fast as a 6800 or 7700 XT :) but with way better features and ray tracing.
Picked one up for my son today, you can look forward to a further $100 price drop now…
Noice, thanks!