• expired

ASUS DUAL OC GeForce RTX 4070 12GB GDDR6X Graphics Card $986.25 Delivered @ Amazon US via AU

800

Amazon is already discounting 3-4 models of the 4070; the ASUS OC is the best of the lot
The reality of launching a US$600 12GB card in 2023 sets in
All the Australian retailers who jacked their prices up to $1050+ just after launch can start asking Synnex, Ingram and Dicker for those sweet discounts

ASUS Dual OC Edition
ASUS Dual
MSI Ventus 3X
Gigabyte WindForce OC

DUAL-RTX4070-O12G

Boost: 2550MHz, 12GB GDDR6X (21000MHz), PCI-E 4.0, 3x DisplayPort, 1x HDMI, 2.55 slot

Price History at CamelCamelCamel.

Related Stores

Amazon AU
Marketplace
Amazon Global Store

Comments (closed)

  • +44

    It is gonna be $889 in 1 week.

    • +7

      Does the crystal ball say anything else?

      • +28

        The price will drop even lower than $889 in 4 weeks.

        • +9

          That's fine, it needs to start its descent to the $730 price tag it should be sold at over the first 6 months of its life cycle.

      • +1

        Signs point to yes.

        • Yes, especially considering a brand new 3070 is around that. I see your logic

      • Yes

    • +3

      It's worth $850 at best

    • +1

      Not when RTX 3080s are still being sold for $1000… I don't understand why people are still willing to pay stupidly high prices for 30 series cards

      • +2

        They've been going for $700-800 from what I see. Most of the ones listed for more than that aren't selling.

        • +1

          @bart2: those are second-hand prices though. New is still $1k+

          • @CY: My bad, didn't even know they still sold them new

  • +5

    Waiting for a drop on the 4080

    • +1

      That is the dream.

      • +1

        Dreams do come true sometimes.

    • +2

      Same. I'm hoping it hits that $1,499 mark some day soon.

  • +14

    Nah, ~20% better than a 3070; waiting for the 7900 XT and 7900 XTX to drop.

    • +8

      Yeah an XT for $999, me gusta.

      • +7

        It's inching along to that.

        For me it's a $1K XT or a $1.5K XTX (non-reference ones); whichever drops to that price point first, I'll get it and my 3070 goes to eBay.

        • +3

          Me too.
          I got burnt on the 8GB 3070 when it first came out…

  • Would the Ventus be quieter because of its 3 fans?

  • +5

    Mwave have a 4070 for $999. I realise it's more expensive but for a few bucks more, it's AU stock.

  • +38

    If I wanted 4070-level performance, I could just grab an RX 6800 with 16GB of VRAM for less… and it's been on the market for yonks

    • +3

      Depends on the games and your future needs. The 4070 (what should really have been the 4060 Ti) has its advantages in terms of encoders, upscaling quality, frame generation, and eventually further speed ups through SER and more specifically in RT/PT workloads with opacity micromaps and DMM.

      SER should be fairly quick to see adoption (it's in the Cyberpunk tech demo), while the rest will take about 18-24 months. These latter technologies, plus mesh shaders and DirectStorage, massively decrease CPU, RAM and VRAM requirements in games willing to support them.

      • +11

        Sounds great, but is it really that much better? I mean, how many dollars more am I happy to pay while losing 4GB of VRAM?

        • -8

          My point alongside this is that the card should quickly drop towards $730-$750. That's the buy point, and I think if the current pressure on TSMC, NVIDIA and AIBs continues, it should revert to 'normal and fair' within the next few months.

          But once the tech relating to SSD-to-VRAM asset and texture streaming is in full swing, 8GB of VRAM should meet 4K quality needs again, subject to the speed of said SSD. If a PS5 can operate on an effective 12GB of VRAM and the Xbox Series X on 10GB, there's really no issue to be had once the dev work is done.

          • +2

            @jasswolf: Yes, I agree there. At these prices team green are clearly dreaming. At $730 I might consider the 4070 12GB, hoping that the SSD-to-VRAM improvements will come

            But will it be supported on my gen 9 Intel rig? Am I better off just playing the old-school way on something I know is sort of future proof even 2 years after its release, as opposed to something that may be better value at some future point, if the advertised improvements come along at the right time and are supported on my rig…

            • -2

              @shabaka: I can't speak to what you're upgrading from GPU wise. At that price I might consider it as an upgrade to my current 2060, and then skip two generations to likely 2nd gen MCM tech from NVIDIA, or I could just as easily wait for RTX 5000 and then skip just the one generation.

              I'm considering gaming, AI, and hardware video encoder and decoder support for my needs, and the latter two are the most in flux right now.

            • -1

              @shabaka: 12GB of memory will be fine for the next 4 years, especially with DLSS.

              • @Micsmit: I am close to the limit today with my 10GB card playing The Last of Us, and that's with FSR enabled.

                I will hit the limit in Hogwarts. That's today. If I upgrade I won't choose a 12GB card

                • @shabaka: Because nearly all major PC releases are tied to consoles these days. Until the next consoles release and cross-gen is finished (at least 4-5 years), there will not be a big bump in requirements like we are getting this year with the end of cross-gen from the previous generation

          • +3

            @jasswolf:

            If a PS5 can operate on an effective 12GB VRAM and the XBox Series X 10GB

            Because the hardware is known, the game developers are coding the game to fit that specific hardware. Why would they code a game that uses more than 12GB of VRAM on a gaming console that only has 12GB of VRAM?

            PC is a bit different. Developers just code the game and can max it out. Then people can use low/mid/high settings or whatever.

            • @plasmoske: Yeah that's not how game development works if you're trying for visual fidelity or performance; they don't just wet all over the floor like puppies because it's PC. Consoles have been using versions of this tech for two generations now, the latest one taking advantage of SSDs.

              Apple, Khronos (as in Vulkan), Microsoft, Qualcomm, Intel, AMD, NVIDIA, and more, they're all pushing it forward. It's the new graphics pipeline.

              • @jasswolf: It is actually how it works. PC can have textures up to 4000x4000, which means you need a more powerful GPU to run it.

                With PS5, it's limited to what the console can handle. Lags too much? Lower textures and optimise the code.

      • +20

        your future needs

        Future-proofing is a total farce that's just designed to elicit FOMO. It's even dafter as a concept when you start applying it to mid-tier products.

        The 40xx series excluding the 4090 remain bad products at bad prices, period.

          • +3

            @jasswolf:

            It's not a farce, you've just been lulled into a weak groupthink because we're in a significant transitional period for computer graphics. In 4 years, we've gone from basic low latency ray tracing of scenes to being on the cusp of something visually indistinguishable from full pathtracing, which is incredible.

            What you've said shows the gimmick about future proofing. Imagine someone buying a 3090 Ti last year to "future proof" only to find out it can't run Cyberpunk smoothly because Nvidia released frame generation that's only available on the 40 series.

            • +2

              @BROKENKEYBOARD: Except that there'll be an open source solution later this year via at least AMD, and maybe we'll see a competing derivation of DLSS frame generation that makes use of the weaker optical flow accelerators on the 20 and 30 series.

              Even disregarding that, you've got DLSS ultra performance seeing rapid improvements, so everyone's got a functional option, as in you can literally run this on an RTX 2060 at 1080p30. It looks low res, but it runs and isn't visually broken. Whatever performance a given RTX GPU was getting on Portal RTX, it's getting on this.

              But here's the part that doesn't seem to be registering for people who are crying foul over this: what's being done here is giving developers the green light to build games out using these lighting systems. So, much like how Unreal Engine has software Lumen, you'll see other major engines just use raytracing and pathtracing. For fixed geometry, voxel-based raytracing works pretty well and is lightweight; it just has a latency penalty, so you don't want to use it on hugely dynamic lights or reflections.

              So you have voxel RT, other software RT, and hardware RT, then hardware PT, so a completely kitted out lighting solution set that moves on from almost all the rasterisation tricks. Now your hardware RT card from 2018 might not get to run ultra lighting that's pathtracing, but because the lighting artists in the game are building everything out via that framework, you're going to get a basic lighting system with way less visual compromises than what you're seeing today, and it's going to get even better optimisation paths.

              What's more, with all those technologies I've been describing here, there's some that benefit RT performance on even Turing, and that translates to more FPS or more light bounces.

              But putting all of that aside, if you honestly think I've been telling people to buy a 3090 Ti at any point, you've been reading my comments with even less care than I thought you were. :)

              • +1

                @jasswolf: I love heated discussions like this because both sides have some good points and you learn a bit more, which helps you make the right decision.

                I think @BROKENKEYBOARD is correct in the sense that early adoption is a fallacy: unless they make software features backwards compatible, you aren't really future-proofed.

                I know modding, custom firmware etc. are options to unlock that, and optimised/open-source solutions are coming, but that is a halfway approach/outcome and not truly future-proofed. There are also plain old hardware limitations, with last gen having weaker optical flow as you said.

                So all those people who bought aren't getting the polished benefits of the latest gen; they will only get features that help max out their hardware to its fullest potential or, in the case of low-tier hardware, allow it to still "work" for newer and newer games thanks to AI.

                And that last bit, I think, is the key part. Experienced PC gamers know that this year's flagship will get the floor wiped by next gen; that's why the xx70 Ti cards tend to be the best value, offering last gen's #2 flagship performance at reasonable prices but with current-gen tech. Then the card lasts even longer now thanks to ML letting you maintain performance as it ages.

                That's the real benefit of all this in the short term; that's the "future proofed" part: being able to use the card for a long time, but not necessarily being able to use the flagship features year on year. It's a fallacy to think hardware limitations could be overcome.
                I'm sure the open sourcing and optimisations etc. will only help stretch this out more

                So you are both correct, just in different ways

                The issue with the 4000 series cards is that the software and hardware features are the best we've ever seen, but the price to performance is poor. The shoddy business practice of NVIDIA rebadging lower-tier cards makes me not want to support it, and it also makes these cards poor relative value unto themselves for those who do want to buy one.

                You are paying almost a grand for a mid-tier card versus a card closer to the traditional xx80 models of the past. That's a forced, artificial way to gimp your "future proofing" no matter how you look at it.

                Watch nvidia release the 5070 next year without the dodgy rebadging and claim 2x performance

                • @furythree: My concept of future proofing is the card being viable for 3-4 generations, ie. 6-8 years.

                  I think that's something reasonable to expect out of the 30 series despite all the current claims. I think most of the problematic games we're seeing at the moment will make these technical jumps, and any game that has DLSS 2 and is in some form of engine development will get DLSS 3, which covers a lot of bases.

                  Watch nvidia release the 5070 next year without the dodgy rebadging and claim 2x performance

                  Given it's set to be on the same silicon node with an architecture design driven by machine learning tools and pathtracing performance enhancement, the performance delta could vary widely.

                  Hopefully it leads to each chip being more individually tweaked, but that would likely be better news for the high end options, where throwing more SMs at the problem hits diminishing returns (eg. 4090 offering 60-65% more processing power than the 4080, but only being 30-40% better in gaming).

                  From then on, the next few generations will likely be about pushing performance rather than many new features.

                  Anyway, 4070 cards are nearly $200 AUD down from local MSRP, so…

                • @furythree:

                  i think @BROKENKEYBOARD is correct in the sense that early adoption is a fallacy given that unless they make software features backwards compatible, then you arent really futureproofed

                  Yes, there are two issues. There's "compatibility future proofing", which is an issue Nvidia exacerbates because they like proprietary things. Then there's "performance future proofing", which is feasible, but in general the higher end you go, the worse the value is. Often for longevity it's far better to buy 2 mid-range GPUs years apart instead of sinking it all into 1 high-end GPU and hoping it lasts. Although I'd say it's still not a great time to buy because neither AMD nor Nvidia have released next-gen mid-range GPUs yet, and pricing still seems affected by the mining and COVID booms.

                  • @BROKENKEYBOARD: Agree

                    Though it's hard to ignore the flagship performance of the xx80 cards sometimes lol

          • +1

            @jasswolf:

            we're in a significant transitional period for computer graphics.

            being on the cusp of something visually indistinguishable from full pathtracing

            More marketing; real-time raytracing is still so far off "good", "full" pathtracing that it's silly.

            • @ssfps: Only if you continue to believe that ML acceleration, denoising, and frame generation techniques weren't going to be part of a solve to get it all to 1000 FPS this decade.

              We're not out here making films or physics simulations.

        • +2

          Up until very recently I've always suggested going with AMD for gaming value, but AI tools are absolutely exploding right now for those with the know-how to set them up, and just need better UIs built before other people begin using them, and there are many more in the pipeline. AMD is useless for AI on Windows; at this point I'd strongly suggest going with an NVIDIA card with the highest VRAM you can find. You almost certainly will want to be running AI apps on your PC within the next few years, if not now.

          As an example, we're not far off dynamically written and voiced RPG characters in games using AI, though it may take a bit of pre-caching in the early days, kind of like the long loading times of early games in other eras. The 4-bit, 13-billion-parameter Vicuna model can already do convincing character roleplay on a local machine with ~11GB of VRAM, and that's before the community has really had a chance to optimise things further (a rough sketch of running such a model follows below).
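
          To make that last point concrete, here is a minimal sketch (not from the commenter) of loading a 13B chat model in 4-bit on a single ~12GB GPU using Hugging Face transformers with bitsandbytes quantisation; the model ID, prompt and generation settings are illustrative assumptions, not anything prescribed in this thread.

```python
# Hypothetical sketch: run a 4-bit 13B chat model locally on a ~12GB GPU.
# Requires: pip install torch transformers accelerate bitsandbytes
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "lmsys/vicuna-13b-v1.5"  # illustrative checkpoint; swap in whichever weights you use

# Quantise the weights to 4-bit at load time so the 13B model fits in roughly 8-11GB of VRAM.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # run the matmuls in fp16
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # place layers on the GPU, spilling to CPU RAM if needed
)

# Toy NPC roleplay prompt of the kind the comment describes.
prompt = (
    "You are a gruff blacksmith NPC in a fantasy RPG. "
    "A traveller asks about the abandoned mine.\nBlacksmith:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=80, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```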

          • +2

            @CodeExplode: I wouldn't say AMD is useless for AI; last time I heard it was about half the speed, with room for improvement in terms of optimisation, so watch this space I guess? NVIDIA doesn't have exclusive AI technology the way people make it sound in order to justify them keeping prices high. The mining boom is over. Build a bridge, get over it!

            • @shabaka: Most AI tools only work with an nvidia card on Windows unfortunately.

        • +4

          Well, I think the 40 series itself is great, with better performance and lower power draw and heat. Just a really bad price.

    • +1

      Where are you finding it?

      All out of stock from the stores I've checked.

      • Yeah, I just checked too. Looks like it's mostly the used market; they go for about $600 these days.

        What I am saying is team green is rolling out a product that feels obsolete before it even hits the market, has less VRAM on a narrower bus, and whoa, look at the price! What a magnificent release.

    • +1

      Where can you get an RX6800 for less?

    • I haven't seen anything comparing the 6800 favourably to the 4070 or the 3080. What are you guys reading? The 4070 has more than double the TFLOPS of the 6800 and more than a 70% FPS advantage at 1440p in most games

      • Yeah, it's more like the 6800 XT that competes with the 4070 or 3080. Add in a bit of ray tracing in more modern games and the 6800 XT will fall way behind in performance.

  • +3

    The MSI Ventus 2X card is $979.00 on the Computer Alliance eBay store with the targeted SNS4 coupon code.

    If you want a three-fan Gigabyte card, it's a little more at $1,035 + $9.99 postage

  • +6

    I don't understand GPU pricing. How does a 4070 drop to this price while a 2060 is still selling for $600? I mean it's just fckn nuts, isn't it?

    • It is. But no one should be buying a 2060 at this price unless they are completely insane. A new RX 6700 has 10GB of VRAM and beats the 3060 hands down (almost 3060 Ti performance), and look, it was $419 the other day.

      I got one just for fun to protect the PCIe slot from dust, but the amount of meaningful 1440p gaming (upscaled with FSR to 4K) I am getting out of it is insane.

      The 2060 is $200 on eBay; that's the real price for it. Not a bad card for almost-decent 1080p

      • +1

        no one should be buying 2060 at this price

        you should see the entitlement of marketplace sellers on those models 🤦‍♂️

      • Assume you mean second hand. I've had a look on eBay just because I was curious, and the cheapest new 2060 I found was $569.

        • Of course! Some of them are still under warranty (like mine; I'm listing it soon though)

      • Also, just to mention: despite the great frame rates from the 6700 XT, it seems even an RTX 2070 outperforms it in a few productivity apps like 3D modelling or video editing.

  • +1

    Why does this have fewer outputs than my 3070 ASUS Dual?

  • +23

    HODL to punish Nvidia for insane pricing

    • +3

      But HODLing also punishes some of us (with old GPUs) by not being able to enjoy newer games now.

      • Hmm, sad but true

        Should we check out the 30 series if we're not concerned about power efficiency?

        • +3

          Then you'll run into VRAM capacity issues, as several games are already demonstrating.

      • +1

        The best thing I ever did was embrace low framerates. Staved off the urge to upgrade for a good while :P

        • -1

          Judging by your avatar, seems like you also embraced Lower Settings too. How far are you taking your GeForce 8800 GTX, or was it the ATi HD 5870?

        • +1

          1fps low? Like watching a slide show? :p

          • @edfoo: I gotta say I do enjoy a good slideshow

    • I'm sure NVIDIA will do just fine with or without you HODLing. The only way to properly punish them would be to buy AMD and avoid second hand models.

      Even then, it would take a huge group of users for meaningful impact. NVIDIA are also doing fine selling higher end cards for ML.

    • What does HODL mean? I googled it and it comes up as a mining expression.

      • +1

        Hold on for dear life

      • +6

        Don't purchase until some arbitrary point at which time you feel NVIDIA have received enough punishment from you not purchasing earlier.

        • Noice explanation

    • +4

      Agreed, the 4080 has dropped to $1679 on eBay from $2219 at launch; a lot more to go!

  • +1

    12GB VRAM…. yikes :S

    • +3

      Sounds amazing. I have 6GB.

      • Almost as good as 20GB… Almost

  • -3

    Good card, now all I need is a POWER STATION TO RUN IT!!

    • +3

      4070 is power efficient. 🤷‍♂️

    • +3

      Huh? No? These are the most power efficient out of any of the recent releases. Seriously… 185W on the FE card.

      • Pretty sure it's 200W

        • That's the max power draw on spec; the AIB models may consume more based on the number of fans they have. J2C mentioned it consumed 185-188W on their test rig (at 17:00 in the video).

  • Don't you have to add GST to that price when ordering from the US Amazon store?

    • +2

      Nope, whatever price you see whilst browsing on amazon.com.au will be the final price (excluding delivery, of course). Amazon has this weird setup where you can order US, UK or Germany stock but do so through AU; it works in our favour since sometimes you can get decent deals, like all these high-performance Gen 4 SSDs that have popped up.

  • Does the 3-fan edition perform better than the 2-fan edition?

    • Runs slightly cooler but consumes more power

    • No, it might be a bit quieter. The 4070 sips power (<200W), so a 3-fan cooler isn't needed except for aesthetic purposes.
