
PNY GeForce RTX 4070 12GB GDDR6X Graphics Card $999 Delivered ($0 NSW C&C) @ JW Computers


The cheapest RTX 4070 launch price I can find, though not an ideal one. Relative to the RTX 4070 Ti and RX 7900 XT, $900 would be a fairer price for the performance.

Two fans should be fine cooling such an efficient card.

There is also a $1109 bundle with a bonus 2TB CS1031 SSD, worth around $109 based on recent sales.
https://www.pcbyte.com.au/p/bonus-2tb-ssd-pny-nvidia-geforce…

Comments

  • +18

    What worries me is the 12GB VRAM. I liked how Linus just brushed past it like it's not going to be a problem in the future.

    • +24

      Personally that's why I took the RX 7900 XT deal as I've been burnt by the 8GB on the 3070.

      • +21

        It's alright if you don't mind turning down texture resolution in 2 years (now in some cases), or if you play CPU-heavy titles (most competitive shooters). When I run Red Dead Redemption 2 at Ultra 1440p it uses up to 14.5GB in some areas, so it depends on how willing you are to sacrifice graphical settings after paying $1k for a GPU.

        I agree that the RTX 4070 is much better positioned than the Ti, as that will quickly run into the VRAM limit for the amount of raster it is offering.

        • +6

          lol you typed the exact same thing I was typing :)

        • Yes, and the Windows OS reserves some of the GPU memory as well.

        • +2

          I buy a six pack of beer. As you go through it, you care less and less about graphics specifics… just full immersion into gameplay and nothing else!

        • In the bad old days, system RAM was the limit for PC games. Games had to keep a copy of the textures in both GPU RAM and system RAM, and 32-bit games were limited to 2GB (in reality they would crash once RAM use exceeded ~1.6GB, which would happen if you modded games like TES Oblivion or Fallout: New Vegas with 2K textures). With a 64-bit OS, there was a 4GB flag you could activate on the 32-bit executable, and you could reach about 3.1GB of RAM before the game crashed. 32-bit RTSes with large numbers of units also benefited from the 4GB flag.

          A 2K texture with an alpha map is 5.33MB; 4K is 21.3MB (sizes are halved without an alpha map). 4K everything is very demanding.
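
          For anyone curious, the rough maths behind those figures as a minimal sketch (assuming BC3/DXT5 block compression at 1 byte per texel with an alpha map, BC1/DXT1 at half that without one, plus roughly a third extra for the full mip chain):

          ```python
          # Back-of-the-envelope compressed texture sizes (illustrative assumptions:
          # BC3/DXT5 = 1 byte per texel with alpha, BC1/DXT1 = 0.5 byte without,
          # and a full mipmap chain adding roughly one third on top).
          MIP_OVERHEAD = 4 / 3

          def texture_mb(resolution: int, alpha: bool = True) -> float:
              bytes_per_texel = 1.0 if alpha else 0.5
              return resolution * resolution * bytes_per_texel * MIP_OVERHEAD / 2**20

          for res in (2048, 4096):
              print(f"{res}x{res}: {texture_mb(res):.2f} MB with alpha, "
                    f"{texture_mb(res, alpha=False):.2f} MB without")
          # 2048x2048: 5.33 MB with alpha, 2.67 MB without
          # 4096x4096: 21.33 MB with alpha, 10.67 MB without
          ```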

        • -6

          RDR2 does not use that much VRAM, stop talking out your ass.

          So many people think they know, but don't actually know, which is hilarious because you are the people that say everyone needs a $3k+ GPU to play games, which they don't.

          Anything you uninformed people have said is refuted with a simple look at the Steam hardware survey results. Nothing you say lines up whatsoever.

          • @Willy Beamish: Due to VRAM leakage it hits 12.7GB in Saint Denis with not much happening on screen at all. You can see the 1% lows are quite smooth with the 6900 XT as there is enough VRAM. This is where the 3070 etc. fall apart and you get stuttering and a noticeable performance hit. When I run the benchmark test I get 14.5GB as there's much more happening on screen (police chase, dynamite thrown, shootout).
            https://youtu.be/EFDxWgFEsUs?t=403
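
            For anyone wanting to sanity-check their own numbers while playing, a minimal logging sketch (assuming an NVIDIA card with nvidia-smi on the PATH; note that tools like this report allocated VRAM, which can be more than a game strictly needs):

            ```python
            # Crude VRAM logger: polls nvidia-smi every few seconds (Ctrl+C to stop).
            # Reports *allocated* memory across the whole system, not just the game.
            import subprocess
            import time

            def vram_used_mib(gpu_index: int = 0) -> int:
                out = subprocess.check_output(
                    ["nvidia-smi", f"--id={gpu_index}",
                     "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
                    text=True,
                )
                return int(out.strip())

            while True:
                print(f"{vram_used_mib()} MiB of VRAM in use")
                time.sleep(5)
            ```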

        • What?? I can play RDR2 at 4K Ultra (except water physics) locked at 60fps on a 3070 Ti 8GB.

      • +26

        New mainstream games like Hogwarts Legacy, RE4, or TLOU are already chewing up VRAM. Do you think this situation will get any better in the future? 12GB is the bare minimum in 2023. No one should have to dial down settings only a year after paying $1000 for a card.

        • +3

          Can confirm - Hoggies runs like a dog on my 3070 Ti.

          • +8

            @NedStark102: Is that good or bad?
            Don't dogs run faster than hogs?

          • +3

            @NedStark102: 3080 10GB here. Until they patched it recently I had to nerf texture quality in order to play at 4K (with DLSS Balanced) or it would run like a PowerPoint presentation. Now they've fixed it by randomly unloading all textures and playdoughifying the game intermittently. Starting to regret buying this card hahaha.

            • @linx1398: Yeah I feel that, I'm running it on mostly high/med texture quality with ray tracing off and I'm getting anywhere from 40-80 FPS depending on where I am in the game. The worst part is I've read of people getting far worse on similar 8GB/10GB cards.

          • @NedStark102: I've got a 3070 Ti and Hogwarts runs fine if you know how to set it up: 4K, DLSS Balanced, high/ultra, 50-60fps.

            • @Micsmit: I've used some of the recommended .ini fixes from the Hogwarts Legacy subreddit, as well as the above settings at 2K resolution, yet I'm still getting drops below 60fps and intermittent stuttering. Any recommendations?

          • +1

            @Willy Beamish: Such quality comments from you. Keep up the good work. XX70 cards have never been midrange; they're for people whose budgets squeeze between the actual midrange 60-class cards, which traditionally dominate Steam usage charts, and the high-end 80 series. But sure, keep advocating for Nvidia's price creep and the lowered expectations that brainwash people into expecting 'midrange' longevity from a $1100 card.

      • +14

        If I paid $1k for a graphics card I would probably expect 4-5 years of use, preferably with high graphics settings. Look at how long the 1080 Ti lasted; going by the old deals, people were paying around $850-$900 for the 1080 Ti back in mid-2017.

          • -4

            @eggboi: Not sure why I'm getting negged lol. It literally makes much more financial sense to get every second x070 card and flip it while it still has significant value than to get every fourth x080 card and flip it when it's near end of life. E.g. current eBay used prices (sold): 2080 $350-400, 3070 $480-500.

            • +3

              @eggboi: Your strategy works best for the GPU makers: keep buying/upgrading overpriced GPUs.

            • +4

              @eggboi: I think you're getting negged because you fail to see the flawed logic in your argument. You're telling someone they should just flip the card after a couple of years when it starts running into issues playing games because of lack of VRAM. But who are they going to sell it to? Why would someone else want to buy your second hand 12GB 4070 two years after it came out if the prime reason you're getting rid of it is because it can't play modern games?

              Sure, maybe someone just wants a low-end card to play Minecraft on, but if that's the case you're certainly not going to be able to sell your 4070 12GB for anywhere close to $1000 two years from now.

              The value proposition just doesn't work for this card. It just barely makes sense for games releasing today. Do you think all those people that spent $1300 on RTX 3080s would have spent that money if they knew just two years later they would no longer be able to play some modern releases? The situation is only going to get worse over time, not better.

              • @joshau: I get your logic about VRAM limitations, but I don't ask why buyers are paying good money for something I no longer want. They will always be there; everyone has their own reasons, I guess.

                I also agree that we shouldn’t be making excuses for gpu manufacturers, as others pointed out. But we don’t live in a perfect world.

                My one and only point (which still hasn't been reasoned against) is that you're probably better off paying $1000 for a 4070, selling it in 2 years for $700, then adding the $300 you saved (by not getting a 3080) and getting a 5070, as opposed to getting a 3080 and keeping it for 4 years, by which time it will only sell for about $500 (if you're lucky). Rough numbers are sketched at the end of this comment.

                Personally I think GPU prices are way too high regardless. The 4070 should be $500-600, just like the 1070 and 2070 were back in their heyday. Even with inflation, you can't reasonably say the 4070 is worth $1000.
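
                To put that in numbers, a rough sketch (all prices are the hypothetical AUD purchase/resale figures from this thread, not predictions):

                ```python
                # Rough 4-year net-cost comparison using the figures assumed above
                # (hypothetical AUD prices; real resale values will obviously vary).
                def net_cost(purchases, resales):
                    return sum(purchases) - sum(resales)

                # Flip each gen: $1000 4070 now, sell for ~$700 in 2 years, buy a
                # (hypothetical) $1000 5070, sell that for ~$700 again at year 4.
                flip_each_gen = net_cost(purchases=[1000, 1000], resales=[700, 700])

                # Keep one card: ~$1300 3080 now, keep it 4 years, sell for ~$500.
                keep_one_card = net_cost(purchases=[1300], resales=[500])

                print(f"Flip each gen: ${flip_each_gen} over 4 years")   # $600
                print(f"Keep one card: ${keep_one_card} over 4 years")   # $800
                ```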

          • +1

            @eggboi: Because you can buy cards from other brands with 16GB of VRAM for a similar price, get better performance in more titles, and likely end up with a card that stays usable in more titles for longer.

            My 1070 has needed replacing for 1440p gaming for a while, but everything has still been playable. My next upgrade will have 16GB because that gives me some headroom, as only the most demanding games really need that now… just like 8GB was when I bought my 1070.

            • @nmartin84: Yeah, this is objectively a better choice if what you say is true about 16GB cards being available (all else roughly equal). I do hate seeing what Nvidia has been doing lately, deliberately handicapping the 70-series cards to push people up to the 80 series, in both laptop and desktop variants, though in different ways. The laptop one just plain sucks. Reminds me of the 970 3.5GB VRAM situation; that was even dodgier because it wasn't made clear in marketing materials.

          • @eggboi: Well, claiming a GPU on tax is really pushing the rules, unless you're a game dev.

            • @AdrianW: Yeah look, I mean if you do half gaming and half motion graphics, or game dev, or video editing, or 3D modelling (or anything else that makes money and requires a GPU), you can claim half the card on tax. And if it's part of a whole computer system that you bought for work + play, you can claim the relevant work portion, even if you don't do GPU-specific work (assuming you're either self-employed or required to provide your own hardware, like a contractor might be). *Not tax advice.

            • +2

              @AdrianW: Why?

              Tradies can claim a 100k ute as a work vehicle, real estate agents can claim their leased Bentley convertible as a work vehicle, why can't I claim a $1000 graphics card as a device needed to display spreadsheets?

        • +1

          Exactly!!

    • +12

      Linus has become a for-profit shill over the years. In his first few years he and his info were good; now I don't waste my time on him.

      Thanks Steve!

      • +6

        Yep, Linus isn't on my trust list. He gets excited about things exhibiting the same flaws he was ripping on the day before. So thumbs down to Linus; he knows a lot alright, but the advice he gives these days is just random.

        • Except his water bottles, have you heard about his water bottles? Is water cooling worth it? Speaking of which, check out the LTT official water bottle!

          • @eggboi: Water? What is water? Lol, water is so stone age. Everyone and his schnitzel dog knows you need to drink Red Bull to be successful.

      • +3

        IMHO Linus was good maybe 5 years ago; his quality has severely dropped and it's all about getting hits on his videos and making profits now.

        The tech reviews or actual tech tips are buried in a sea of adverts and teenage goofing around. The downside of having a large employee base is needing to pump out videos constantly, regardless of quality, and appealing to a younger audience to make profits.

        • You can tell when previously passionate YouTubers get bored and just switch it to a semi-passive income stream; their videos become formulaic, heavily sponsored and mostly superficial commentary. He could probably get ChatGPT to write all his future scripts based on his recent videos and nobody would notice.

      • +3

        Watch his WAN Show for an idea of what they are like outside of the typical YouTube videos. The short version is that they have good practices around sponsored content.

      • -5

        He will lose every cent he invested in Framework. Very much looking forward to it.

        • +5

          Lol what. He invests in an idea that's good for consumers and you want it to fail.
          Linus isn't perfect, but his Framework investment and his testing labs are great endeavours that favour consumers over manufacturers.

          • -1

            @FireRunner: People want something shiny and new every five years when their laptop breaks. They do not want to tinker on their old laptop hardware. The only meaningful benefit I can see from the Framework laptop is the new 16-inch model where one can add an extra battery (I can already do this with an external power bank) or a GPU.

            It also hurts Linus' credibility to have a vested interest in a product that competes with the products he reviews.

            • +2

              @Gdsamp: The fact that they haven't folded already and are expanding into 16" laptops says not everyone thinks the same way as you.
              I'm excited to see the 16" model with upgradeable graphics card. Just hoping prices are good.

              As for credibility, Framework exists in its own product class at the moment. I don't think it interferes with his ability to review laptops.
              If other manufacturers start making their own modular laptops (which hopefully happens), then a conflict of interest might become a concern.

            • +1

              @Gdsamp: Bro you could say the same thing about custom PCs

              Many people want to just buy a pre-built, many people also like to build their own

              Will be the same for laptops

    • +2

      I'm pretty sure all the reviewers did.

      I'm going to predict this will age about as well as the RX 470 and RX 570 4GB models did.

      More so if gaming at 2K or 4K.

      A good GPU chip hamstrung by low VRAM.

      I would be betting that Nvidia is putting low RAM on their gaming GPUs to force people using them for AI, modelling, etc. to purchase the more expensive workstation GPUs.

    • Was waiting for this card to consider upgrading, but it sucks to be limited on VRAM 😩.

      Then again, I don't want to get into AMD in its current state, with unstable drivers, voltage spikes and high power usage. 🤷‍♂️

      I also want to use the card for ML tasks, for which AMD has very poor support 😤

    • +17

      More VRAM is better than less, but the reality is any GPU is going to have to turn down many graphical settings in the future no matter what its specs are today. If you can't deal with the idea of not running max textures, then maybe you want something with more, and maybe that pushes you to either a higher-end card like the 4080 or to an AMD card.

      I've lived through a lot of GPU upgrade cycles and my general advice is to ask yourself how often you upgrade and what you can or cannot live with. I don't necessarily recommend the RTX 4070, but that's not because I am especially worried about its VRAM; it's because the overall pricing on this whole generation of cards is not very good. And while AMD's pricing is slightly better, it's still historically bad. I think that PC gaming in general continues to become unaffordable to many in a way that wasn't true 10 years ago, and I'm quite sad about it.

      If you're actually worried about the notion that VRAM requirements are going to spiral out of control in a linear fashion such that you can't run games at all or that a 12GB card will become totally obsolete in a few short years, I'd say that's not correct. There are two main factors at play.

      The first is that developers are currently transitioning from a cross-generation console development paradigm (PS4/XBO + PS5/XS) to a current-gen-only one, and this means taking more advantage of the larger memory pools on current consoles. But Sony and Microsoft delivered an ahistorically low memory increase, going from 8GB to 16GB: PS1-4 historically increased by 16x each generation, and this one only doubled! Of the memory in the PS5 and Series X, roughly 13.5GB is reserved for games, and it needs to house both CPU and GPU data. The Xbox Series X has a split memory pool in which 10GB of this is "fast memory" ideal for graphics, which gives you a sense of what console developers are optimizing around in terms of GPU memory. Meanwhile the smaller Series S only has 7.5GB available to games, so we would expect something like ~6-6.5GB being used for graphics.

      The other factor at play is that ray tracing requires the game to add extra things to GPU memory. This is surplus to any other requirements the game had previously, so it adds GPU memory pressure.

      Developers do not generally waste time authoring assets of dramatically higher quality on PC versus console, and must develop asset streaming suitable to scale down to Series S and up to Series X/PS5. So until the next generation (PS6 etc), we should expect PC requirements to roughly stabilize. This doesn't mean they won't offer "much better than console" settings for you to select that allow the asset streamer to load in better textures further from the camera instead of waiting until you're close etc, but this is a bit of a different thing to the question of whether or not these games will run, or will be able to look good when you're using a 12GB or perhaps 10GB card. Most games should certainly be able to do that.

      Some people like to over-cater on VRAM as a precaution against poorly optimized ports, which is fair enough if you have the spare money for that or are willing to sacrifice on your GPU in other areas. But it's just one type of insurance, really. Ports like TLOU P1 are garbage even if you do have tons of VRAM. OTOH, ports like RE4 remake can be made to crash at higher rates if you over-allocate VRAM, but you can also just not do that and it still looks and runs quite well - so is that really a "bad port" per se, or even a bad experience? I wouldn't say so.

      • +2

        but the reality is any GPU is going to have to turn down many graphical settings in the future no matter what its specs are today

        The issue here is that cards which still have plenty of compute, such as the 2060/3060 and 4060/4070, get limited by their memory and end up with very poor performance, as seen in HU's videos. On the PC market, due to the million different hardware configurations people use, games, even triple-A titles, are barely optimised because developers just go by what's allowed by the game engine.

        • +2

          There are zero games that don't let you fit within memory by lowering texture quality or reducing the VRAM streaming pool. Over-allocating memory causes you to constantly swap memory in and out, which tanks performance, but you can lower memory usage to fit, which clears this up (there's a toy sketch of the idea at the end of this comment).

          The question is therefore not whether you can possibly fill more (you could even years ago); the question is whether the overall experience of running the card inside the appropriate memory footprint for a given GPU is good or bad. For an 8GB card, I think it's reasonable to conclude that it's going to become bad (currently it's only bad in a number of games you can count on one hand, but that will increase over time). The RTX 4070, however, is a 12GB card.

          The point of discussing the console specifications was to explain both where the increasing memory pressure comes from and where the upper bounds are. A 12GB GPU versus the current consoles is the approximate equivalent of a 6GB GPU versus the previous ones, meaning not that it will forever run max settings, but that it was a perfectly reasonable memory configuration to own and you could have good experiences running games on it for the whole gen.

          The question of "what is enough VRAM for a card" is a lot more dependent on what a person is willing to deal with and how long they are expecting to keep a GPU. For instance, if you only upgrade once every 6 years and want to absolutely maximize how well you run the latest games in 2029 during the early years of the PS6, then yeah, maybe you do want to go for the cards with more memory. Similarly, if you can't stand the idea of downgrading your textures, then look for something with 16GB or preferably more. But if you're not too fussed by that notion and are attracted to other features of the card (e.g. DLSS and frame gen, the overall efficiency, the solid RT perf), it may be perfectly fine for you.

          I will also repeat that I don't think the 4070, or any RTX 4000 or RX 7000 series GPU, is actually a "good purchase" currently. The price/perf ratio is overall very bad, even at the lower end.
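
          To make the "lower usage to fit" point concrete, here's a toy sketch (the texture tiers, their footprints and the OS reserve are all made-up illustrative numbers, not taken from any real game):

          ```python
          # Toy example of fitting texture quality into a VRAM budget. All numbers
          # here are hypothetical, for illustration only (no real game is this simple).
          TIER_FOOTPRINT_GB = {"ultra": 11.0, "high": 8.5, "medium": 6.5, "low": 4.5}
          OS_RESERVE_GB = 1.0  # Windows and background apps typically hold some VRAM

          def pick_texture_tier(card_vram_gb: float) -> str:
              budget = card_vram_gb - OS_RESERVE_GB
              # Walk from the best tier down and take the first one that fits.
              for tier, needed_gb in TIER_FOOTPRINT_GB.items():
                  if needed_gb <= budget:
                      return tier
              return "low"  # nothing fits comfortably: take the smallest and accept swapping

          for vram in (8, 10, 12, 16):
              print(f"{vram}GB card -> {pick_texture_tier(vram)} textures")
          # 8GB -> medium, 10GB -> high, 12GB -> ultra, 16GB -> ultra
          ```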

      • +1

        Agree with this. Someone takes a poke at VRAM and all of a sudden everyone is harping on about how much of an issue it is, but it really isn't.
        It's like toilet paper during COVID, or everyone chasing more megapixels for the cameras on their Nokia phones… you think you need it because everyone else needs it, so you must need it too!
        Stupidity snowball…

  • Last night BPC also had the Ventus at $999, in both two-fan and three-fan versions. Seems like they've increased the price now.

  • +40

    This is a dogshit price for a card with 3080 tier performance 2.5 years later.

    • +5

      Entirely the consumers' fault over the past 2 years. Just imagine another proof-of-work mining boom; they'd be $1600+ and out of stock for months.

    • +2

      Lol and the rat retailer has the audacity to increase the price further. Was already poor value at $999.

  • +11

    $1K for underwhelming mediocrity.

    Will wait for 7900 XT or XTX deals.

    • +5

      Yeah, the 7900 XT is already better value. A used 6800 XT will beat the 4070 at half the price.

      • A used 6800 XT is not even that old; many will still be under warranty.

  • +4

    Yeah no, this card's way too close in price to the 4070 Ti; preowned 3080 it is.

    • Agree. I've been waiting on these prices before pulling the trigger on a 4070 Ti. Now that there is a tier below the 4070 Ti at this price, I can't see 4070 Ti prices dropping much in the short term.

  • +14

    Why would anyone buy this only for it to be barely usable within two years?

    You're better off buying AMD cards with a similar price/performance that will last longer.

    Or stick with the 3080

    Stop accepting inferior products from any company.

    • +4

      SFF is one answer. Squeezes 3080 performance into two slots while pulling less power than the 3070, which makes it a good fit for older SFF cases while perhaps helping stem the trend of new SFF designs having to keep increasing in size to accommodate new norms in GPU thickness. Even if the case has 3 slots, it can be handy to have one free to provide the option of adding a backplate with additional USB ports, S/PDIF out, or whatever.

      Other than those sorts of exceptions though, I do agree with more people giving AMD their money and NVidia their middle finger.

      • Except the saddest part is the Founders Edition is not sold in AU, the only 4070 that's worth getting given its size.

  • +2

    Terrible card for the price.

    I remember the 1070 for $600 at launch. Following that trajectory, the RTX 7070 is going to be $1600.

    • It's a pretty good card, just not a good price, and it could do with a bit more VRAM. This is the current landscape though… AMD will release a 7800 XT (which is really a 7700 XT) and a 7700 XT (which may be 6650 XT class)
      … and they'll all cost way more than their class of cards should. 🤷🏻‍♂️

      • +2

        Same price-to-performance at launch as a card that came out almost 3 years ago (RTX 3080).

        This card is as bad a deal as the Ti model. It should be $699 maximum.

  • -3

    12GB for a non-Ti 4070 isn't that bad IMO, since it doesn't have the grunt to take advantage of more than 12GB of VRAM anyway.

    • Doesn't have the grunt? It has the grunt of a 3090 Ti, which Nvidia tried to sell as an "8K card".

      • -2

        Hmmmm, not sure about that one pal - you'd have to point me towards some reputable proof of that claim.

        My understanding is that the 4070 is about as strong as a 3080…

        12GB is fine for the 4070, but anything above that and I'm not so sure.

  • +10

    Still in stock. Good to see that people aren't buying this sh***.

  • +8

    Lol, I remember when 1080 Tis were about this price.
    But then this does have a whole 1GB more VRAM. What a deal.

  • +4

    Centrecom had a Gigabyte 3-fan model for $999 at release, but once they saw other stores charging more, they quickly added $90 to the cost. It's not just greedy Nvidia and AMD, but PC stores trying to cash in for max profit as well. Let them rot in their stores to teach them that their greed is now too much. Should be a $700 AUD card at max.

  • +2

    Just going to hold onto my 3080 for as long as I possibly can, at this rate I'll be purchasing the next gen 50 series or a cheaper 7900 XTX by that time. It's all about VRAM now!

  • +2

    Not enough RAM at this price point.

  • +12

    Hold for $1000 7900 XT brothers

  • There are 4 different 4070s selling for $999; why is this one special?
    https://au.pcpartpicker.com/products/video-card/#c=550&sort=…

    • +4

      You might want to check those links. Stores changed their prices from $999 to $1089 once they saw other stores charging more; some were $999 for 20 minutes on release. Now they're all $1089 due to PC store greed. PCPP is slow at updating prices.

  • $109 discount from MSRP on day one? I like this new reality.

  • Looks like my 6800 purchased in Sept/Oct 2022 was a good buy at under $700, as the cheaper 6800 XT or 3080 performs pretty similarly to the $1000 4070. This card is worth $800 at most; it would be worth $900 if it had 16GB of VRAM.

  • +1

    The only reason I want this card is because of the low TDP, which means I don't have to get a new PSU and redo all the cabling for an AMD card. Hopefully it's available around $800; I think for $600 it's a good deal.

    • $600 would be a great price, but I don't even think 3060 Tis have dropped under $600 yet.

      • Even $600 is very expensive, but the perception of price has been destroyed by AMD and Nvidia over the past few years.

        • Nah, $600 is definitely okay for an x70-class card. They used to be around $450-500 and we've had a tonne of inflation since then.
