• expired

Gainward GeForce RTX 3080 Phantom 10GB (Non LHR) $1788 + Delivery @ TechFast

2150

Update 2pm 19 Jul: My order, which was placed 1 minute before posting this deal, just got a tracking number, for reference.

Luke from TechFast reached out to me about this incredible deal on an RTX 3080 non-LHR model, a good $150+ off the recent popular deal.

Gainward may not be a well-known brand in Australia, but they are a subsidiary of Palit, the largest GPU manufacturer in the world. From a quick look around, this GPU's cooling is decent enough to mine Ethereum without any further modification, if you so choose.

Upon further inspection, this card seems to have the same PCB as the Palit GameRock 3080 (Source), and the Palit GameRock is reportedly able to sometimes mine ETH at 100 MH/s+ @ 175 W (Source), so you may get lucky :)
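As a rough sanity check on those figures, here is a back-of-the-envelope sketch using only the numbers quoted above (not a measured result):

```python
# Efficiency implied by the quoted GameRock figures: 100 MH/s at 175 W
hashrate_mh = 100   # MH/s, from the linked source
power_w = 175       # W, from the linked source
print(f"~{hashrate_mh / power_w:.2f} MH/s per watt")  # ~0.57 MH/s/W
```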

From Luke:

Cards will be shipping now through to the end of July. These will be the final non LHR cards we have and are getting.

Related Stores

TechFast

Comments (closed)

  • +3

    Beat me by literally 6 seconds

    The Phantom seems to be a very high-quality 3080 compared to other AIB partners' cards (https://www.youtube.com/watch?v=q_VC_00wcMU)

    • +1

      S tier

    • -1

      Guys be careful buying graphics cards from Techfast. They shipped me an RTX2060 mini when I paid for an RTX2060. Only found out when contacting Nvidia.

      • -1

        Is this a joke? An RTX 2060 mini is a 2060; it's just a smaller form factor, usually to fit in smaller cases.

        • +1

          The mini has 80mm fans compared to a full-size 2060 with 90mm fans, not to mention more surface area, and a full-size card fits in my supplied case anyway. It also affects my resale value and it's not what I paid for. I even put it in the order notes! I would have waited or upgraded rather than being short-changed.

          • -2

            @[Deactivated]: Did they advertise which 2060 model you were getting? Also, why do you think a mini will have worse resale value? Many smaller cards are the ones people seek out for specific builds, making them worth more.

            • @XiTaU: Yes, they advertised the RTX2060, not a mini version with small fans. It's why I bought from them. I've never known anyone to want a smaller card that can't dissipate as much heat.

  • +51

    HOLD!

    (But if you really need it this is not a bad deal)

    • +4

      Prices may drop further?

      • +11

        Unlikely for non LHR model as stock is depleted

        • +3

          1) Stock will recover. The writing is on the wall there.
          2) For gamers (as opposed to miners) it's far better to get an LHR model for cheaper.

        • Already dropped

      • +3

        Kidding? I have $1500aud allocated for my 3080ti

      • +9

        Prices are deff going lower. HODL!

        • +1

          No, you should sell now and buy it back at a lower price later.

      • +9

        It will drop and you will get a better card than this; one more month.

      • +16

        Yes. Crypto mining got banned in China and Ethereum is going to switch to "proof of stake" in H2/2021. Unless some other crypto currency promises similar profits, a lot of mining farms will sell out their GPUs to minimise losses.

        Not to mention that ~A$1800 is still ridiculously expensive for a GPU that was introduced almost a year ago.

        • +5

          might be able to get a second hand Chinese mining 3080 gpu for a bargain in a few months.

        • +1

          Is proof of stake going to make mining harder and therefore not worth using GPUs to mine?

          • +3

            @lycheetea: From what I understand proof of stake will remove the competitive aspect from mining and turn it into a lotto, with the odds of winning that lotto based on your total ownership of ETH. So it'll still use GPUs but the incentive changes from owning and running as many GPUs as you can afford to owning and "staking" as much ETH as you can afford.

            • +1

              @TheJerry: This is not entirely correct.

              Proof of Stake allocates transaction fees based on the size of the stake. The blockchain security is essentially then based on proof of the amount that the stakers have to lose, rather than proof of the amount of computational power that has been used.

              Once Proof of Stake is complete, there will be no need at all for GPU mining on the Ethereum blockchain. However, for Ethereum, it looks like it will be rolled out in stages, and there is probably going to be a hybrid system in effect for a while.
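              A toy sketch of the stake-weighted idea (hypothetical validators and illustrative numbers only, nothing like the real Ethereum client):

              ```python
              import random

              # Hypothetical validators and their staked ETH (illustrative numbers only)
              stakes = {"alice": 32, "bob": 64, "carol": 160}

              def pick_proposer(stakes):
                  """Pick the next block proposer with probability proportional to stake."""
                  validators = list(stakes)
                  weights = [stakes[v] for v in validators]
                  return random.choices(validators, weights=weights, k=1)[0]

              # Over many rounds the biggest staker proposes most often --
              # no hashing work (and no GPU) involved.
              counts = {v: 0 for v in stakes}
              for _ in range(10_000):
                  counts[pick_proposer(stakes)] += 1
              print(counts)
              ```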

    • +21

      Apes,Strong,Togetherrrrrrrrrrrrrr

    • +29

      Not a bad deal!? this is an insane price for a GPU. I can't believe people are paying this /end_old_man_rant

      • +5

        Some gamers have a lot of money to (literally) play with. But yes, it's insanely expensive. Not to mention that there are hardly any games that need this horsepower. PC games are optimised to run well on GTX 1060, which is still the most popular gaming GPU out there.

        • +2

          AMD FSR might also make that 1060 last a few more years.

          I'm still holding onto a 1070 laptop as my main gaming rig. It recently got me through RDR2 at 1440p, which could only be done with freesync at ~50fps, and did very well for Days Gone (high settings 60-75fps 1440p) so I haven't actually run into a game I want to play that I need better than a 1070 for. AMD FSR may make the 1070 last me a few years longer.

          • @studentl0an: The laptop 1070 was powerful; they didn't limit it like the 2060. If you don't see any artefacts then there is no need to upgrade. What 1440p laptop is yours?

            • @Wrongguy: Yeah, it's actually one of, if not the only, laptop GPUs that is faster than its desktop counterpart, as it has more CUDA cores, and I've got an Afterburner custom frequency curve hitting 1850MHz at 110W. It runs faster than a lot of 2060 laptops that are capped at 80W or less (which is a lot of them).

              The laptop is an Asus GL702VS. It's not 1440p but I have it hooked up to a 1440p monitor. I do not recommend the GL702. I need to run it with the bottom cover removed and on top of a laptop cooler, otherwise it will throttle because it has tiny vents on the bottom of the case. Repasting with Kryonaut and swapping the thermal pads for K5-Pro thermal grease did nothing, not a degree cooler.

              I'm going to use a hole saw to cut out the bottom of the case where the fans are and put some speaker grille mesh over it, as that seems to be the only way to fix the throttling issues without removing the bottom cover (there's a 100+ page thread on the Asus forums about the GL502/702 overheating, well before the TUF A15/A17 debacle). Asus have no idea how to put vents on their laptops, unlike MSI.

          • -2

            @studentl0an: No offence to you but I have a 3070 with a 144hz 1440p monitor and it’s not enough to cap that 144hz close to ultra in almost any game…

            Whatever else you said is complete bullshit and I like my games like butter not 50fps 1440p.

        • +5

          "hardly any games that need this horsepower".

          Maybe if you exclusively play indies. Literally any AAA game this year and from the last couple of years you want to run decently at 4K or high refresh 1440p "needs this horsepower." Not everyone is satisfied with 1080p (and for good reason IMO).

          • +1

            @jaejae69: VR most of all needs it. DLSS is helping, but it won't be implemented in all games. I'm waiting just a month or two more, then I'll have to buy in eventually.

            • @Malemanjam: I just went from a 3070 to a 3080 as MSFS 2020 for VR was a stuttery mess on the 3070.

              I now get no stutters with the 3080 but that's most settings on medium. Sure the 3070 is fine for pancake gaming but with VR you want as much horsepower as possible!

        • +3

          Completely false? Every big-budget game from the past couple of years would use up every ounce of power from a 3080 if you want to play at 4k or even 1440p on ultra settings at high refresh rates. So plenty of games need it. There is no amount of GPU power that can't be used up right now, and that will likely be the case forever. As soon as hardware producers eke out more power, game devs use it up and more.

          • +1

            @Xastros: The fact that these games use it doesn't mean that they need it. You can always add some expensive effects that nobody even notices.

            It would be an insane commercial strategy to create PC games that rely on the horsepower of an RTX 3080. I mean, how many gamers can spend A$1800 on the GPU alone, if they can even get one? GPUs have been a scarce commodity for many years now. If you're a game developer and want to sell into a large market, you damn well better make sure that your title runs well enough on a GTX 1060. And then maybe add some effects to give owners of an RTX 3080 the warm and fuzzy feeling that those A$1800 weren't a complete waste of money.

      • +5

        this is an insane price for a GPU. I can't believe people are paying this

        It's a unique combination of being extremely profitable + incredible gaming performance. You can mine ethereum on them when you're not playing games.

        I paid around $1150 for my 3080 and it paid itself off after 4-5 months.
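        Roughly the maths behind that, as a back-of-the-envelope sketch (ignores electricity and ETH price swings):

        ```python
        card_cost = 1150        # AUD paid for the 3080
        payback_months = 4.5    # "4-5 months" as above
        implied_daily = card_cost / (payback_months * 30)
        print(f"Implied net mining income: ~${implied_daily:.2f}/day")  # ~$8.50/day
        ```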

      • +4

        Wait until you see people spending $2500 on a phone.

        • Hahahaha yes so true.

    • Must have been my imagination.

  • +5

    Hi Luke (if you're here) - any news on upcoming 6800 / 6800xt updates re stock & pricing? Thanks for your help

    • +4

      No word on either still sorry!

      • +1

        How many of this 3080 are in stock?

    • +18

      Wow you summon Luke, you get Luke. That's impressive! Wasn't even his post.

      • Luke is omniscient my friend

      • +4

        Reminds me of Supernatural when Dean prays to Castiel :D

      • +1

        I bought a PC from Techfast a couple of years ago that had a GPU issue. Still blown away by how the dude bent over backwards to help me. Great customer service.

    • +9

      If you say "Techfast" 3 times while standing in front of your bathroom mirror and flick the lights off and on he will appear

      • +7

        🍬👨‍💻

  • they are a subsidiary of Palit

    Lol that explains why the fan shroud reminds me of the Palit Gaming Pro.

  • +3

    What's LHR? Can someone TL;DR me?

    edit: LHR = Lite Hash Rate (I couldn't find it by googling LHR, had to google "GPU LHR")

    • +6

      Lite Hash Rate, they basically cut ETH mining speed by 50% on LHR cards

    • Lite hash rate (50% less)

      E.g. non-LHR 3080: 100 MH/s, LHR 3080: 50 MH/s
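      Mining income scales roughly linearly with hash rate, so as a sketch (the $/MH/day figure below is made up for illustration, not a real quote):

      ```python
      revenue_per_mh_per_day = 0.08   # AUD per MH/s per day -- assumed, not a real figure
      for name, mh in [("non-LHR 3080", 100), ("LHR 3080", 50)]:
          print(f"{name}: ~${mh * revenue_per_mh_per_day:.2f}/day")
      ```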

    • LHR means it's probably cheaper to buy, you can't easily mine on it, and the resale value will suck a bit.

      • IDK if the resale value is really going to suffer on much PC hardware atm; there are still real supply problems and it's unlikely there'll be a glut anytime soon.

    • Low Hash Rate

  • +16

    To people wanting a new high-end GFX card who want to know some high-level differences between Radeon and GeForce, here you go.

    3080/3090: More mature RT, so you take less of a hit to your FPS to enable RT, relative to the FPS you had before. DLSS support (helps even more with RT titles). Stronger at 4k resolution. 'CUDA' support, which takes many forms, but if you can/do leverage it, it's a massive tick for the 3080. NVENC encoder for streaming (I stream from my 3080-equipped PC to a Shield Pro in the lounge; this feature is awesome).

    6800XT/6900XT: Beasts at rasterization and can dabble in RT if it interests you, but take a bigger relative hit to FPS to turn it on. 16GB VRAM over 10GB (on the 3080, but not the 3090), a plus if that VRAM is useful to you now, or you plan to keep and use the card for quite a few years. A bit stronger at 1440p and even more so at 1080p. Radeons perform better if you're CPU limited, like smashing 80-90%+ of your CPU usage in games, as GPU scheduling is done in hardware, whereas Nvidia cards need some CPU resources for scheduling.

    • +12

      DLSS and AI applications will stop me from buying any AMD card right now

      • +2

        Yep, it's similar if you work with CUDA-accelerated applications.

        • What applications use CUDA that can't use OpenCL?

          • +9

            @studentl0an: Many. As an AMD user for years, it's amazing how often you hit the walled-off CUDA garden. Sometimes there are workarounds that are OK… but most of the time it just wasn't worth it.

            Things like TUFLOW (floodwater modelling), many of the photogrammetry packages, machine learning, etc.

            You will find a lot of research projects based around the CUDA SDK as well, simply because it's easy to implement and rather powerful. Even things like accelerated ray tracing inside Blender used to suck on AMD cards.

            Now I have a 3090 I can do all the flood modelling I want. It takes about a 10th of the time (sure, the card cost 10x as much as my old one) and I can mine on it when it's not needed for work, to keep the room warm and offset its cost.

            Sure, I get this isn't what everyone uses their cards for, but CUDA is the only reason I picked up a 3090 over AMD's offering.

            • +2

              @Witchdoctr: Thanks for the explanation. While it's quite niche, you spelled out why the need is there for CUDA.

              • +1

                @studentl0an: Deep learning is essentially all CUDA (AMD ports of PyTorch/TensorFlow are not really usable).
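                For example, a typical PyTorch device check looks like this (a minimal sketch; on a card without CUDA everything silently falls back to the much slower CPU path):

                ```python
                import torch

                # Use the GPU if a CUDA-capable (i.e. Nvidia) card is present
                device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
                print(f"Running on: {device}")

                # Heavy tensor work gets placed on that device
                x = torch.randn(4096, 4096, device=device)
                y = x @ x  # matrix multiply runs on the GPU when device is "cuda"
                ```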

              • +1

                @studentl0an: No problem. You are right calling it niche, though you would be surprised at how many applications that can benefit from GPU acceleration support CUDA, because of how easy it is to implement using Nvidia's development tools.

                It is why the 3090 is actually a good deal for those of us who use CUDA. The other option is often Quadro cards, and they are crazy expensive (even if they have their own use cases). Engineering software can suck.

    • +6

      AMD are also absolutely useless if you want OpenGL (a lot of modern console emulators).
      I made this mistake a few years ago.

      • +3

        This is why I'm staying green. Shame that.

      • +6

        The emulators should update to Vulkan already; OpenGL is the past.

        • +2

          Some of them have now, or are in the process of it.

          But even then, many of them have AMD-specific Vulkan bugs, softlocks, etc.
          Most of the emulator devs seem to develop on Nvidia hardware, and AMD aren't interested in fixing driver bugs affecting emulators (often of their customers' platforms).

          • @idonotknowwhy: Yeah, that's true. But I think it's a different philosophy: Nvidia are very good at bug fixes and workarounds in their drivers for bugs in other people's software, because Nvidia know people will just blame the card rather than poorly coded or out-of-spec software or the OS.

            A clear case of this is AMD cards getting blamed for poor performance in some DX11 games, where it was clearly Microsoft's scheduler design in DX11 that was the problem; those issues are now fixed in DX12, but AMD cards still get a bad rap.

    • may be worth noting AMD has launched their own equivalent to DLSS, however game support right now is pretty limited

      • +1

        I've purposely left it out as it will be implemented on a per-game basis and will work on either Radeon or GeForce, essentially making it a moot point when it comes to choosing between the two, especially since FSR appears to carry essentially the same performance uplift and IQ trade-off no matter the brand of GPU. The other side of the coin is that GeForce owners will get both DLSS and FSR in their respective titles, but Radeon users will not get DLSS.

      • +5

        No, AMD's FidelityFX Super Resolution (FSR) is no DLSS equivalent. It doesn't even come close. Worse, FSR is no better than established temporal upscaling techniques.

        If you want AMD, buy yourself a PS5 or Xbox Series X console, if you can get one.

        • +2

          I'm running FSR now and at the quality setting it looks great and doesn't give me the ghosting that DLSS really shits me with.

        • +1

          Definitely not an equivalent. One is AI driven and can achieve image quality BETTER than that of native resolution. The other is not.

    • +5

      As much as I want to see AMD at the top again, the simple reality is that Nvidia has absolutely taken the right bet with their AI focused tensor cores on RTX cards.

      With RT and DLSS helping to push forward the state of the art in graphics quality… it'll be sure to be more and more relevant in the coming years.

      And if you're talking about games in the past and present, their performance is already more than satisfactory on anything but the biggest and most demanding monitors (4k 120Hz, or Odyssey G9).

      • +3

        A mate of mine has an LG CX OLED (4k120 + VRR) that I've had the privilege of plugging my rig into with a 3080, and honestly, 4k is right in its wheelhouse already, especially with the VRR tech doing a good amount of the 'heavy lifting' to make the experience buttery smooth even if you can't hit that 120fps mark all the time.

      • +3

        I don't agree, I think AMD has done a great job with the 6800 XT. It's faster than the 3080 at 1440p, so for a lot of people that game at 240Hz without RT it's an absolute winner, and the card is 9% faster now than when it was released a year ago through driver optimisation. You can bet they will improve on that, and at 4k it's only 3% slower than the 3080.

        • +6

          I remember reading about that 9% and I believe it's quite misleading; the test was not apples to apples, by way of including a different set of games than what was originally tested. Hardware Unboxed chimed in to say they found the margin now is ~the same as at launch between the two. Not to take away from the 6800 XT being a great choice for a lot of people, but it hasn't 'finewined' itself to be 9% faster than it was at launch.

            • +2

              @botchie: https://twitter.com/HardwareUnboxed/status/14089757675354603…

              For a constant set of 18 games, HU found the June data to be the same as December. No change in the margin.

              • @Noodles93: You haven't even read what I posted
                If you have SAM there's a big improvement
                HU uses a different measuring tool

                • +1

                  @botchie:

                  You haven't even read what I posted

                  I've read the actual 3DCenter analysis which your article is basically quoting. I'm familiar with it.

                  HU uses a different measuring tool

                  What do you mean by this? Using the same set of 18 games is the most apples-to-apples method of comparing a performance uplift, of which HU found 0%.

                  • @Noodles93: They are not using a SAM-enabled computer

                    • +3

                      @botchie: HU have tested SAM previously and found a 3% increase on average across 1080p/1440p in 36 games. https://www.techspot.com/article/2178-amd-smart-access-memor…

                      The 9% uplift you're referring to is most likely a comparison including different games from December to June as @foxpants has explained to you above.

                      If you can share a reputable hardware source (e.g. Digital Foundry, Gamers Nexus, ComputerBase, AnandTech) that can demonstrate a 9% performance increase for the 6800 XT through driver optimisation, I'd love to see it. Hardware Unboxed/TechSpot are about as reliable as they come, and they found a 0% gain.

                      • @Noodles93: Just compare recent benchmarks.

                        At launch:
                        https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/9.h…

                        vs here recently:
                        https://www.techpowerup.com/review/gigabyte-geforce-rtx-3070…

                        I can't find anything better.
                        Not sure if they used a different comp setup, but I don't think anything came out between those times to push gaming performance. I could be wrong but doubt it.

                        • +1

                          @botchie: In those 2 links you included:
                          - the 6800xt went up from 186.9 to 199.3 (up 6.6%) in 1440p
                          - the 3080 went up from 166.2 to 176.7 (up 6.3%) in 1440p

                          Both cards went up the same amount so it doesn't prove anything…

                          Go look at the game selection in both reviews. It's completely different (e.g. Anno 1800/Project Cars 3 removed, Odyssey swapped for Valhalla, Watch Dogs: Legion added)

                          I cant find anything better

                          Because it doesn't exist. The 6800xt hasn't magically improved 9% from launch mate. It would be mainstream tech news if it actually happened.
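                          A quick check of the numbers from those two TechPowerUp reviews (the 1440p figures quoted above):

                          ```python
                          launch = {"6800 XT": 186.9, "RTX 3080": 166.2}   # launch review, 1440p
                          recent = {"6800 XT": 199.3, "RTX 3080": 176.7}   # later review, 1440p

                          for card in launch:
                              uplift = (recent[card] / launch[card] - 1) * 100
                              print(f"{card}: +{uplift:.1f}%")  # ~6.6% vs ~6.3% -- the gap is unchanged
                          ```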

                          • -1

                            @Noodles93: It's actually mainstream news that AMD cards improve over time from driver updates.
                            No one said that the 3080 didn't improve as well, so not sure what you're on about there.

                            • +1

                              @botchie: Finewine is more a fan-driven term than anything, but AMD would be foolish not to adopt it, letting fans do some of that heavy lifting for them. 9% in less than a year is fairly ridiculous though; it would mean they were very unoptimized at launch. 4-7% over 3-5 years seems more in line with what Radeons have gained in the past.

                            • +1

                              @botchie:

                              It's actually mainstream news that AMD cards improve over time from driver updates

                              Sure, I'm not saying it isn't, and it might be true for older cards. But there's no specific evidence for the 6800 XT.

      • My experience with DLSS is blurry motion and artefacts; it's just a crutch for weak hardware that isn't able to run RT fast enough, something that will be rectified in the next generation of cards. I don't see much use for the tensor cores in gaming outside of upscaling, which raises the question of whether it might have been more beneficial to use that silicon space for more CUDA cores. FSR has shown there are more ways to skin a cat when it comes to upscaling.

        But you can count the games with RT on one hand, so people are fussing over cards with no real next-gen content to push them. I don't see that really changing much by year end either. Speaking as someone who uses a 2080 Ti and a 6900 XT, I have all this expensive hardware but no games to really push them.

        Unless you really need a GPU, I'd wait for the 4000/7800 series of cards which should see chiplets and a big jump in RT performance.

        • Given you could use FSR with a GeForce in any game that supports it, DLSS is just a value-add feature that Radeon doesn't get right now, and you can use or not use it as you see fit. I personally have zero issues using it; in fact I've more often found that it gives cleaner fine detail vs native, but YMMV.

          • +1

            @foxpants: DLSS is proprietary and because of that support for it might get dropped as developers look to use the easier and more inclusive FSR, especially since console versions are still a big focus for many AAA games. I don't see it being a selling point for long. AMD did incredible work with FSR.

            • @neomoz: Yeah maybe, some features have gone and some have stayed over the years but that's crystal ball type stuff. If you wanted say a 6800XT or RTX3080 today, DLSS is a feature that is on the table for some current/recent games, and some upcoming ones to the 3080's favour, as well as being in a much more 'ready-to-go' state for UE4+5, Unity and other engines. If it is going to disappear, imo we haven't even seen it peak yet.

        • +1

          You can always sell your GPUs. The cost of upgrade is significantly less as a result. And if you time things right, the upgrades can be free!

          Also, your experience with DLSS runs contrary to my own.

          Better image quality on average than native-res rendering (it removes thin-line and texture aliasing where still an issue, at the cost of some sharpness), and better frame rates. A true win-win… for me at least.

          • @Zaptruder: I very much agree there: much better FPS (up to 75% better on Quality mode alone is what I've seen with my own eyes) and much sharper fine detail, along with less shimmering on straight edges. A win-win for my tastes in image quality with a massive FPS boost, and most users that have it available to enable absolutely would; it's what you'd call a 'no-brainer'. Having said that, now that FSR is out, it seems the minimal drawbacks of DLSS are unpalatable to some, whereas FSR has no such weaknesses, go figure.
