• expired

Biostar GeForce RTX 3080 GDDR6X 10GB Video Card $699 (Was $799) Delivered @ Mwave

595

These aren't too easy to come by brand new anymore, but compare to some 40/50-series models per StaticIce:
RTX 4080 starts at $1,800
RTX 4070 starts at $919
RTX 5070 starts at $899

To clarify, free nationwide postage is on orders above $99

Surcharges: 0% bank transfer, BPAY, Afterpay, 1% for credit/debit card & Zip.

Very well may be limited stock, and fwiw Mwave has a March Madness sale going on, so have a browse around anyway. Just don't look at RAM prices.

Some other notable mentions for nVidia GPUs:
PNY GeForce RTX 5070 ARGB EPIC-X RGB Triple Fan OC 12GB Video Card $929 @ Mwave
Gigabyte GeForce RTX 5070 WINDFORCE OC 12GB Video Card $949 @ Mwave
Gigabyte GeForce RTX 5070 EAGLE OC 12GB Video Card $939 @ Mwave
Palit GeForce RTX 5070 INFINITY 3 12GB GDDR7 Video Card $929 @ Mwave

Related Stores

Mwave Australia

Comments

  • +30

    Blast from the past

  • +3

    This 3080 or a 9060XT?

    • -1

On paper the 3080 wins, but I haven't followed the state of AMD since they were pushing out the RX 500 series.
There was a discussion here: https://www.reddit.com/r/buildapc/comments/1m7ry6p/rtx_3080_…

    • +17

      3080 is probably 25% faster but uses 100% more power

      • -1

These can be undervolted to drop 100 watts and lose only ~3% performance.

Yeah, I think I got mine to about 210W and performance was the same, with a bump in memory and core OC alongside the UV.

        • +10

          And the 16GB 5060 Ti which sits 9-11% behind a stock 3080 10GB can be increased 10% for just 15 W, so this is a terrible buy.

          Sales for 8 GB 5060 Ti, 16 GB 5060 Ti, and the RTX 5070 all represent vastly more value, as do the recent RX 9070 XT sales. People should save their money.

          • @jasswolf: Agree. I would go with the 9070XT as mentioned below in the comments.

        • Ampere is the worst undervolting generation in a long time due to the Samsung 8nm node. If you won the silicon lottery you could maybe take it down to 230-240w with similar performance to stock, but 200w is too far. You could do that on a 6800XT though.

          Also, virtually any TSMC GPU is going to be able to drop its power 30% and maintain stock, so even if Ampere could do that, then it's just the same as the rest of the competition?

It's called undervolting, and is this the non-LHR version?
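The power/performance claims in this sub-thread can be sanity-checked with quick arithmetic. This sketch assumes the commenters' figures (a ~320 W stock board power, ~100 W saved via undervolt for ~3% performance loss); these are claims from the thread, not measured data:

```python
# Sanity-check of the undervolting figures cited above (commenters'
# claims, not benchmarks): ~320 W stock, ~100 W saved, ~3% perf loss.
stock_power_w = 320
uv_power_w = stock_power_w - 100
uv_perf = 0.97  # relative to stock performance = 1.0

# Perf-per-watt of the undervolted card relative to stock
gain = (uv_perf / uv_power_w) / (1.0 / stock_power_w)
print(f"Undervolted: {uv_power_w} W at {uv_perf:.0%} of stock performance")
print(f"Perf-per-watt vs stock: {gain:.2f}x")  # ~1.41x
```

In other words, if the claimed numbers hold, the undervolt buys roughly a 40% efficiency improvement for a barely noticeable performance cost.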

    • +6

      9060 XT is only $600 but a little slower (in some games, faster in Forza). Not noticeable without an FPS counter onscreen in most games. At least according to this:

      https://www.youtube.com/watch?v=t_4cwbhd8Es

      I wouldn't worry about less VRAM in an actually-more-performant-despite-that card, since the whole point of having more VRAM is… better performance. But the 9060 XT uses a lot less power too.

      Probably comes down to how much you want the NVIDIA exclusive features like DLSS and CUDA, or those few extra FPS in those few games where it's 15% higher, and whether that's worth the extra money (and power costs).

    • +3

      5060 Ti, and turn on DLSS 4 or 4.5. You'll pull ahead of the 3080 just on that, and likely pull ahead further over time.

3080 has DLSS 4 natively without any real performance decrease relative to the 50 series, and even gets multi frame generation using the DLSS Enabler mod if you're into fake frames. It can also do DLSS 4.5, but with a performance hit compared to the 50 series. DLSS 4 looks fantastic as is anyway.

        • +1

          DLSS 4 on the RTX 3080 has enough of a performance decrease compared to the 5060 Ti that they perform the same, and that says nothing for ray/path tracing performance. DLSS 4.5 it pulls ahead.

          The FG injector you're talking about is FSR 3 Frame Generation, it is not multi-frame gen. Get off ChatGPT, and actually go look at the performance numbers for the 16GB and 8GB 5060 Ti. Ok I can now see the custom frame generation tech you're describing, but it will have massive issues with frame pacing, and more artefacts than what FSR 3 frame gen already provides. That's another tick in the RTX 50 column if that's a feature you're interested in.

          I'd personally say just wait for an RTX 5070 sale, but I've been saying that since launch to mass downvoting and ranting. The RX 9070 XT sales are also great if you only care about non-upscaled raster.

  • +6

Remember when it first came out, it was hot property with all the mining lol.

    • +3

      I can cash out my NFTs to buy this

    • +4

      Got a $2k rig from Techfast with 3600/3080 and mined for 12 months, and it paid itself back after power. Good times.

      • +2

        if you kept all the btc from nicehash and sold late last year, you would have got $10k worth out of it.

        • +2

          dont remind me haha

    • +1

I'm guessing you weren't there when AMD released the 7970/7950… it was crazy for the miners at the time.

  • +5

    Can be found ~$500 used on Facebook Marketplace

    • +9

      Not really that much of a discount on a heavily-used years-old card tho I guess

    • +1

      Marketplace must be good in your area. Offering an abundance of slightly cheaper warrantyless used 3080's. Especially if they were used for mining.

Yup, sold mine, the LHR version, for $500.

Sold mine, the non-LHR version, for $1500 six months before the ETH merge.

    • +5

      Would not trust the secondary market for any of the decently-specced 30-series cards. They released during the peak of the crypto farm hype and likely got thrashed. They were also VERY hard to come by at the time so only enthusiasts were able to get their hands on them.

    • Listings north of $600 in Melb at least. Somehow getting Covid 19 vibes.

  • +6

If I liked one thing about this period of GPUs, it was all the old brands coming out of the woodwork lol. Biostar, Manli, Leadtek etc.

    • +2

      yea some of these names i havent seen since early 2000s

      • +14

        Leadtek was the GOAT.

        • Prettiest covers on the GPU!

  • +3

The RTX 3080 is a hot beast, so hope your case has space; I'd go for something newer.

    • +3

      I'm not sure about this card specifically but my old 3080 ftw3 model wasn't bad.
      Just set a slight undervolt and you're good, IIRC my temps would sit around low-70s with the undervolt @ 1950mhz during cpu + gpu stress test in an nCase m1.

      Also, 30-series was the last to still use current balancing across the power cable wires, so no risk of the card pulling 350 watts through a single wire.

      • +1

Issue was the lack of heatsink size (even though they were big for the time). If you look at the heatsink size of 5070 Tis and higher-end 9070 XTs, they are massive compared to an FTW3. I think the manufacturers knew they had undersized heatsinks for the 3080, and the exhausted heat was the cause of instability in some thermally challenged scenarios.

        • I'm now using a 5070ti (ventus x3) in the same case and get almost identical temps.

          Having said that, I've removed the stock shroud & fans from the 5070ti and I'm using some noctua fans, set up as exhaust, to dump the heat out of the case rather than intake.

          Even if someone doesn't have great airflow in their case they should be able to run slightly lower clock speeds around .830v and it will run much much cooler

    • +3

      As a 3080 owner I can confirm they run hot.
      Granted, I have an aluminium/glass case which might not help.
      However, I had to take off my side panels just so my pc wouldn't crash from overheating.

    • +1

      Get an 'open' case like the TT Core P3 and you've got a free heater keeping the room warm~

  • +4

    Is this card 6 years old?

  • +2

    Damn, Biostar is a name I haven't heard in a while!

    • +4

      Run bioshock on a Biostar for Bioception

      • +3

        Bring us the graphics card and wipe away the debt.

    • +1

      Is that you master Kenobi?

  • +8

    Sorry, but this car is 14-23% faster based on 3 different reviews at 1440p vs the 9060 XT 16GB.

Which is $200 cheaper and uses significantly less power, plus it's not 6 years old.

    • +18

      I'm currently tossing between this and a Tesla Model Y, which is the faster car?

      • is any of them red?

        • which one uses less shocks

    • +2

It also has DLSS 4, RTX Video Super Resolution (in browsers or VLC etc.), CUDA rendering (for programs that need it), the NVENC encoder, Nvidia Broadcast and more.

      DLSS Enabler also allows multi frame generation on 20 and 30 series cards too: https://www.nexusmods.com/site/mods/757

    • -1

      Cheapest 9060 XTs are around 599 now, right? So only 100 bucks cheaper?

      • +4

$529 for the PowerColor Reaper 16GB

        • +1

          Not bad.

          Wonder why it's not showing on pcpartpicker

  • +3

    I paid $300 and got a 3070 in marketplace. It was a good deal

  • What's a good price for a used 3090 nowadays?

    • +2

Prices on marketplace are bonkers! Still going for $1500; you'd be better off with a 5070, or for a bit more, a 5080.

      • Depends on use

        3090's are being used for AI compute. pair them up for 48gb.

WOW, good price for brand new… just paid $500 for a second-hand MSI Strix RTX 3080 from marketplace!
Sold my year-old (to me) second-hand 2060S for $175.

Pored over the price-per-performance comparisons for ages;
a 3080 seems to be maybe 10-20% less performant than the 5070, but a LOT more than a 5060…

    and a LOT less expensive too

  • +8

    Did this fall off the back of a cryptomining truck? How are there brand new 3080s available in 2026?

    • +1

You can get brand new stock of most cards. Stock is always sitting somewhere.

    • +3

      mwave were stock HODL during cryptocraze

  • +16

10GB and no fake frames. Do the maths on what you need. I would go with the 9070XT if that $8XX deal is still available. $130 to $150 more is a no-brainer for a 16GB latest-gen card that runs cooler and performs better.

The 12GB 3080 Ti and 12GB 3080 are better. 10GB is not future proof.

  • +3

Needs to be $100 cheaper

  • Is 4070 better than 5070?

    • +2

No, I'd say the pricing is just an anomaly due to the 4070 no longer being in production.

    • +1

Performance-wise, nope, unless you're playing PhysX games. But a lot of 4070s still use 8-pin power connectors, so if you don't want to worry about a 12-pin connector you could go the 4070.

  • +9

    A 5070 is worth the extra $200 in my opinion. Better energy efficiency, higher performance and DLSS support.

    Maybe if the price difference was more like $300-$400, but at $699, that is just too high for a 3080.

    I did pick up a 3080 for my wife's old machine off FB marketplace for $400 about a year ago. There are heaps of good second hand deals given how old the 3080 is these days.

    EDIT: Also your price numbers for the 5070 and the other GFX cards are overinflated and based on regular pricing.
    Just look at previous deals on OzBargain, the last few deals had 5070s selling for $750-$800, which completely blows up this $699 deal. Given that, I see no deal here at all.

    • -4

      Even the 16GB 5060 Ti is a better buy than this, but agree with your take.

  • +3
    • +1

I think that's a US-imports issue with returns though. Someone correct me, but I think you pay to return it to the US if it has an issue. Same thing with hard drives; there have been a lot of bad reviews about legitimacy (likely due to bundling bad and good product together), probably just hoping that people don't notice or bother.

      • +1

        damn fair enough. but I still don't think it's worth $699 when for under $200 more you get a much better 9070XT

  • +6

    Ddr5 -ddr4-ddr3
    2tb ssd-1 tb-512g
    5080-4080-3080

    What kind of life is that

    • AI life.

3080 is a trashed miner card. Get a 9070xt from Mwave brand new and with warranty.

  • +4

Just remember, these older cards are supplied via RMAs and returns… You could be buying a nightmare in a box.

I have a water-cooled one in a box I haven't had time to set up for the past 2 years. I thought it was irrelevant now, but this gives me hope.

  • +2

    HOOOOODLLLLLL!

  • -1

    Just buy an RTX 5060 Ti… it might have slightly poorer native performance but it will be better at ray tracing, path tracing, and through DLSS 4 and 4.5, as well as having future support that will see it pull ahead.

    You can even save money with the 8 GB version for 1080p or budget 1440p gaming.

    • The mindless downvotes are insane… the raster performance drop for the 16GB 5060 Ti is 10% compared to the 3080, and that's with stock clocks (you can boost performance 10% without increasing typical power draw more than 20 W). The 3080 also has a stock power draw of 320W, compared to 180W on the 16GB 5060 Ti.

      The 3080 loses 10% performance switching from DLSS 3 to DLSS 4, and then falls further behind with DLSS 4.5… this is a terrible purchase. The 8GB 5060 Ti with a simple OC on the same power budget would beat the 3080 at 1080p with DLSS 4 and higher.

If you don't like the idea - I don't either - look at the RTX 5070 for path tracing and upscaling, and look at the RX 9070 XT for pure raster. Stop turning this site into even more of a pile of slop than it already is with all the sockpuppeting and dog whistling.

      • -3

        You're nVidia's perfect customer :)

        • +1

          No, I'm a pragmatist that understands the technology. AMD have eroded all consumer confidence in their GPU lineup because they stubbornly refused to follow suit when it became clear they needed to pursue genuine AI and RT acceleration in their architecture.

          They took a losing position (CUDA, enterprise) and lost even harder, and now they can't afford to compete in terms of software development. Their CPU gains are slowly eroding now, and their console wins have long been a pyrrhic victory because of the sheer undercutting.

          The only saving grace they have right now is that Sony and Microsoft might tip money in to close the gap, but there's every chance some of that will be vendor locked.

          • +2

            @jasswolf:

            AMD have eroded all consumer confidence in their GPU lineup

            And GeForce hasn't?

            • +1

              @CrispyChrispy: Comparatively, despite the marketing and technical disconnect with DLSS 5, no.

              AI is worth pursuing, but that doesn't make all attempts and implementations of it worthwhile. That's science and engineering in reality: not everyone is going to pull in the same direction all the time.

              In terms of the value of DLSS 5 so long as it is trained on data or sourced from models that have been appropriately licenced, it's a reasonable entry point into neural rendering. Clearly they're having adoption issues with RTX Kit, so this bridges the gap between something like the functionality of RTX Remix (which requires a forward rendering pipeline), and a neural render pipeline where they are more manually assigning properties to models, assets, environments, physics, etc.

              In time, whatever they call this component of DLSS 5, it will be more of a fallback option, much like how games currently use raster/screen space, and/or voxel, and/or simplified mesh methods as a fallback for lighting when hardware ray/path tracing isn't a performant option. It may also be the backbone of a tool to help automate assigning PBR properties to materials in a scene during development.

              It's also not just about photorealism, but that is obviously far easier to train a general model for today. The hallucinations it made were mostly benign, the obvious exception being how it aged a teenage Grace in the RE: Requiem intro, and I doubt any serious developer effort was put into refining the output.

          • @jasswolf: just wait 5 years for 9090 Ti.

          • @jasswolf: "Their CPU gains are slowly eroding now" Lmao you lost me there big time, stop shilling.

            • @MaximusMilkshake: Hardware wise, that's what's happening. Market share won't turn quickly even if it continues.

              Intel have been building up their foundry capabilities after almost a decade of flushing it down the toilet for dividends, and AMD are getting attacked from all sides in the PC and enterprise market. This isn't news, you're just huffing subreddits, which is what made you come chase down a random comment days later.

  • +1

Is the 5060 Ti 16GB even available anymore?

Does anyone know whether Biostar makes these from refurbished miner cards?

  • +1

WTF is with the 10GB VRAM crap? I know we're in a price hike, but seriously, how much more would it cost with 16GB? I'd rather pay more than have an 80-series card with only 10GB of VRAM.

    • +3

      It's worth keeping in mind that the RTX 3080 is a 5-and-a-half-year-old GPU now.

Without launching into a really long-winded explanation, VRAM capacity is a direct function of the design of the silicon. GA102, the die used in everything from the 3080 through to the 3090 Ti, was designed with a 384-bit memory interface. The connection out to a GDDR6X module is 32 bits wide. That means NVIDIA had 384 / 32 = 12 "lanes" which they could connect VRAM modules to. At the time, 8 Gbit (= 1 GB) GDDR6X modules were the most commercially viable (at least from what I recall).

      For a 3080, to improve yields and keep costs lower, they lopped off two of those "lanes" for a 320-bit interface, giving you 320 / 32 = 10, and 10 * 1 GB = 10 GB of VRAM.
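The lane arithmetic described above reduces to a one-liner. A minimal sketch, assuming one 1 GB module per 32-bit lane as the comment describes:

```python
# VRAM capacity from memory bus width: each GDDR6X module occupies a
# 32-bit slice of the interface and (on these cards) holds 1 GB.
def vram_gb(bus_width_bits, bits_per_module=32, gb_per_module=1):
    """Number of memory 'lanes' times per-module capacity."""
    return (bus_width_bits // bits_per_module) * gb_per_module

print(vram_gb(384))  # full GA102 interface: 12 lanes -> 12 GB (3080 Ti)
print(vram_gb(320))  # 3080's cut-down interface: 10 lanes -> 10 GB
```

(The 24 GB cards on the same die reach their capacity differently, by populating more than one module per lane, which this simple model doesn't cover.)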

      • Did the original 3080 only have 10gb?

        • +2

          Yep. A 12GB model was released about 16 months later that also had a few more CUDA cores, but was still called a 3080.

  • +3

    Tiny amount of slow VRAM for $700, power hog, and generally lower FPS/$ than most cards on the market (only seems better value than the Nvidia high-end).
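The FPS/$ framing above can be made concrete. This sketch uses prices and relative-performance figures claimed elsewhere in this thread (the "14-23% faster" and "10-20% less performant" comments, and the "$750-$800" 5070 sale prices), taking rough midpoints; these are commenters' claims, not benchmark data:

```python
# Rough price-per-performance comparison using figures claimed in this
# thread (midpoints of the quoted ranges), not measured benchmarks.
cards = {
    # name: (price_aud, performance relative to the 3080 = 1.00)
    "RTX 3080 10GB":   (699, 1.00),
    "RX 9060 XT 16GB": (599, 0.85),  # ~15% behind the 3080
    "RTX 5070 12GB":   (775, 1.15),  # ~15% ahead, midpoint sale price
}
for name, (price, perf) in sorted(cards.items(),
                                  key=lambda kv: kv[1][1] / kv[1][0],
                                  reverse=True):
    print(f"{name}: {1000 * perf / price:.2f} perf units per $1000")
```

On these (claimed) numbers the 5070 edges out the 3080 on value, with the 9060 XT close behind, which matches the general sentiment in the thread that $699 is hard to justify.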

  • +3

    I love my 3080 10gb, but wouldn't buy one new at this stage if you can wait a while for prices to stabilise

    • +1

      Agreed.

      Though (crazily) prices won't necessarily stabilise…

    • +1

I've been amazed at the price/performance on offer with my recently acquired $500 marketplace one. You'd need to spend double to get notably more performance;
heck, you'd need to spend 50% more just to get similar performance from a modern card.
Not a massive reason to upgrade really, unless you're really chasing 4K everything maxed out, which Fortnite actually runs at on the 3080!!
(I'm not a big gamer, so the CUDA cores were the draw for me, for CAD rendering.)
