This was posted 9 years 1 month 23 days ago, and might be an outdated deal.


3x Gigabyte GeForce GTX 980 WaterForce 4GB Video Cards in Tri SLI @ $2,099 Was $2,999 - Mwave


Good Afternoon OzBargainers,

We're clearing the last three units of this powerhouse graphics solution.
Our previous price was $2,999 and until the end of this month it's $2,099 or while stock lasts.

This special price is even cheaper than buying 3x standard GTX 980 cards, which wouldn't include the water-cooling system.
Shop Here: http://www.mwave.com.au/product/gigabyte-geforce-gtx-980-wat…

Note: Mwave is the exclusive reseller of this product in Australia.

Benchmark: http://www.mwave.com.au/images/400/ab59359_11.jpg (source: http://www.mwave.com.au/product/gigabyte-980x3-ultimate-pc-a…)

Related Stores

Mwave Australia

Comments (closed)

    • +4

      I wouldn't bother. I don't think Quake will ever have proper support for SLi.

  • -7

    All that GPU power with only 4GB VRAM.
    Have fun with that, I guess.

Buyer's remorse is guaranteed with this purchase.

    • -3

      I believe it's 4GB of GDDR5 per card, making it 12GB in total.

      • +4

        This comment and the number of negs on my previous comment are solid proof that OzBargain, whilst it's great for many things, is absolutely the last place anyone should look for technical advice.

        • I prefaced it with "I believe" meaning it wasn't fact but an assumption. Whilst OzBargain is great for many things, expecting reading comprehension from g4m3rz!1! is clearly not one of them.

        • @ausmechkeyboards:
          I assume calling me a gamer, or should I say; "g4m3r", is supposed to be an insult? Not really sure why. 'Leetspeak' insults reek of teenage angst. Probably best to avoid them.

          In any case, you don't need to be a gamer to be tech savvy.

      • +1

        Nah mate only 4 GB usable since it's mirrored.

        He makes a valid point though. A lot of games are console ports and being console ports they most likely won't be optimised for PC. Having experienced it first-hand in Watch Dogs, Ryse, Dead Rising 3, etc., my plan is to wait for a GPU with 8 GB of VRAM (same as the consoles).

Consoles have only 5GB of RAM for games, while 3GB is always allocated to the operating system. That 5GB covers both the game engine and the graphics; even if a game is very heavy on graphics, it would surely need no more than 4GB of video RAM.

Your PC at home will most likely have 8GB or more of system RAM if you're gaming; add a 2GB or 3GB video card and you'll have plenty for console ports. The poor game ports of the previous generation shouldn't be as bad now, since the consoles use x86-64 and the Xbox One uses DX11, soon to be DX12.

@FabMan: Yes, in theory we should have sufficient RAM and it should be much easier to port from the Xbone/PS4, but having played a few ports, I have not seen any examples where this is the case.

          Oh well, Bloodborne out tomorrow!

        • @leonheart1:

Just curious, were any of those ports designed for the new consoles? I'm hoping they weren't, and that the ones still coming will be better, but going by the past, probably not.

Just looked up Bloodborne, the engine looks more stable than Dark Souls'.

        • +1

          @FabMan: I would say they are.

Ryse and DR3 were Xbone exclusives, but fortunately only timed ones. WD came out on every console, but I'm pretty sure it was developed specifically for current gen, based on the hilarious ghost towns seen on the PS3 and the watered-down graphics on PC.

          I haven't seen any trailers/gameplay for Bloodborne and won't until after I play and finish it. Knowing the level is half the enjoyment when it comes to FromSoftware's games.

        • +1

Leon knows what he's talking about. PC ports are requiring more and more VRAM these days.

          With the extra power of the current consoles, the devs simply aren't optimising for PC and just expect you to have beastly settings to run their game at max settings (and high frame rates).

          It's almost pointless even looking at the ram the consoles have and comparing.

A new Titan X for half this price will serve you SO much better.

        • @Phreakuency:

I'd like to see a cross-platform game come out soon that requires a minimum of, or has recommended specs of, 8GB of VRAM on top of whatever system RAM you have. Shadow of Mordor's recommended specs are 8GB system RAM and 3GB VRAM, and it looks better on the PC than it does on the consoles too. Sure, it has a requirement of 6GB VRAM for an ultra texture pack, but those textures aren't available on the PS4 or the Xbox One.

I bet any Haswell/Broadwell Core i5/i7 system with 8GB RAM and a single GeForce GTX 980 4GB will be enough to play any cross-platform game for the entire life of the PS4 and Xbox One, not at max settings of course, but able to run it well. If you slap in more system RAM and use 980s in SLI, I bet they'll play them very well for a long time.

          I believe this because a beast system released less than a year after PlayStation 3's release is still able to play the latest cross platform games of the XBox 360 and PlayStation 3. Check out YouTube videos from Syker, the system has a Core 2 Quad, 4GB RAM and a single 8800 GTX and plays well: Far Cry 3, Thief (2014), Call of Duty Black Ops 2, Hitman Absolution, Tomb Raider (2013), Battlefield 3 online and more.

      • You only get effective 4GB VRAM when running these in SLI, unfortunately.

        The GPU power is boosted over the three cards (not necessarily tripled); however, the effective VRAM ends up being the maximum of the one card.

    • +1

Although this guy doesn't fully understand the concept of SLI, technically he is right. The total usable VRAM, despite having 3x 980s, is still 4GB. What is tripled is the memory bandwidth, which is the rate at which data is read or stored, i.e. bytes/sec. This is a common misconception. But definitely no buyer's remorse for going triple 980, unless you're stingy with the electric bill.
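The capacity-vs-bandwidth distinction above can be sketched in a few lines (a toy Python illustration, not real driver behaviour; the 224 GB/s figure is the GTX 980's reference memory bandwidth):

```python
# Toy model of traditional SLI memory behaviour: the same data is
# mirrored into every card's VRAM, so capacity doesn't add up,
# but each card reads its own copy, so bandwidth aggregates.

def sli_effective(cards: int, vram_gb: int, bandwidth_gbps: float):
    """Return (usable VRAM in GB, aggregate bandwidth in GB/s)."""
    usable_vram = vram_gb              # mirrored, not summed
    total_bandwidth = cards * bandwidth_gbps
    return usable_vram, total_bandwidth

# 3x GTX 980 (4 GB, 224 GB/s each)
vram, bw = sli_effective(cards=3, vram_gb=4, bandwidth_gbps=224.0)
print(vram, bw)  # 4 GB usable, 672.0 GB/s aggregate
```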

Changing soon though, it will be 12GB with DirectX 12 and the APIs in games.

    • That is changing, AMD and NVIDIA are working on memory stacking, so cards in CrossFire or SLI can stack their memory giving 12GB in this setup. AMD have it working under Mantle and both AMD and NVIDIA should have it working under DirectX 12.

      So, if they get it to work before the end of 2015 with a driver update, would the remorse lessen?

      https://twitter.com/Thracks/status/561708827662245888

      They do it by each card rendering part of the screen, so half each for 2 cards, a third each for 3 cards, and amazingly a quarter of the screen each with 4 cards. I wonder what happens though if one quarter has lots going on and the other 3 quarters are basic? 17fps in one corner and 60fps solid in the other 3 perhaps?
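The slicing described above (half each for 2 cards, a third each for 3, a quarter each for 4) can be sketched as a toy split-frame partition, purely illustrative:

```python
# Toy sketch of split-frame rendering: divide a frame's rows into
# one contiguous slice per card, as evenly as possible.

def frame_slices(height: int, cards: int):
    """Return a list of (start_row, end_row) slices, one per card."""
    base, extra = divmod(height, cards)
    slices, start = [], 0
    for i in range(cards):
        rows = base + (1 if i < extra else 0)  # spread any remainder
        slices.append((start, start + rows))
        start += rows
    return slices

# A 2160-row (4K) frame across 3 cards: a third of the screen each
print(frame_slices(2160, 3))  # [(0, 720), (720, 1440), (1440, 2160)]
```

Real drivers can also balance the split dynamically, which is one answer to the "busy quarter vs. empty quarters" worry above.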

That being said, memory stacking is up to the developers to optimise, and you wouldn't see games taking full advantage of DX12 until early 2016 or so. DirectX 12 is by no means a guarantee of a 100% SLI VRAM stack. That is in theory anyway. In real-world tests, you could be better off with a single Titan X than three lower-end cards in SLI.

As for rendering different parts of the screen at different fps, you would expect some form of adaptive vsync to kick in. It could be a great way for NVIDIA to market those overpriced G-Sync monitors (yes, Asus ROG SWIFT…I'm pointing at you). It is all too early to tell.

I do believe a single card is a better solution, but since AMD have this in Mantle, I'm sure NVIDIA is already working with developers to have this released with DirectX 12 games. Probably not many, mind, as only a few games support Mantle, though it has had great benefits for AMD GPU users. Also, there's no reason why it cannot be patched into any DirectX 12 game at a later date.

          The vast majority of games do not require more than 4GB of VRAM to run at Ultra settings, so by the time they need more hopefully this solution is realised. We will see though.

Thing is, my 7950 3GB Boost is fast enough for any game I've played at Ultra; in a year's time a second one will be cheap, and having them run with 6GB of RAM would be awesome.

    • Fail. 4GB VRAM is fine, even for 4K. GPU computational power and bandwidth is the current bottleneck.

      Oh, and garbage game optimisation (Like COD:AW) where you load all irrelevant textures into VRAM just cos? Doesn't count.

      • Even if 4GB was currently adequate, across the board, for 4K, it most definitely won't be in a couple of years. Anyone buying three 4GB 980s will run into VRAM bottlenecks long before GPU bottlenecks.

  • 4GB per card

    • -1

      As the posts above state, in SLI 4GB is the max you will be able to use.

Obviously those posts weren't up when my post went up. Either way, taken: 4GB available when in SLI.

  • +1

    Yes! Finally, minesweeper at 60 fps!

  • +2

Ah… this will double as a heater for the coming winter cold.

In Tas, you could probably skip buying a heat pump/reverse-cycle aircon and just let one of these babies run all winter/half the year. Save on buying the heat pump AND get your 60FPS at 4K ultra whammy HD at the same time. Win-win for all.

    • +2

Would you like an FX-8350 to pair with your GTX 980s in your whole-room heating system?

      • 8390 I think

      • +6

        He's trying to heat his room, not burn his house down.

or go with an FX-9590 for another ~100W of heat

  • Finally!
    I've been looking for 3 of these!

  • Are there Windows 3.11 drivers for these?

    • +2

      Yes, the drivers will be shipped to you as well. Prepare to load roughly 215 floppy disks (1.44MB high density dual sided floppies).

  • 65 +ve, has anyone bought one?

    • +2

Close, but how life-like can graphics get? Decided to buy a mirror instead.

      • +1

        if you spend the same on a mirror, that means you'd see 2K

        this gpu setup will be able to run 4K

        therefore, you should buy this instead

      • +22

I tried 'outside'. The graphics were really good — definitely next-gen stuff, but everything seems to be pay-to-win and you only get a single life, there is no fast travel, the NPCs don't give you any quests, and the story was lame. Also, where are the enemies?

        Verdict: Amazing graphics but terrible gameplay. 2/10.

  • This "bargain" reminds me of this watch
    https://quriosity.files.wordpress.com/2011/06/amazonwatch.pn…
    The reviews at the time suggested with the money you save you can buy a new car to go with it.

All good for the OP to sell well below their cost (even though it will probably be obsolete in a few months).
I can hardly believe the 90+ people who have upvoted this are interested in purchasing it or really consider it a bargain though.
    https://s3.amazonaws.com/pushbullet-uploads/udcry-wUDPUwB1kL…

So touchy. Maybe I should have written "tech heads".

Sorry, I would go for a 4K 65" Sony TV instead in this price range

  • What just happened!?

  • Has the price gone back up? DAMMIT! I was about to buy one!! :P

  • 128 comments and almost 2600 clicks for just 3 units that must be sold together, meaning they're only selling to one person.

    The world can be so funny sometimes.

    • +1

      3 units that must be sold together, meaning they're only selling to one person.

      The units don't need to be sold together. Each 'unit' is a system of 3 x 980s.

    • +1

      3 graphics systems in total

  • So….how many eneloops does this use per hour again?

    • +4

      About 300.

      Source: I don't know jack about electricity.

  • will probably need one of these for the witcher 3

  • +2

    ozbargained - out of stock

It is though… a terrific deal for what are typically three very expensive products.

    I'm just curious how many people actually bought this.

They apparently only had five units in all of Australia and Mwave sold all of them.

  • Damn, I missed it again!!!

  • I missed out…. imagine how much this thing will be worth in 10 years….

    • I know right.. it is sad to even try to know :( haha aww i just made myself sad
