• expired

INNO3D GeForce RTX 2080 X2 GAMING OC $1079.55 + Free Express Shipping @ Quick Computer Services eBay

PROPER10

This has been seen brand new for around $40 cheaper on eBay 20 days ago, but it is currently the cheapest one on eBay.

Original PROPER10 10% off Sitewide on eBay Deal Post

Related Stores

eBay Australia
Marketplace
quickcomputerservices.com.au

Comments (closed)

  • +1

    I thought it was two RTX 2080s for $1079.55 when I first saw the title…

    • I think that's the product name.

  • +2

    I really hope AMD comes out with something soon that rivals these cards at a far cheaper price point. Nvidia really needs some competition! Thanks for the post.

    • +1

      Assuming AMD have learned from the Vega 64, the new 7nm Vega (whatever they call it) has the potential to be a game-changer. With 60% off-the-top power savings just for being 7nm vs 14nm, and a bunch less heat, it's a good start. Fingers crossed AMD have got it right. I'd love to see a version taking down a 2080, with less power and heat.

      • Blasphemous! 😉

      • +1

        It's more likely that they will be GTX 1080-equivalent performance at a GTX 1060 price, which might be a shot across the bow of the RTX 2070 if developers are sluggish with optimisation and feature implementation in games, and NVIDIA doesn't drop prices. Both will likely happen before Navi.

        2020 is the real battleground, because I don't think AMD will use what little HPC silicon they'll have available to deliver high-end GPUs over CPUs in early 2019, and that's not really where the volume sales are in PC gaming anyway.

        • +1

          You're probably right. My optimism sometimes gets the better of me, sorry. ;)

          • +1

            @[Deactivated]: It's still a big step forward for the industry, but AMD are definitely playing the long game.

            They'll need to make similar advances to what NVIDIA have with hybridised raytracing in order to stay competitive, but that's a problem the entire GPU industry has been working on for over a decade.

            Intel will also enter the market around then, so we will start seeing much healthier competition in most segments.

            • @jasswolf: I don't see the need for AMD to provide ray tracing just yet. Yes, it looks good, but developers have to support it, which only a few have done, and even a 2080 Ti doesn't offer high fps when utilising ray tracing for reflections. As for the AA benefits, they become less necessary at higher resolutions; conventional GPU cores can cover that, and AMD can attempt to introduce better algorithms on existing hardware.

              Look at the Cyberpunk 2077 trailer and you'll see that upcoming games can look awesome without RT.

              The RT cores on Nvidia's 2xxx series use up a significant amount of silicon, which either reduces the area available for conventional performance or costs consumers more money, and larger dies are more prone to expensive failures. AMD don't need to compete with the 2080 Ti; it's better if they compete with the 2070 / 2060 and win that market, since it is the largest share of the market.

              I'd say it is smarter for AMD not to invest in RT tech until it is much more common in games and a future midrange GPU can utilise it effectively. It's kind of like PhysX: yes, some games use it, but when it's not active, does it really matter?

              • @FabMan: This is now the ramp-up period for mainstream raytracing adoption in 2020.

                As 7nm EUV should offer double the density of 16nm/12nm at the bare minimum, you'll see NVIDIA using that die budget to add even more RT and Tensor cores relative to the traditional rasterisation units, as DLSS should be able to pick up the slack for 4K 240Hz.

                AMD at the very least will have some sort of ASIC-like circuitry for AI acceleration in Navi, or the next gen consoles will get left behind in terms of graphics power and upscaling techniques.

                • @jasswolf: We'll see. With AMD being used in current mainstream consoles and in the next generation of consoles too, game developers would have to put in extra work to get ray tracing just for PC games, and AMD may not support it in their next generation of GPUs either. Will publishers sell enough extra copies of games by spending money developing ray tracing as an optional feature? If Nvidia were in one of the next consoles, they probably would, but they aren't, so I can't see many publishers / developers doing so.

                  So I imagine some games will support the tech, but not as many as some hope. PhysX is an example of this: it makes gaming a little nicer in the games that support it, but not that many games do.

                  If AMD keep the GPU architecture similar, but with greater performance and a better AA option to override in the drivers, they could be a smashing success.

                  • @FabMan: I think you're significantly underestimating how simple it is for a company that makes more general-purpose processors (in this case GPUs, NPUs, as well as x64 CPU cores) to make ASIC components.

                    The only thing stopping AMD from shoving their own RT cores into the next consoles is how quickly they can optimise the mathematical processing involved using their existing architectural work. Rest assured their R&D teams have been working on this problem for about the same amount of time as NVIDIA.

                    PowerVR is another company that has been trying to solve this problem. Apple is no doubt taking on the same challenge now that they do things in-house. ARM, Qualcomm and Intel too.

                    By no means am I saying everyone with a Pascal card should drop everything and buy Turing, but what we're seeing is the culmination of 2 years of full-scale hardware development, backed by over a decade of research and something like 40 years of academic study. To suggest that raytracing and pathtracing are a fad is peak internet commenting.

                    • +1

                      @jasswolf: Okay, I don't agree with your assessment and your guesses, and that is okay. We'll see what the future brings; you don't know and neither do I.

      • Why are you assuming a 60% power saving? From what I've read, they aren't using 7nm transistors in the 7nm chips; it's rather a marketing term indicating that the process is the one that will eventually lead to 7nm transistors.

        • If they do, it's dead in the water.

          The suggestion is that they'll use a slight variation of the low power node currently in phones, hence the 125W TDP target.

          That hardly explains Zen 2 and the 7nm Vega offering either.

          EDIT: wait, are you confused about whether 7nm is actually half of something like 14nm? In that sense they are marketing terms, but the power saving should be up to 60%.

          • @jasswolf: From what I've read, companies are promoting the process size they are using a little early. It's supposed to be named after the minimum feature size, but they don't do that anymore. The example I used was Samsung promoting their 10nm process when it should have been called 16nm or 18nm; they defended it by stating that the technology used will eventually shrink down to 10nm. So predicting the energy / performance benefits is tricky based on the process naming alone. I don't know if AMD or TSMC are being honest in their naming conventions; they might be.

            As for 14nm to 7nm, assuming everything shrinks consistently, the GPU chip should quarter in size (rough arithmetic below).
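
            A quick sanity check of that quarter-size figure, as a back-of-the-envelope sketch only: it assumes the "nm" labels really track the linear feature size and the whole layout shrinks uniformly, which real processes don't quite do for SRAM, analogue and I/O. Die area scales with the square of the linear dimension, so

            \[
              \frac{A_{7\,\mathrm{nm}}}{A_{14\,\mathrm{nm}}} \approx \left(\frac{7}{14}\right)^{2} = \frac{1}{4}
            \]

            i.e. the same transistor count would fit into roughly a quarter of the die area, or twice the transistors into half the area.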

            • @FabMan: You can say the same thing about Intel, but in terms of the benefits, it is genuinely a power reduction of that amount for the same number of transistors running at the same clocks.

              • @jasswolf: Intel does currently abide by the minimum feature size when naming its nm processes.

                • +1

                  @FabMan: Oh now I see: you're referring to the ITRS guidelines. Yeah, those have been ignored for a while now due to the nature of the low-power nodes that have been getting pushed through every couple of years.

                  They don't tend to follow the same design rules, as the thermal and design requirements are very different to typical HPC environments. I mean, even Intel aren't currently abiding by them for 10nm, if the first Cannon Lake chip is anything to go by.

        • @FabMan @jasswolf you guys seem knowledgeable - what card is best for a $1k budget? Are there price drops looming that are worth holding off for? Appreciate any help!
          Will be used with a PG279Q :)

          • @winhhh: Just to confirm on this $1k budget: is that for the entire PC build including the monitor, or just for the GPU?

            What parts, if any, do you already own? Curious if you have a GSync or FreeSync monitor already.

            I assume you'll be playing video games and it isn't for professional work, but what type of games: fast esports games or big open-world games like Assassin's Creed Odyssey?

    • It's really the mining boom that (profanity) things up for AMD. The cards were better for mining, so their prices skyrocketed compared with Nvidia's (although Nvidia's skyrocketed as well once they could be used for mining).

      When you compare RRPs, the AMD cards are usually better value. Now that the mining boom is over, I think AMD will make a comeback with gamers.

      • +2

        I don't see how mining hurt AMD; they were selling cards to miners instead of gamers, but they still sold cards. AMD just didn't make any extra money from the sale of the cards; the retailers did.
