[Pre Order] eVGA GeForce RTX3080 XC3 Black 10GB $1139, eVGA GeForce RTX3080 XC3 10GB $1169 + Shipping @ PLE

Related Stores

PLE Computers

Comments

  • I'm going to wait until Black Friday

    • Same, I have a 2070 Super, so not in a hurry. Especially if Cyberpunk 2077 can run at 4K with DLSS enabled at 60+ FPS

      • is that 4k 60 with dlss using a 2070s?? wow

        • Yeah, I'm saying IF I can hit 60FPS with DLSS enabled, then I can probably wait for next-gen graphics cards.

          That doesn't mean these are a bad deal for others. Technically these are default prices, but if you're still running a 900 series or lower and have been waiting for an upgrade and still have an old Core i7 but can't do a full system upgrade, then this is a massive boost in performance.

          Here is a review showing the CPU scaling at 1440p and 4K; the summary is that as long as you have a 4-core, 8-thread CPU or better, you should be fine until you can afford a better overall system later:

          https://youtu.be/SNMVikiHCXg

          If you're on an older i5 or older AMD system with only 4 cores and no hyper threading, then there's a bigger hit on performance.

        • DLSS actually boosts FPS through enhancement.

          So without DLSS you'd get fewer FPS at the same resolution than with DLSS…

          • @nsuinteger: DLSS boosts FPS by running at a stupidly low frame rate then upscaling using an algorithm trained on that specific game.

            It is in the name, Deep Learning Super Sampling.

            • @This Guy: Just had a quick look as I didn't think that was the case, and you are wrong there.

              This is not a “between-frames” Frame Rate Amplification Technology (FRAT) like the other technologies. All frames are rendered lower resolution and then upscaled using AI.

              The frame rate is not interpolated/smoothed when using DLSS, it's native. The picture is upscaled from a low resolution using a variety of different techniques working in tandem though.

            • @This Guy: Deep learning doesn't work on frame rate, it scales individual frames with something kind of similar to waifu2x
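The per-frame upscaling described above can be sketched in pixel terms. A minimal sketch, assuming the commonly reported per-axis render-scale factors for the DLSS quality modes (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 1/2 - these ratios are assumptions here, not figures from this thread):

```python
# Sketch: DLSS renders each frame at a lower internal resolution and
# upscales it to the output resolution. The frame rate is native, not
# interpolated. Scale factors below are assumed per-mode values.

SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output size and DLSS mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output in Performance mode is rendered at 1080p internally,
# i.e. only a quarter of the pixels per frame, then upscaled.
print(render_resolution(3840, 2160, "performance"))
```

So the FPS gain comes entirely from rendering fewer pixels per frame, which is why there is no interpolation or smoothing involved.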

        • The problem will be ray tracing. It will look great with RT on, but FPS will be terrible; with RT off, FPS is good. So 2070S users will be stuck in a big dilemma.

      • Yep… me too.
        I'm running a 1440p monitor, so I think Cyberpunk 2077 should be fine with the 2070S.

        I'll wait for a sale &/or wait for the Super or Ti & also see what AMD can offer.

        Lots of big updates coming through in the next few months.
        Exciting Times.

    • You probably will be waiting that long before you actually get the card if you order now…

      • Glad I grabbed a 2070 S XC, 2-slot 2-fan.

        eVGA's designs are getting worse each year, especially that FTW3.

        • It's basically the only 2-slot 3080 on the market; there's a small clock-speed penalty to maintain that format. For anyone rocking a small form factor build, it's the only option.

    • Last year's Black Friday didn't have any deals on the new 2080 / 2080 Ti. On top of that, with this year's stock shortage you won't see any deals at all.

  • Well, actually the price is not that bad… or even a bit good?

  • FTW3: https://www.pictr.com/images/2020/09/17/7GNQAc.png It's been like this for 10 mins now. I don't want to refresh as it's grabbed my card details already :S. No confirmation email yet though.

  • Can't checkout.

    "There was an issue with payment"

    • PayPal kept giving me that error. I changed to credit card and tried about 100 times.

      After that I got all kinds of errors:

      invalid shipping method
      invalid warehouseID
      invalid payment method
      etc etc

  • Only one HDMI 2.1 port

    • Yeah sucks that I can't run all 3 of my 4k 120Hz monitors off a single card. Will have to buy 3 and NVLink them. /s

  • Thanks OP, you're the best

  • Damn I was able to checkout from this link. You da man OP!!

  • Is it just me or do they all look terrible? The aesthetics all look weird.

  • Do NOT forget 1.1% cashback

  • Thanks EVGA + PLE for making my decision easy on which AIB and retailer to go with.

    • Good for you, 'cause I just checked: the other sites have different variants for $1300+, which is ridiculous.

  • do you need a special PSU? I heard it uses new pin connectors.

    • No, the XC3 uses 2x 8-pin and the FTW3 uses 3x 8-pin. All other AIBs use standard 8-pin plugs. Only the Founders Edition uses the 12-pin so far - and it has a dongle in the box anyway that adapts 2x 8-pin to the new 12-pin.

      • What is the difference between the XC3 and FTW3? Do other AIBs only use a single 8-pin?

        • No single; most cards will have 2x 8-pin, and flagship models come with 3x 8-pin. If your PSU doesn't have 2x 8-pin, you probably need a new PSU anyway.

          • @lawyerz: It's not for my current system. I just need to know if my next PSU will have everything that's needed. I will get a Corsair RM.

            • @lostn: I'd get a 750W/850W. I personally bought a new PSU - a Corsair RM850x. If I'm not mistaken, most PSUs above 750W come with 4x 8-pin power anyway, but you should get a higher-tier PSU for these power-hungry cards. If you can swing it, I believe the Corsair AX series is a top-quality range.

        • FTW3 has higher clocks and a phat 2.75 slot cooler

    • No, they come with converters. The bigger issue is that you probably need a 750W PSU to use them stably.

      • Benchmarking has shown that with an overclocked i9-10900k, the total system draw with a RTX 3080 would be no more than 520 watts under extreme loads. A gold rated 650W PSU will be more than adequate.

        • That's what I've been fishing around for: total system power usage… I only have an ITX case so my PSU is 650W, and I got that just in case power requirements went up this gen lol

        • Except these have been shown to push 400W in bursts, and the 10th gen CPU has a PL2 of 250W on its own, so there's an obvious scenario in which the system would crash…

          750W is a good safety margin, as well as a little efficiency under typical loads.

          • @jasswolf: Where has this been shown? Provide links please

            650W is plenty.

            • @Radical Larry: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-f...

              My bad, it's 370W peak at stock. PL2 of the 10900k is known to be 250W at stock. Your numbers do not add up for peak usage; if the combined draw exceeds the supply for even ~10ms, the machine crashes - and that will happen.

              The current peak OC estimates surge well past 400 W, but FE BIOS power limits may kick in before that:

              https://www.igorslab.de/en/nvidia-geforce-rtx-3080-founders-...

              750 W, minimum, which is NVIDIA's own recommendation. The 650W advisory you're giving is for the 3070.

              • @jasswolf: I would concede that with an overclocked i9-10900k and an overclocked 3080, running torture tests, you'd ideally want at least a 750W PSU with a high efficiency rating.

                For anyone running an i9-10900k at stock with a 3080 at stock, with the primary use being gaming and productivity, a high-efficiency 650W PSU is more than enough. NVIDIA recommends 750W to cater for the worst-case scenario and to ensure there is a massive buffer for any real-world use case. 750W is conservative; 650W is realistic and totally fine.

                • @Radical Larry: I can't be any clearer than 250 + 370 being a problem for 650 when you've got board and component requirements to meet. That's literally peak draw for each at stock, not an OC.

                  You're skimming the edge of a crash due to power loss. Your advice sucks, so consider revising it.
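The arithmetic being argued here can be laid out directly. The GPU and CPU peaks are the figures quoted in this thread (370W stock RTX 3080 peak per TechPowerUp, 250W PL2 for a stock i9-10900k); the allowance for the rest of the system is an assumed estimate, not a measured number:

```python
# Peak power budget check using the numbers cited in this thread.

GPU_PEAK_W = 370   # stock RTX 3080 peak (TechPowerUp figure quoted above)
CPU_PL2_W = 250    # stock i9-10900k short-burst power limit
OTHER_W = 75       # board, RAM, drives, fans - assumed estimate

def headroom(psu_watts):
    """Margin left over if the GPU and CPU peaks happen to coincide."""
    return psu_watts - (GPU_PEAK_W + CPU_PL2_W + OTHER_W)

print(headroom(650))   # negative: overdrawn if the peaks line up
print(headroom(750))   # positive, if slim, margin
```

Whether those peaks actually coincide in real workloads is exactly what the two sides of this argument disagree on.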

                  • @jasswolf: Once my 3080 arrives I'll do a 48-hour stress test timelapse for you using a gold-rated 650W power supply and my i9-10900k - can't really offer you much more than that.

                  • @jasswolf: Looks like I don't need to make you that video - Someone else has already done it:

                    https://www.youtube.com/watch?v=Bdohv96uGLw&t=0s

                    • @Radical Larry: The 600W models run over-spec, and the 650W gets pushed at 1440p in BFV with unspecified settings… I'm failing to see how this is a good recommendation at all.

                      What do you think happens when RTX features are enabled… what about RTX IO and DirectStorage?

                      Or as multithreaded utilisation climbs, or as people inevitably do more than just run a game in an ideal scenario?

                      The video just highlights how silly this is, especially for an SFF machine.

        • Yep - Jay says the same thing https://www.youtube.com/watch?v=yq7ef7sKryg

          Too bad I have a Ryzen, so I'll be closer to 600W under multithreaded applications but around 550W under gaming.

          • @Stallion: Hadn't seen this, but he mentions 500w at around the 13:25 mark of the video. That's even less than I'd read in other benchmarks. Thanks for pointing it out.

        • Also, people forget that power supply capacity drops as the unit ages - and drops even faster if it's pushed hard all its life. There's no logic in taking that risk when moving up to a 750W is usually only an extra $10-20.
          Sure, it may run fine for a year, but that's not the argument here. Is it the responsible thing to do? Hell no, and I would never give such advice to my friends.

    • No. The Founders Edition needed a smaller power connector due to the tiny PCB (printed circuit board) used to allow for the board's rear blow-through fan.

      https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-rtx-...

      The top two pictures are reference board designs for partners to use/modify. They have two 8 pin connectors in the top right corner.

      The third picture is a 3080 FE, with the single 12 pin connector top right (slightly left from the right edge of the board).

  • They all look much the same to me, is only the boost clock different?

    • XC3 uses 2x 8-pin, FTW3 uses 3x 8-pin.

      To me, anyway, that's enough to go with the FTW3 over the XC3. Gamers Nexus testing showed the Founders Edition's power consumption hit the hard 375W limit of 2x 8-pin when overclocked ever so slightly.

      The rest is the factory overclock, which I feel I can just do myself, and 3x 8-pin should… in theory… give me more headroom to overclock.
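That 375W hard limit follows from the PCIe power-delivery spec: each 8-pin auxiliary connector is rated for 150W and the slot itself supplies up to 75W. A quick sketch, using the connector counts quoted in this thread:

```python
# In-spec board power ceiling: 150 W per 8-pin plug (PCIe spec) plus
# 75 W from the slot. Connector counts per card are as quoted by
# commenters in this thread.

PIN8_W = 150
SLOT_W = 75
CONNECTORS = {"XC3": 2, "FTW3": 3}

def max_board_power(card):
    """In-spec power ceiling for a card with N 8-pin connectors."""
    return CONNECTORS[card] * PIN8_W + SLOT_W

print(max_board_power("XC3"))   # the 375 W hard limit mentioned above
print(max_board_power("FTW3"))  # higher ceiling, hence more OC headroom
```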

    • Nope. Partners are binning cards based on performance.

      Partners will test basic GPU parameters, put the GPUs on cards, then benchmark them. Once they have benchmarked enough to find parameters that correlate with higher benchmarks, they will bin the better-performing GPUs for the more expensive products.

      I would expect the cheapest cards to be on average ~5% slower than the FE cards that were benchmarked (some will be just as fast, some slower). The slightly more expensive cards should perform similarly to the FEs that reviewers have benchmarked. The most expensive cards should have better power delivery, allowing faster clocks and more performance than the FE cards, but thermals will most likely be worse (those FE coolers benchmarked excellently).

  • Managed to order the EVGA FTW3 Ultra. Confirmation email just came through. I got stuck on the payment processing page so I opened a second tab to check my account and the order was in. Confirmation email took about 10-15 mins. Hope that helps.

    • Mine came in about 11 mins as well. Side note: I got the FTW3 non-Ultra version to save $30, thinking I can just overclock it myself with MSI Afterburner, and now I'm doubting myself lol… too late now I guess.

      • There's like 50MHz between the two, I think? I went the non-Ultra. It should run cooler than the Ultra till I need to clock it some day.

  • Grabbed the 3080 at $1139. I'm a happy man.

    • One is a black version - could be just the colour of the card, but I have no idea what the difference is between Black, Gaming and standard.

      • Yep, picked up on it now. It's a bit late at night so my brain isn't too active :p Did you pick one up?