• expired

Sapphire AMD Radeon VII for $1034.10 + $26 Shipping or Free C&C @ Umart-Online eBay

120 votes
Coupon: PULL10

Cheapest price on the Radeon VII so far; good for people who need the 16GB of VRAM or want to go AMD. Might also include the 3 free games listed here, if anyone wants to confirm.
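(Presumably the $1,149 list price with the PULL10 code applied: $1,149 × 0.9 = $1,034.10.)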

Original PULL10 10% off Sitewide on eBay Deal Post

Related Stores

eBay Australia
Umart

Comments (closed)

  • Bloody hell! I'm having enough hassle deciding what to buy without these dropping to the $1000 mark. So loud and hot, though.

    The big plus here for me is that these can be used with mixed resolution multi-monitor. eg 16:9-21:9-16:9.

    Will we now see the sub $1500 2080ti?

    • +2

      The big plus here for me is that these can be used with mixed resolution multi-monitor. eg 16:9-21:9-16:9.

      So can literally every other GPU with multiple outputs (see: every GPU in the last 5 years).

      • -7

        Not if it's an Nvidia; they require all displays to be set to the same resolution. Sometimes you can sort of get around it by running games in windowed mode, with the associated performance hit and GUI problems.

        • +4

          Where the hell are you getting that from?

          • @Tacooo: Find me one search result where someone was able to get it working. AMD only added the ability early this decade, and I have been hoping for Nvidia to do the same since. It is one of the major issues keeping me from jumping on a 2080, and wishing it wasn't so does not affect the reality.

            AMD lets you run with different vertical resolutions as well. You can choose to render the off-screen areas for the smaller displays, or match the smallest vertical resolution and add black bars to the taller monitor.

            The Nvidia workaround with windowed mode/scaling is also a pain to configure, as you lose output on displays while configuring. Dealing with GUI elements is also discouraging.

          • +1

            @Tacooo: To clarify, mixed resolutions on the Windows desktop are possible. The issue is gaming where all displays are treated as one big display, i.e. Nvidia Surround vs AMD Eyefinity.
            This thread goes into some detail about the issues.
            https://forums.evga.com/Can-you-mix-and-match-monitors-for-s…

            YouTube has people using the windowed mode workaround (which still involves scaling and not being able to run native resolution).

          • -1

            @Tacooo: How did you go? Did you find someone able to run mixed resolution monitors in nvidia Surround?

        • +1

          Wtf!!?! 1x4K + 2x1080p here off a 1080. Ur on drugs.

          • +1

            @T1OOO: Good thanks, I can hear colours.

          • @T1OOO: This issue has nothing to do with running the Windows desktop. It only relates to Nvidia Surround, where all displays are treated as a single display surface for gaming.

            I think you are saying that you run the 4K at 1920x1080 in Nvidia Surround mode, is that correct? If so, yep, 3 monitors all at 1920x1080 is fine. The catch is that Nvidia Surround will not allow mixes of landscape and portrait, mixed resolutions, refresh rates, or sync polarity.

            Another oddity is that any vertical resolution over 1600 disables bezel compensation.

              • @Major Mess: No, I never bothered looking, because your original statement was wrong/missing facts. You weren't just talking in general; you specifically meant in Surround mode, which you failed to mention until the third comment. Those are two very different beasts.

                • @Tacooo: yeah man, I buy multiple displays instead of 1 big 4K or 5K one… right, real issue there!? NOT!

  • Highly doubt you can get the 3 games from Umart, because you cannot even find any promotion info on their own website.

  • Wouldn't the RTX 2080 be a better buy?

    • -1

      for gaming probably, but some ppl need that 16GB of HBM memory for other purposes.

      Currently it is just a tad weaker than an RTX 2080, but AMD have been known to improve their drivers until they surpass equivalent Nvidia cards.

      e.g. the RX 580 was < GTX 1060 when released, but now it beats the 1060 in most benchmarks after 1.5 years of driver updates.

      Same with the Vega 56/64 < GTX 1070/1070 Ti, but they're now on par or better.

      • Their architectures are surprisingly similar these days, so I'm not sure there's really an edge to squeeze out on that front. In fact, mesh shader pipelines will probably put the advantage NVIDIA's way until at least Navi.

        • Point at AMD's RTX cores and DLSS abilities, please?

          • -1

            @[Deactivated]: Aside from you conflating RT with RTX (which covers DLSS, RT and mesh shading), the answer is that it's obviously not accelerated by ASIC-like circuitry as seen in the RTX series. That would be the obvious point of contrast.

            DirectX has a machine learning API, and DXR and VKRay are hardware agnostic. Additionally, none of this precludes machine learning being used to optimise game performance and game assets, something being used by many developers already as part of their workflow.

            It's ironic that you're bolting in here to be a Team Green fanboi, when my post is meant to highlight that NVIDIA have basically nullified AMD's typical advantages. Happy hunting, I guess…

            • @jasswolf: I'm not being a fanboi, I'm brand agnostic. But the statement "Their architectures are surprisingly similar" is patently false.

              • @[Deactivated]: From a gaming perspective, they are remarkably similar in terms of their other advantages. That was my original point, and I'll leave it at that.

          • @[Deactivated]: Bit apprehensive about RTX now, considering Crytek showed a demo of raytracing on a Vega 56 at GDC which is hardware agnostic.
            Didn't Hardware Unboxed demonstrate on their YouTube channel that DLSS was a blurry mess and that conventional anti-aliasing was a better option?
            Also, Google using AMD for Stadia may determine how raytracing and AA are adopted in future.

            • @shellshocked: Voxel-based raytracing is just another interim hybridised rendering technique, and a much weaker one. You'll notice ghosting and LoD issues with dynamic scenery/geometry in that demo due to the nature of the technique. The hardware push being made by NVIDIA is the right long-term call, but the voxel solution is welcome in the same way FXAA can be useful even though MSAA/SSAA would be ideal.

              You'll want to get up to speed on the development and implementation of DLSS today. Hardware Unboxed did what YouTube channels do: created drama for clicks.

              Make no mistake that machine learning will be a component of any cloud-computing approach to gaming, whether that be in compression techniques for video encoding, freeing up compute resources for shared task work amongst multiple user instances, or even being utilised to reduce aliasing if there are enough spare resources. Remember that Stadia is a uniform hardware experience, so development will be fine-tuned, similar to a console.

      • the RX 580 was < GTX 1060 when released, but now it beats the 1060 in most benchmarks after 1.5 years of driver updates.

        No it wasn't.

        One quick look at the reference design specs shows the RX 580 soundly beats the GTX 1060 from an architectural point of view, before you even consider driver optimisation and OC'ing potential:

        GTX 1060
        Core Config: 1152:72:48 (3GB VRAM) / 1280:80:48 (6GB VRAM)
        Pixel Fillrate: 72.3 GP/s
        Texture Fillrate: 108.4 GT/s (3GB VRAM) / 120.5 GT/s (6GB VRAM)
        Memory Bus Width: 192-bit
        Memory Bandwidth: 192 GB/s (3GB VRAM) / 216 GB/s (6GB VRAM)
        GFLOPS (3GB VRAM): 3470 (Single) / 108 (Double)
        GFLOPS (6GB VRAM): 3855 (Single) / 120 (Double)

        RX 580 4GB & 8GB VRAM:
        Core Config: 2304:144:32
        Pixel Fillrate: 40.2 GP/s
        Texture Fillrate: 181.0 GT/s
        Memory Bus Width: 256-bit
        Memory Bandwidth: 256 GB/s
        GFLOPS: 5792 (Single) / 362.0 (Double)

        *Note: The fillrates for the RX 580 are at base clock speeds; they're even higher at boost clock speeds.
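
        If you want to sanity-check those figures, here's a minimal Python sketch of the standard formulas they come from: pixel fillrate = ROPs × clock, texture fillrate = TMUs × clock, FP32 GFLOPS = 2 × shaders × clock (an FMA counts as 2 FLOPs), and memory bandwidth = bus width ÷ 8 × effective data rate. Clocks are the published reference base clocks (1506MHz for the 1060, 1257MHz for the 580); note the 216 GB/s figure above corresponds to the later 9Gbps 6GB variant, as the 8Gbps launch model works out to 192 GB/s.

            # Rough sanity check of the quoted reference specs (base clocks).
            def gpu_throughput(shaders, tmus, rops, clock_mhz, bus_bits, mem_gtps):
                clock_ghz = clock_mhz / 1000
                pixel_fill = rops * clock_ghz          # GPixels/s
                texture_fill = tmus * clock_ghz        # GTexels/s
                fp32_gflops = 2 * shaders * clock_ghz  # 2 FLOPs (FMA) per shader per clock
                bandwidth = bus_bits / 8 * mem_gtps    # GB/s
                return pixel_fill, texture_fill, fp32_gflops, bandwidth

            # GTX 1060 6GB: 1280:80:48 @ 1506MHz base, 192-bit GDDR5 @ 8GT/s effective
            print(gpu_throughput(1280, 80, 48, 1506, 192, 8))   # ~72.3 GP/s, ~120.5 GT/s, ~3855 GFLOPS, 192 GB/s (216 at 9GT/s)
            # RX 580: 2304:144:32 @ 1257MHz base, 256-bit GDDR5 @ 8GT/s effective
            print(gpu_throughput(2304, 144, 32, 1257, 256, 8))  # ~40.2 GP/s, ~181.0 GT/s, ~5792 GFLOPS, 256 GB/s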

        The only advantages any model of the GTX 1060 had over an RX 580 were a slightly higher pixel fillrate (which RX 580 owners could easily overtake with a mild OC) and a lower TDP.

        The GTX 1060 was the RX 470/570's counterpart, strictly speaking. It only competed with the RX 480/580 due to a few games which had either very poor optimisation for AMD cards or APIs favouring NVidia cards, or both.

        In any case, it wasn't just driver updates that allowed the RX 580 series to take the lead; it was also the ease with which they could be overclocked. Vendors soon took advantage of that, releasing models factory overclocked by as much as 150MHz over the reference design's base clock speeds, still with headroom for additional OC'ing, which put the RX 580 on par with the GTX 1070 in quite a few games at 1920x1080.

        The highest OC ever seen on any aftermarket vendor design for the GTX 1060 was an 89MHz increase over the base clock speed, and pushing the GTX 1060 beyond that was incredibly difficult.

        In any case, once the RX 580 began to be heavily discounted around Q2 2018, the GTX 1060 made little sense as a mid-range card, and even less sense in the long run given (as you quite rightly mentioned) AMD's far more legacy-friendly driver updates, as opposed to NVidia's "scorched-earth" driver update policy.

        • -1

          One quick look at the reference design specs shows the RX 580 soundly beats the GTX 1060 from an architectural point of view, before you even consider driver optimisation and OC'ing potential

          You're confusing architecture and compute power. Pascal was the superior gaming architecture for the time (DX11).

          Don't hide behind factory OC numbers: Pascal was also a much better overclocker, as is Turing, and they do it all on 120W stock TDP vs 185W from the RX580.

          The RX580 had its place, but the only AMD card still in the hunt right now is the RX570.

      • You've also generally got headroom to undervolt on these cards. Don't expect it to be guaranteed (they wouldn't be clocked as they are if every chip could), but it seems to be pretty common to be able to UV enough to knock down the fan noise/heat.

        The only objectively worse thing with the AMD card is power draw (barring needing a card from a particular vendor due to application biases).

    • Also, even though they introduced the ability to use G-Sync on FreeSync monitors, I've found that my 1070 flickers with my monitor with it on, so it's not perfect. So I'd say it's a good choice if you have a decent FreeSync monitor and you don't wanna roll the dice on whether G-Sync will work fully on it.

      • -3

        You also get flickering with an AMD GPU on a lot of FreeSync monitors. It's the nature of "free".

        I have an MSI MAG27CQ, and when I contacted MSI support regarding brightness flickering when FreeSync is on, they said tough luck, AMD issue.

        Try a different version of the driver; that may help. It did in my case.

        • +2

          It's the nature of "free".

          That's gotta be the worst reasoning I've ever seen.

        • +1

          What kind of flickering? I've got this monitor paired with an RX580, which has been flawless.

          • @[Deactivated]: https://www.google.com/amp/s/amp.reddit.com/r/Monitors/comme…

            That kind. And wow, getting negged for stating the truth. MSI support actually closed my ticket and sent me a link to an AMD forum thread addressing the issue.

            https://community.amd.com/thread/215821

            • @Bigboomboom: You need to be careful saying anything that can be construed as disparaging AMD on the internet.

              Problem was AMD set the bar too low for manufacturers to be able to use the FreeSync label. FreeSync 2 is much more stringent, and a much better standard to follow.

            • @Bigboomboom: The response from MSI just seems a tad dismissive. I wonder if the graphics card manufacturer or even the DisplayPort cabling comes into it.

              Can you still reproduce it with current drivers? What does it look like? Using HDMI or DP? Does it help using the other? Which games do you have issues with?

              I notice mine when it crosses the FreeSync range boundary in games like Assassin's Creed, but it doesn't flicker or such.

              As I said before, mine has been flawless. I'm running at 120Hz as it divides evenly for 30fps video (120/30 = 4, whereas 144/30 = 4.8), though it runs without issue up to 144.

              • @[Deactivated]: I tried both the DP cable that came with the monitor and a Club 3D DP 1.4 cable which is certified. I rolled back to a slightly earlier driver after trying a CRU FreeSync range (which didn't fix it on the latest driver), and the earlier driver works fine.

                The issue is like the ones shown here:
                https://www.reddit.com/r/Amd/comments/auicnk/have_freesync_b…
                https://www.youtube.com/watch?v=fhkn2t1CXc0
                Mine was at the top and bottom of the monitor only.

                MSI's full response was:

                "Dear XXX,

                It is known issue by AMD so i am suggest setting FreeSync to OFF.
                https://community.amd.com/thread/215821

                Thanks for your support to MSI."

                And it isn't new; it's been reported on many monitor brands. Some monitors actually have a firmware update to fix the issue; lots are just waiting on AMD.

                • @Bigboomboom: Could it be overdrive doing that for you near the cusp of the low FreeSync cutoff? What is your "Response time" setting? Does it happen when FPS is > 50? Does it happen when response time is set to normal?

                  • @[Deactivated]: Not overdrive; Response Time was set to OFF, that was one of the first things I tried. It happens almost all the time, even when FPS was over 90.

                    • @Bigboomboom: Assuming the monitor does the same thing with another graphics card, that's a pretty open-and-shut case for the ACCC. Something is wrong with the monitor, something that, had you known, would have stopped you purchasing it. And it's around a major feature, FreeSync. Don't give up.

      • +1

        You might want to adjust your FreeSync range with CRU to fix this; there are plenty of guides online to help.

  • It is all getting a little suspect. New AMD CPUs are due to be announced. AMD is turning 50. We all know Navi is coming, but it is not supposed to initially have a model at the level of the Radeon VII. Considering how few manufacturers there are, why has Sapphire apparently discontinued manufacture of this card while it is selling above RRP? Could a better Navi be just around the corner? Will Nvidia have a new card earlier this year? How many years am I going to wait to make the jump to my upgrade? Someone sell me their GTX 1080 for $350.

    • +1

      They're unlikely to make a Navi card at this level because TSMC's 7nm is probably a mess for large dies. 7nm+ addresses a lot of the issues that would make yields difficult.

      Navi is still probably June-July, and there'll almost certainly be a product launch on the 1st of May, probably a Ryzen product.

    • Wasn't supply of the chips supposed to be fairly low?

      • AMD said no, and other manufacturers popped up. It could be that Sapphire decided it wasn't worth the bother and they would just wait for Navi, though they are a big AMD GPU manufacturer. Or it could all be crap and the cards are not discontinued.

      • It was, but it was always bound to catch up and the price to get better. A number of stock alerts I set last month for the Radeon VII have since become available.
        I ended up getting one from eBay as new, with the games.
        Probably should have waited for this, but where's the fun in that!

  • I would rather wait for Navi results before putting that kind of money into the VII. $600-700 is the price point it should've been at from the start.

    • Yeah, but there's probably no replacement for this until next year, and a lot of people have been waiting for a lot of years. 7nm looks like it's close to the physical end of our current technology; apparently the in-chip 7nm fins are so small and so close together that transferring heat out is beginning to become a problem. Multi-GPU may be the only "easy" option for big performance gains, at least in the short term.
