• expired

Sapphire AMD Radeon RX 7900 XT Graphics Card + Gigabyte 2500E 1TB NVMe M.2 SSD $1269 Delivered @ PCByte

660

Deal or No Deal. You guys decide. ( Reference Video Card )

Free Bundled SSD is this :
Gigabyte 2500E M.2 2280 NVMe 1TB Internal SSD 2400MB/s GP-G325E1TB-M2 ( $109 )

Related Stores

PCByte

Comments (closed)

  • The same ssd is $79 on PCByte

  • It seems that when you click the link for the SSD from the actual page, the new page lists it at $109.

    • +8

      thanks, I'm sold. just bought the card

      • -1

        Blame the lack of investment by AMD. HIP is terrible and OpenCL was gutted. Even Blender doesn't use it anymore.

    • +2

      idk i actually buy for productivity and i would not say most people are buying for machine learning. Most would just be buying for general/gaming.

    • As someone who has at times considered going green for that reason; Don't delude yourself into thinking it's what most people are after. 🙄

  • +16

    Here we go with all the - “I’ll HOLD for the 8900 launch” oh and the “No deal - Bobs Pc had them for 29c cheaper 6months and 3 days ago” - SMH

    • +2

      Is that quote from the Sydney Morning Herald?

    • +3

      send bobs

      • +1

        and vagana.

  • +3

    Damn good price for a damn good card

  • +3

    In b4 all the nvidia fanboy comments

    • +4

      in after all the comments about nvidia fanboy comments

  • +2

    I will wait for 4060, my power supply is only 550w, so 4060 is the only option for me

    • +1

      Depends on what you pair it up with + other accessories you would use from pc ports. For a 65W cpu and some ssds you'd be okay with a 550W :)

      • 7900xt Requirements
        Typical Board Power (Desktop) 315 W
        Minimum PSU Recommendation 750 W

        https://www.amd.com/en/products/graphics/amd-radeon-rx-7900x…

      • Yeah I'd tend to agree. Undervolting and adjusting the power limit a touch might be necessary, but you can usually get away with very little loss in performance — if not a slight benefit from the undervolt.

        I run a card pulling over 300w on a quality 600w PSU without issues.

    • +3

      I'm running a 7900XT paired with a Ryzen 5 7600. I've slightly undervolted it and don't really see above 290w power draw at 100% load on MS Flight Simulator. I wouldn't be surprised if the total system power draw in that instance was less than 400w. PSU is 750w Gold
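The PSU back-and-forth above can be sketched as rough arithmetic. This is a toy estimate, not a sizing tool: the wattage figures are the ones quoted in the comments, and the 1.2 transient padding factor and 50 W allowance for drives/fans are assumptions, not measured values.

```python
def psu_headroom(psu_watts, gpu_watts, cpu_watts, other_watts=50,
                 transient_factor=1.2):
    """Estimated spare PSU capacity (watts) under a worst-case load.

    transient_factor pads the GPU figure for short power spikes;
    1.2 is an assumed value, not a measurement.
    """
    load = gpu_watts * transient_factor + cpu_watts + other_watts
    return psu_watts - load

# 7900 XT at its 315 W typical board power + 65 W CPU on a 550 W unit:
print(psu_headroom(550, 315, 65))  # tight margin, hence AMD's 750 W advice

# Same card undervolted to ~290 W on a 750 W gold unit, as described above:
print(psu_headroom(750, 290, 65))  # comfortable headroom
```

Which is roughly why the thread lands where it does: a 550 W unit is cutting it fine at stock, while the undervolted 750 W setup has plenty to spare.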

  • +2

    Waiting to see if the 4070Ti from TECS is honored, if not will probably go Team Red

      • +15

        Time for ur meds mate

      • +2

        But they are the ones keeping Nvidia in check. If AMD did not exist, the 4090 would perform like a 4080 and would have cost $2500.

  • +2

    Good price for the times, and you're giving your money to the lesser of two evils.

      • You have to pick one, for me the price to performance is the biggest factor, you might know more than me and see it the other way, that's fine.

      • +3

        Jensen's getting desperate hiring trolls now, could have put that money into lowering card prices lol.

      • +1

        Nvidia basically held the entire video game industry to ransom for 2 years, and is still trying to manipulate prices. Gamers who can’t buy a graphics card won’t be buying games. That’s not good for developers either.

        • Absolutely, they are entitled to make money but NVIDIA have pushed the line, severely, could label them international terrorists haha

    • +1

      Not to mention the extra RAM gives it more longevity.

  • What is the equivalent in RTX for this GPU?

    • +1

      Happy to be corrected, but it seems to be in the ballpark of 4070 Ti performance. Give or take, depending on the game and whether it has been optimised for AMD or Nvidia.

      https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4070-Ti-vs-…

      • +4

        Best to avoid linking userbench - they're not a reliable source.

        The cards are still comparable as you say, but as noted below the amd has an advantage in rasterization, while the 4070ti will do better with rt enabled.

        • DLSS looks better than FSR, that gives a point to nvidia, but AMD has more VRAM, 12GB is already getting outdated.

        • +1

          UserBenchmark is a good start to gauge a card's performance. Obviously it's flawed, but for a 15-second opinion, it's a fantastic resource.

          • @lew380: I see where you're coming from, but I feel like taking an extra 30 seconds to get a more reliable and specific number is worthwhile. Userbenchmark gives +9% (of their magic "effective speed" metric) in favour of the 4070ti, whereas the meta-review linked below has +11% raster at 4k in favour of the 7900xt — the most important number for these cards imo.

            That's a 20% difference in the overall performance impression you'd take away from a glance at either one, which would be enough to change my opinion on which to pick. And as I'm sure you'd know, there are plenty of far more egregious examples of userbench weirdness.

            • +1

              @snep: But using the same link when looking at ray tracing differences, the 4070ti has a 20% lead. The 4070ti leads by 20% in ray tracing, the 7900xt leads by 9% in raster performance, so the net 9% in favour of the 4070ti makes sense. UserBenchmark is more accurate than it appears; you can't cherry-pick your preferred metric and label it as wrong, it's just very generalised. You have to do your research and see which fits the games you play, and most people would favour the rasterisation of the 7900xt.

              • @lew380: A significant majority of people don't use ray tracing though so RT vs rasterization isn't an apples to apples comparison. The equation as to which of these 2 cards is better is actually pretty complex. Of note to me cuz I want to build a SFF - only the 4070ti has zero fan mode apparently, and it appears to be cooler than the 7900 XT as well.

                • @hobbsey1: Exactly, the generalisation is fine but it came down to minor detail on your build. Nothing wrong with a site which gives generalised performance metrics.
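The "20% difference" mentioned a few comments up is just the two quoted leads compounded, since the two sources favour opposite cards. A quick sketch (the percentages are the ones quoted in the thread, not fresh benchmark numbers):

```python
# Headline leads quoted in the thread above:
userbench_4070ti_lead = 0.09    # UserBenchmark "effective speed", favouring the 4070 Ti
meta_7900xt_raster_lead = 0.11  # meta-review 4K raster, favouring the 7900 XT

# Because the sources point in opposite directions, the overall impression
# you'd take away swings by roughly the two leads compounded:
swing = (1 + userbench_4070ti_lead) * (1 + meta_7900xt_raster_lead) - 1
print(f"{swing:.0%}")  # roughly the "20% difference" described above
```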

    • +3

      4070 Ti on raster performance. not comparable on ray tracing performance tho

    • +1

      Aggregate testing/reporting like this puts it a bit below halfway between a 4070ti and 4080 for both 1080p and 1440p in standard raster performance. As mentioned, things like ray tracing or other more vendor-specific features may be quite different.

    • +2

      lack of investment by AMD into the developers ecosystems

      Interested to know more on this. Got any sources?

      • -5

        It's out there on reddit if you want more commentary, but they don't even have their own proprietary routines that are needed in machine learning.

        At least Nvidia invested into creating their own proprietary software which is why almost all AI researchers are using the green machine.

        • +2

          but they don't even have their own proprietory routines that are needed in machine learning.

          Very specific use case imo. Most of the folks in here would buy a good gpu just for gaming, so it won't be applicable for them. Given most games are made on Unreal Engine these days, you'd see better performance on team red's cards over Nvidia's.

          I asked for sources as this time around I'd be interested in doing more than just gaming on the pc (cad, ml, hpc and so on at enthusiast scale, not production scale)

          • -8

            @kaleidoscope: I'm fairly certain AI will become mainstream in schools and even part of the curriculum in the near future.

            If not, then I have no hope for this society improving itself.

            • +1

              @Wintry Golem: I'm no expert, but it's more than likely the consumer apps for AI-accelerated tasks would end up running mostly on cpu cores rather than a discrete gpu. I think the am5 platform is capable of delivering this in future iterations, similar to what apple's m1-m2 chips are doing.

            • @Wintry Golem: buddy schools still teach a lot of stuff by hand that can be done with a calculator or a quick google search

        • +6

          they don't even have their own proprietory [sic] routines that are needed in machine learning

          Yeah, they've supported open source frameworks for GPU compute instead, like OpenCL and HIP, and even provide compatibility layers for CUDA since Nvidia is evidently disinclined to provide support except for their own products.

          I'd say that's far more evident of an intent to "invest in the developer ecosystem" than focusing on a closed-source proprietary solution and making it unavailable to competitors…

        • +2

          Nvidia has spent a lot of money over several years encouraging developers to use proprietary tools and to abandon open standards.

          Now that Nvidia's plan is bearing fruit, and open standards are neglected, your idea is that this is AMD's fault, and we should support Nvidia more?

          Your logic is upside down.

    • +4

      What is the 'doesn't do what most people want it to do'?

      Display video - check ✔️
      Play some games - check ✔️

      I am not a team red fanboy, but there are positives to AMD. Like the extra VRAM.

        • Absolutely agree!
          I have an RX6900XT and it is extremely painful to run things like stable diffusion.
          If you want to do anything with AI/ML then do yourself a favour and grab an NVIDIA card instead.

        • +4

          From the point of view of mucking about with machine learning, it's better to get AMD and then pay for cloud ML services. Unless you are doing AI 24/7 you end up ahead, and if you are doing AI 24/7 you know you need, and can afford, a more serious setup.

          Nvidia expected they could get away with crypto prices past the end of crypto - and it doesn't add up.

        • You are right. If you are wanting to do AI then go with Nvidia. Different horses for different courses. There are plenty of people interested in the gaming aspect and absolutely not the AI.

          AMD has never marketed their cards for AI purpose.

        • +2

          I'm amazed that you think everyone will be running their own version of ChatGPT on their graphics card, like the only missing piece of the puzzle is buying an Nvidia card and bob's your uncle.

        • +3

          No one cares mate 99% are buying for gaming/content creation… your 1% is irrelevant for most. Back in your box

    • +1

      How many times do you have to post the same thing? Paid for comment?

    • +1

      Mate you joined on 1 April. Are you non-LHR? How many nvidia shares did you buy?

  • +22

    a 20GB graphics card will last a lot longer than a 12GB 4070Ti

    Just like the 8GB cards are sucking in today's games

    • +1

      Exactly this!

    • +1

      insert last of us remake fps on a 4090
      unfortunately for something like blender, which is pretty vram intensive, this performs worse than a 3070

      • "unfortunately for something like blender"

        Thats why AMD has their CDNA AMD Instinct™ MI250 Accelerator Cards with 128 GB on a 8192-bit Memory Interface with Peak Memory Bandwidth: Up to 3276.8 GB/s with a power draw of 500W | 560W Peak and 362.1 TFLOPs of Performance

        Their gaming cards only do gaming but because Nvidia gaming cards still use Cuda they accelerate the programs because nvidia helped the developers implement it.

        • +2

          yeah well i dont have any good quality organs to sell for a productivity card

          • @Pugkin: They are only $8,000USD :)

          • +2

            @Pugkin: If you need it shouldn't your employer pay for it?

            • +1

              @killingtime: Hobby work and self employment, I upgrade parts with productivity being my priority

    • 8GB should still be fine for most games at 1080p

  • +5

    needs another X

    • And an 800W PSU to go with that

  • +1

    Thanks OP, bought 4. Going to crossfire.

    • +1

      You wish OP could believe you.

  • Do these cards have coil whine?

    • mine does not - some others might

    • +3

      Haven't got this card specifically, but in my experience with my friends' and my own high-end cards over recent years, coil whine is increasingly common now that these cards are pumping some pretty crazy power

  • how are amd cards these days for running emulators?

    • Can run on internal gfx eg 5600g APU

    • I'm playing the RPCS3 PS3 emulator at maxed-out FPS ( 60+ ) with a 6600XT and 5600 CPU :) .. it runs all the others fine too… it supports everything so the emulators work well with it… also AMD/ATI have been around for a long time and older emulators work well… That being said.. they work better on Windows 10 vs 11

  • -2

    hello AMD sir please redeem for 1000 dollars thank you doing the needful good bless sir pray

  • If anyone is interested in mostly or only gaming on these cards, here is a video showing how amd cards with superior vram age like fine wine

  • -2

    SSDs being given away in cereal boxes (metaphorically) and prices coming down is generally good news; hopefully mechanical disks go the way of the dodo soon.

    • +2

      Magnetic disks are still good for applications like home NAS boxes, especially with RAID array setups, which is how they are normally set up anyway

      • -2

        It's A Digital Disease

        r/DataHoarder

  • A fairly specific scenario, but don't bother with amd graphics cards if you plan on using a Quest VR headset with your pc. They cannot encode the video for the headset fast enough to provide a satisfactory experience.

  • I wonder if you can redeem the free TLOU deal
