Building a PC for CAD, 4K video editing and web dev

Hi

I'm planning to build a PC and am looking for suggestions for parts, please. The PC will be used for:

  • CAD for engineering and 3D printing
  • 4K video and photo editing
  • Programming and web dev
  • Occasional gaming (including VR)
  • Possibly some online machine learning/computer vision courses

A few other things:

  • I have a preference for Intel/NVIDIA.
  • I already have 3 external monitors, one UHD and two FHD, that I want to use.
  • My budget is certainly not unlimited but I'm prepared to spend as much as I need to in order to build a good and reliable machine.

I'd appreciate any help you can give me!

Thanks :)

Comments

  • Budget?

    • Let's say $5000 but I'd prefer to spend a lot less. I haven't built a new PC in years so I guess I don't know how much a computer with these capabilities would cost now.

      • I don't think you need anything crazy for CAD; I've used the integrated graphics on my laptop to 3D model for classes. That was just exporting to STL, though; unless you're doing heavy-duty rendering, a decent GPU shouldn't be an issue. I have friends who do professional renderings, and they don't have $5000 computers.

  • Look up the recommended specs for your CAD software and start there plugging options into https://au.pcpartpicker.com/

    We can't help you without more specifics and budget, so help us to help you.

    • Added a budget to the comment above. Thanks :) I've been playing with PC Partpicker over the last few days but there's so many different options for CPU, GPU etc. that I'm a bit lost.

      • There are build guides for certain tiers of performance/price points. There's also a site called logicalincrements.com, but I prefer PCPartPicker from what I've seen. The max build with an RTX 2080 Ti and a top-tier Intel or Ryzen 9 is around $2800 - $3100.

        • Thanks! I will check that out.

          • @Duckman: OK, that was the US site lol… The top-tier build is more like $3700 - $4400, which makes sense. You would be going for max CPU + RAM and drive speed, as most CAD programs may not improve much with GPU acceleration. I would try playing around with pre-built systems and see if they run the programs you want. You could get something under $2000 if you are just starting out, maybe less. Machine learning would mean an RTX 2070 or similar, so that would bump the cost up.

    • I wouldn't bother reviewing the specs for the CAD software. Certainly don't buy into the workstation graphics or processor requirements bullshit…

      My recommendation for you, OP, is to purchase the best parts you can afford in whatever brands you like best. In order of priority for 3D CAD or modelling: GPU, RAM (minimum 32GB IMO), CPU, motherboard, PSU, then whatever else you need.

  • I know you say you prefer Intel, but you could get yourself a decent Threadripper build for that money.

    • I was hesitant to give a specific budget because I didn't want to give the impression that I am trying to build a "dream PC". I really just want to build something that will work well, whilst not spending more than I need to. I just looked up Threadripper and I think it would be overkill for me, but thank you anyway for the suggestion.

    • Most devs use MAC running Windows

      Is that really the case, or just hipster devs working in local cafes?

      AUTOCAD users use Metabox

      Metabox are laptop builders using Clevo parts. I think OP is looking to build a desktop PC.

      • All our team get laptops. We have about 12 devs in our office and they all use Macs running Windows; they are far from hipster. We haven't had any issues with Metabox; they are solid, powerful machines and portable. When we've had to purchase desktops, they have always been HPE workstations, Z series with a 5-year Care Pack.

        • Thanks but I am specifically looking to build a PC. I already have a basic $1300 gaming laptop that does most of what I need but it isn't capable of running 3 external monitors, and it's also probably going to hold me back from exploring VR and 4K video editing. I can build a PC with the specs I need for much less than a laptop would cost, so that's my reasoning for going the PC option.

        • We have 25 devs; 90% use Dell XPSes of one description or another, 10% are holding out on Surface Book/Pro. We have a couple of MacBooks in a drawer which everyone refused to use (even our sales guys who bought said MacBooks swapped to Surface Pro/Book).

          Always best not to generalise; it's highly dependent on use case, technology, etc.

          • @Abaddon: More like it depends on the shop and the head honcho of technology. If you work for someone, you just have to use whatever they provide.

            My teams all use MBPs because the company can afford them by leasing for a few years, and the head honcho is on board with Macs for dev even though he still uses Windows most of the time LOL.

            I prefer macOS; it is more reliable than Windows, and I can do most things faster on a Mac. Imagine how painful it was before PowerShell, using PuTTY and Git shells on Windows…

            On hardware and after-sales experience, Apple Australia has better service, but Dell and Microsoft are really catching up.

            And lastly, if I can be superficial, MBPs look way cooler than any other laptops, albeit I have to admit the latest Surface Laptops look nice. But heck, they are as expensive as the MBP 13.

            • @sillyguy:

              MBPs look way cooler than any other laptops

              Yeah, well, you know, that's just like, your opinion man…

              I personally like the look of the XPS line over the MBPs.

  • Are you doing CAD professionally?

    I don't think there's a big difference between top-of-the-line Nvidia gaming cards and their Quadro cards. Better performance per dollar with regular GPUs. I'd say just get an AMD processor and the standard bells and whistles. Well, a bit more: either a 1 or 2TB SSD, an HDD (shuck a decent-size one since you're photo and video editing), and 32GB RAM.

    Most of the cost will be in the GPU. Top of the line ones are $1k+.

    I'm not much of a programmer, so this is more geared towards gaming and general use.

    • No, this PC is just for personal projects. I don't need a Quadro card and will just be looking at regular NVIDIA cards, for the reason you mentioned.

      Why AMD?

      • The newer AMD processors are performing significantly better than Intel's, so they will be better in workload situations and better value; pretty similar in gaming though.

        • Seconding AMD for sure!
          While they were lacking a lot a few years ago, now they do significantly better at most multi-core tasks (so great for CAD work).
          Intel is mainly better at single-core tasks, which is why top-of-the-line Intel does better in games (only by a margin though). But as soon as you introduce any kind of multi-core process, or do anything else while playing games (like streaming), AMD takes the cake, and their price per performance is much better too.

      • I use CAD a lot and 4K videos etc.
        AMD Yes!

        I am using R9-3900X, 64GB RAM and a crappy GTX 1650 card but will upgrade soon.
        I am not looking back to Intel at this stage.

        • I have been using SolidWorks, Tinkercad, and Fusion 360.
          The R9-3900X is a powerhouse and I barely see all the cores max out.
          I am also using a Samsung 970 EVO NVMe 1TB SSD for my system drive.

          I own three 3D printers: two FDM, and one resin printer.

      • You also mentioned 4K video. Going AMD at this point brings the possibility of PCIe Gen 4, and the fastest possible NVMe storage speeds.

    • Performance-wise, the Nvidia Quadro / AMD FirePro cards are quite underwhelming; once you factor in the huge extra cost, they're just not worth it. The only problem is that Autodesk's 3D software is all optimized and certified for those so-called 'workstation' cards, so you may well find fidelity and stability issues with the non-certified drivers. It's a great shame, when the consumer cards get so much more performance per dollar than these overpriced cards. Our office has found that by just rolling back to older drivers, you can get the GeForce cards to work well enough in most Autodesk 3D Design Suite programs; AMD's cards were generally pretty glitchy though, and we now stay well away from them.

      • I use Autodesk Inventor on an RTX 2070 and have never had a problem. Most CAD forums will say the same thing. The premiums for workstation cards are not worth it.

  • CPU: you gotta decide what's important, core count vs single-core speed. That should be driven by your most-used/important software. If your software is truly multi-threaded, invest in the highest core count you can. But if your main software is still single-threaded, get a 6-8 core CPU with the highest base clock, and I'd stay with Intel.

    GPU: again, check with your software: can it get by with DirectX? If OpenGL is highly recommended/required then bad news, 5 grand ain't gonna get you far.

    Once you work out your CPU and GPU combo, the remaining budget should settle your RAM and motherboard easily.

    • Thanks. I'm confused by what you said about OpenGL, because I think the GPU in my laptop supports OpenGL, and this laptop isn't exactly high-end (definitely didn't cost anywhere near $5000). What am I misunderstanding?

      EDIT: Looking into it, I see there are updated versions of OpenGL and I guess the most recent version must only be compatible with very high-end GPUs?

      • OK, this is my understanding. Gaming and professional cards all support OpenGL and DirectX to a certain extent. How well a card processes data is determined by its driver. On gaming cards, the drivers are optimised for DirectX code and less so for OpenGL code. On professional cards, the drivers are optimised for OpenGL. So there will be situations where a Quadro has the same chip and same CUDA core count, but costs 3-4 times more than a GeForce card; it's all in the driver.

        • Realistically this isn't true anymore; the performance differences between DirectX and OpenGL are relatively minor at the consumer level and really shouldn't be a consideration in a purchase unless you have a very specific piece of professional software with hard requirements, as both APIs are extensively used by gaming companies. Given it is home hobby and gaming use, any consumer gaming card will do just fine. Basically, as you are wanting VR, you should be looking at a 5700 or 5700 XT on the AMD side, or anything above a 2060 Super on the Nvidia side.

    • Most of Autodesk's 3D software is single-core only, so your advice is spot-on: highest clock speed is important, and more cores are a waste of money.

      Also, if you are working with larger files (1GB+), I would make sure to get the fastest RAM your motherboard supports, 16-32GB of it.

  • I reckon get onto Luke from Techfast (frequent poster on the deals page).

    As for your demands: I work in the engineering industry with heavy AutoCAD loads, and we've recently all been switched to laptops running 6-core Xeon processors combined with Quadro graphics cards, and they rarely get maxed out (laptops in case of working from home/COVID).

    The only downsides are obviously the two monitor outputs (you might be able to get an HDMI splitter?) and thermal throttling when really crunching the renders.

    Honestly, anything with at least 6c/12t (I strongly recommend an AMD Ryzen 3600 or better) will be more than enough with a decent Quadro card (maybe even a brand new or second-hand Titan), and with 16GB+ of RAM you'll be flying. All up, $3k on the computer will get you awesome specs with $2k to spend on other things.

  • I just put together a machine for statistical analysis. 3960X + 2080 Super. You can do this for around $5500.

    If you're willing to spend this much money, the only platform worth considering is Threadripper at the moment.

    • Nice.
      I recently built a 3900x system.
      Wish I went Threadripper so I could have more RAM; 128GB is not enough LOL

    • I don't entirely recommend Threadripper and EPYC; they're for very niche cases. The 3950X outperforms many Threadrippers and is cheaper; it basically comes down to binning, better IPC, and frequency. Though I don't know about availability or shortages.

      TR and EPYC's biggest advantages are the extra PCIe lanes, followed by having more and more cores. Do you want 32 cores? Or are you building a small server?

  • The reason most people suggest AMD is that AMD has better multicore performance at a cheaper price, which means AMD is better when running multicore CPU-intensive software.
    The only advantage of Intel chips is higher frequency compared to AMD; if you still prefer Intel, the 9900K is a decent chip, overclocking to 5.x GHz.
    Only a high-end graphics card can do VR comfortably; with your budget you should be able to do a 2080 Super or 2080 Ti, which will be better than a Quadro (Gamers Nexus did a benchmark a while back).
    Don't overstretch your budget and don't try to future-proof your system. An i7-7700K that cost $600 AUD a couple of years ago gets beaten by the newly launched Ryzen 3 3300X, a $200 chip. I always tell my customers: buy what you need now and upgrade your system as time goes by.

    • I think this is really good advice for a desktop.

      I'd only add to buy the best keyboard, case and monitor combo you can afford, with the intention of keeping them through upgrade cycles.

      • Add to this the power supply.

        Hardware-wise, my PC has been through a series of Intel (two or three Socket 775 CPUs, two Socket 1150 CPUs) and AMD (AM4) platforms. It's been through countless GPUs. It's had DDR2, DDR3 and DDR4 memory. It's had so many hard drives: SATA SSD, M.2 SATA SSD, M.2 NVMe SSD, even 10K RPM hard drives.

        But it is only on its second case, and I think it has the original power supply, though I suspect it's the second. It's still running the same DVD drive :P

        The other thing I've managed to keep until just recently is my CPU cooling solution.

        • I'd also add buy the best PC Case you want, and other accessories like Desk, Chair, Headsets, Mics, Webcams, etc etc.

          I would hold off getting a second/third screen if it's not in your budget. You can get the same screen as your "best" one a few years later. In the meantime, just make do with one good screen, or get some cheap secondhand ones until you have the budget to splurge.

          Getting the best motherboard might be wise (not always) if the platform is going to have Long-Term Support (eg 2017 AM4).

          The CPU should be given some thought, especially for upgrade paths (eg Ryzen 1600 to 3700 to 4950X).

          The GPU definitely should be upgraded as per budget (ie No Future-proofing).

    • A $200 new CPU that also requires a new mobo most of the time.

  • Check the thread below. You don't need a monitor, and with a ~$5000 budget you could certainly purchase more RAM, SSD, a 650W+ PSU, and a better GPU (RTX 2070S+).

    https://www.ozbargain.com.au/node/521495

    Personally, I have a preference for Intel. I tried Ryzen and AMD GPU combos but didn't like the driver/software instability (missing features in their GPU driver, etc.). I use my current config primarily for video editing, VMware, AutoCAD, and the very occasional gaming.

    https://imgur.com/a/IJHf1jO

    If you're on a tight budget, I wouldn't recommend the Intel HEDT platform, as a Ryzen build will give you better value for money.

    • My PC build is a Ryzen CPU but an Nvidia GPU. I was looking at a new GPU when the 5700 line came out, and after reading about all the driver problems, I went and got an RTX 2080 for $900.

      AMD CPU and Nvidia GPU is a good, stable combo.

    • If what you tried was 1st or 2nd gen Ryzen, I would agree with you. I had a 1700X that was definitely having driver issues for a while.
      I have since upgraded to a 3700X and it is a pretty big difference: zero issues, and the software is great. The architectural changes seem to have made a big difference for 3rd gen.

      As for the GPU: yeah, AMD is lacking on the high end. If you want a high-end card then currently Nvidia is the way to go.

      • I tested first-gen Ryzen (an 1800X) and a few RX 580s. The system had instability issues (RAM, CPU performance issues running Adobe and other products, the GPU driver constantly blue-screening, etc).

        Then I built a 2700X system and tested it with 4 different RX 5700 XTs. RAM compatibility and software performance issues were resolved, but their GPU driver was still very unstable and lacked features you find in Nvidia drivers, so I sold off the parts.

        May give them another go in a couple of years, but my 10920x build will do perfectly fine for now.

  • As someone who does both CAD (for 3D printing) and video editing, I'd be looking at an AMD CPU and an Nvidia GPU.

    I’d be basing it around something like a ryzen 7 3800X cpu, x570. Motherboard, 32GB of ddr4 3200 ram, nvidia rtx2070. And plenty of high speed storage.
    Other than running an older cpu that’s what I’ve got. (I’ve still got an i7 5820k which is due for an upgrade)

    The GPU is useful for the acceleration in video editing software like Premiere Pro, as is the 32GB of RAM. (I do max that out when running encodes, exporting animations from After Effects, etc.)

    4K video also takes loads of space, particularly when working with it uncompressed for animations.
    I’d be looking at probably a 1TB nVME M2 boot drive. Something like a Samsung 970pro, then 2TB of SSD as a scratch drive / place for current projects your working on and then some big mechanical drives to back the videos up on / transfer to once you’ve done the encoding.

    Also get a decent power supply to power it all.

    For 3D modelling for printing stuff I use Fusion 360 and find the above specs more than enough for that. The video editing pushes my machine more.

  • Here's my attempt at it.

    I went 3900X; it's $730 for 12 cores. The 3950X with 16 cores is $590 more for an "extra" 4c/8t, which doesn't seem to scale well on value, and I'm not 100% sure you absolutely need it.

    Would be great to know the exact software programs you're using; it's tough to say whether more cores or stronger cores are the way to go.

    https://au.pcpartpicker.com/list/8sYwXv EDIT: I initially forgot a PSU and added another 1TB NVME boot drive

    See, you can go Intel, and depending on frequency dependency and how much the iGPU can improve your workflow/workload it might be worth it, but it only has 8c/16t total versus 12c/24t above: a third fewer cores for an overall "speed" (IPC x frequency) boost of ~10% in single-core applications?

    https://au.pcpartpicker.com/list/J7hBdm EDIT: I initially forgot a PSU and added another 1TB NVME boot drive

    GPU: a 2070 Super will do VR just fine; a 2080 Super if you must, but just avoid the 2080 Ti. It's a silly purchase with no value in the product at all.

    Also keep in mind 10th-generation Intel is around the corner, so you could get a 10700K for much less than a 9900K and it'll be essentially the same chip. 10th gen opens up a platform that will have forward compatibility with 11th-gen CPUs, but only 11th gen will have PCIe 4.0. X570 from AMD has PCIe 4.0 already, so for increased NVMe drive speed this may be a factor.

    Also, there are constant rumours of the next-generation AMD Radeon GPUs and Nvidia's 3000 series, so if you need to buy one now, probably go with the 3900X build (again, please post your intended software products), but I'd wait if you can.

  • I really appreciate everybody's comments here. I'm slowly working my way through them (lots of info to take in), and strongly considering the AMD CPU and NVIDIA GPU route as I can see that my bias towards Intel doesn't really make sense anymore.

    • AMD has done a massive 180 in recent years, and has moved to a far superior, cost-efficient approach to building processors compared to what Intel has stuck with.

      AMD produces a small 8-core chiplet, and then scales that from a single chiplet in their cheaper processors, to two chiplets in their midrange 3900X/3950X processors, to 8 chiplets in their expensive server processors. It's an incredibly efficient approach. They also rely on other companies to build the fabs and manufacture the processors, so they get to take advantage of the latest tech (7nm).

      Meanwhile, Intel is stuck designing separate individual processors for each price point, and has to run multiple production lines for these different designs. On top of that, they sunk billions of dollars into fabs which are stuck on (now ancient) 14nm tech, and they have to keep producing each new CPU generation on that same ancient tech because of all the capital they've sunk into these self-run fabs.

      The only thing I have to add to what others have said is: figure out how many cores you require, and then look for the relevant AMD CPU. On a positive note, if for some reason you need to change, there is a huge range of core counts and generations that can easily swap into the same AM4 socket.

  • If you're going to be editing 4K footage, you'll need some fast drives, so it's basically NVMe drives (M.2 form factor). Make sure you don't get SATA M.2 drives. You could even try RAIDing them for more speed, so check that your motherboard has at least two slots and that both take NVMe drives. From memory, my board has two M.2 slots, but only one takes the fast interface.

  • Unless there is a must to get one right now, it is best to wait a bit, as Intel's next-gen desktop processors are on their way. From what we know so far, Intel intends to match AMD's core/thread counts as much as possible.

    Intel will be using a new chipset and apparently will have higher memory frequency support (even though memory frequency is generally not an issue for Intel CPUs). Even if you feel you may not need that much compute power, some sort of clearance of current-gen CPUs/motherboards should happen, and AMD will most likely respond with price cuts to give Intel a hard time. AMD's desktop Ryzen 4xxx series CPUs are also coming. Then there are the next-gen Nvidia and AMD GPUs.

    With the weak AUD, COVID-19-related logistics issues, and a lack of demand, there are no good deals at the moment to justify buying what is essentially last year's parts to build a system.

    • What would be your guess for how long I should expect to wait for prices to fall on the Ryzen 9 3900x?

      I guess the problem is that there's always something newer and better around the corner, but if I can save a bunch of money by waiting a month or two, I can certainly do that :)

      • It's mainly due to your preference for Intel/Nvidia that I suggested you wait. Intel's next-gen i9 will have more cores, but I think only in a 10C/20T configuration (maybe Intel will release a 12C/24T one as well). Whether a price cut is likely will depend on how well the next-gen Intel desktop CPUs perform. Nvidia is moving to 7nm for their next GPU.

        Ryzen 9 3900X - if that's the one you are after, then it's probably fine to get one when you are reasonably happy with the price. It's a CPU a lot of people would be interested in when discounted, especially after AMD basically stated Ryzen 3xxx is the last CPU generation the B3xx/B4xx/X3xx/X4xx chipsets will support. Though if you really have a heavy 4K video workload (you really edit a lot of 4K footage), then you should look at the Ryzen 9 3950X.

    • We already know the new i9 will be 10 cores.

      They can't match AMD in core count without a completely new architecture, like the one in their HEDT line. There will never be a 12-core; the ring bus is inefficient at high core counts.

      Higher memory frequency support means official support for DDR4-2933. Nothing to write home about; it isn't really new functionality.

      • It was because OP wrote this:

        I have a preference for Intel/NVIDIA.

        Obviously, OP has now changed his stance on the CPU side. However, for someone interested in Intel, it might be best to wait.

        Also, I certainly hope AMD won't screw early adopters again next time with slow and repeated microcode updates. It was a pain first waiting for the RDRAND bug fix (including one attempt which caused an issue, resulting in the proposed update being pulled), and then waiting for the performance improvement fixes.

  • I don't know how important storage will be for you, but if you're serious, you'll have to go with the lower tier of HEDT on either side, where you can get more than 16 PCIe lanes. Especially if you combine that with an Optane drive, there will be a noticeable difference in "snappiness" between chipset and CPU PCIe lanes.

  • I had a crack at it

    https://au.pcpartpicker.com/list/JwQ2Nq

    You could also downgrade the GPU to a 2070 super if you won't use it too much.

    The 3900X is 12 cores but higher clocks; the 3950X is 16 cores. Both have the same power requirements, and the 12-core may perform better for you if your software cannot use 16 cores well.

    If your software isn't going to use that many cores efficiently, Intel is releasing a 10-core i9 in the next few weeks; it should be about the same price as the 3900X.

  • While I get that a lot of people like Ryzen 3xxx series CPUs, there are a few things to note:

    • Make sure you update your motherboard BIOS as one of the most recent microcode updates includes a performance enhancement. Earlier ones had important bug fixes.
    • If you must go for 4 DIMMs, make sure you get top notch or at least quality RAM modules. Samsung B-die if you can afford (or Micron E-die if budget is limited). If you go for E-die and intend to overclock with 4 DIMMs, bear in mind that you won't be able to reach the same aggressive settings you can with 2 DIMMs.
    • If you go for Ryzen 3900X or better, make sure you get a decent X570 motherboard with quality VRM. MSI stuffed up quite a number of their X570 motherboards.
    • Hackintosh will be tricky (though it isn't easy with Intel either). Hybrid VM with Hyper-V is most likely still a no-go with AMD CPUs (thanks to Microsoft). The latter shouldn't be an issue; if you are really into VMs and containers, you would run Linux.

    My switch to the Ryzen 3xxx series wasn't pleasant. While the performance was great, I had to tackle a RAM issue (never thought a CPU could be that picky about RAM modules) and waited multiple times for microcode updates (BIOS updates). It is okay to buy Ryzen 3xxx now as things have been ironed out, but I cannot say I am happy with the whole experience.

    • Samsung B-die is discontinued as far as I know, and hence it is now overpriced.

      Running 4 sticks, you should look for a board with T-topology. I believe the entire Gigabyte X570 lineup is OK, as is Asus's; however, Asus is overpriced.

    • Hybrid VM with Hyper-V is most likely still a no-go with AMD CPUs (thanks to Microsoft)

      Is this due to licensing?

  • Bad time to build a PC

    https://au.pcpartpicker.com/list/WmFLBZ

    AMD 3700X (8 cores/16 threads is enough; you need GPU power for VR)
    X570 ATX motherboard (B450s need a BIOS flash and are discontinued next month)
    32GB RAM
    2080 Ti (as you want to do VR; make sure the card's outputs will support your monitors natively)
    2TB Intel 660p SSD (best for I/O; I/O is the current bottleneck, not bandwidth)
    8TB Seagate IronWolf (no SMR; for media storage, (lesser-played) game installs and backup for the SSD)
    Win 10 Pro 32-bit (the key is $70 cheaper than 'x64' because the package says 'x32'; download the 'x64' install media directly from MS)
    Fractal Design Meshify C (open, with mesh filters on all intakes, glass side to show off the PC… buy whatever you like that will fit the CPU cooler [160mm])
    3x Noctua NF-P12 redux-1700 (replace the front and back case fans, and install the third in the front bottom fan slot; don't use the included case fans unless you like noise)
    SeaSonic PRIME Ultra Gold 1000W (you will need ~800W)
    Thermal paste

    $4614.96

    Why it is a bad time:
    This is last year's hardware with a price increase.
    The 2080 Ti is still underpowered for 4K and high-refresh-rate VR gaming. It is two years old, using a mature fab process (TSMC 12nm).
    3080-series GPUs are on the horizon. We are expecting news within the next week and availability late this year. These are expected to provide a massive increase in performance over the 2080 Ti as they are moving to 7nm.
    AMD Zen 3 CPUs are expected in September. Single-core performance is expected to be at parity with Intel.
    AMD's budget motherboard chipset is expected next month. It is expected to support at least Zen 3.
    RAM is ridiculously expensive right now.
    The 'best' consumer SSD is 2 years old. Current PCIe 4 SSDs are not suited to boot drives. Depending on your 4K video workflow you might need a RAID setup.
    HDDs are currently overpriced and going through the SMR controversy.

    Recommendations:
    I would consider pushing RAM to 128GB depending on how serious you are about 4K video editing ($$$).
    I would consider storing video on a local server and not installing a local HDD ($$$$).
    If 4K video editing is a major use, I would highly recommend a professional monitor supporting the colour spaces you edit for, HDR (if editing for it) and a calibration colorimeter (if not included with the monitor) ($$$$).
    If you need WiFi, consider a motherboard with it onboard instead of an extra USB or PCIe adapter ($$).
    4K and VR gaming is the requirement eating the budget. Downgrade the GPU if 4K/VR gaming is not a high use case.

    CAD software not budgeted for.
    3D printer not budgeted for.
    Network upgrades to support 4K video editing not budgeted for.

    Consider waiting until Sep/Oct for either a 10% to 20% discount overall (buying current gen) or a 5% to 40% increase in performance depending on workload (buying next gen).

    • If gaming is important, you want to be above the Xbox Series X and PS5 specs to ensure a decent gaming experience for the next few years.

      • Hey This Guy

        So, a couple of things. Firstly, the PS5 and Xbox Series X are interesting to look at. The PS5 will have 8c/8t and be quite low clocked, so any 4c/8t desktop PC from the last few years (see 4770K, 6700K etc.) will match its CPU performance. But this build isn't a gaming-focused build; it's a workstation that games, which is why some are suggesting more cores/threads might be a good idea here. It might also not be.

        Secondly, recommending a 2080 Ti: I just can't agree with that; its price-to-performance just isn't there. It has about 30-35% more performance than a 2070 Super and it's about 100% more expensive. Like I said above, it's a silly purchase unless money means nothing to you. Even the 2080 Super, which is the highest a reasonable person would go, only has 10% more performance than a 2070 Super but is about 30% more expensive.

        A 2070 Super will run 4K (no, it won't be high FPS at max settings), and it will run VR fine. If you look at the refresh rates, we're mostly talking about 1440p at 90fps, and if you take a look at most VR games, they aren't graphically intensive, so buying a 2080 Ti just to run VR is overkill and really bad value.

        Price: 2080 Ti ~$2k, 2080 Super ~$1.3k, 2070 Super ~$1k. A 2080 Ti just is not twice the GPU of a 2070 Super; it's 35% more performance for 100% more money, which makes no sense.

        • Yeah, you might want to re-read. I said do not buy now. OP said their CAD work was non-professional (you don't game on a work machine anyway), so the only requirement that needs more than a ten-year-old quad core is 4K (UHD) and VR gaming.

          Look at the build cost, not card cost. My build was $4.6k.

          For a 22% saving (2070 Super) OP would lose more than 30% FPS. For VR, latency and stutter become a major issue, with the 2070 Super delivering 30% to 50% fewer frames on time compared to the 2080 Ti. 90fps is useless if you keep vomiting because of the lag.

          Next gen consoles are rumored to basically run a 3700X. The 3700X is just behind an Intel i9 9900k, not a 7 year old quad core.

          • @This Guy: Yeah I jumped through a couple of things which I see need clarification.

            No, a 4770K won't be an exact match for the upcoming consoles. It does, however, have 8 threads, the same as the PS5, which means needing more than 8 threads for gaming is likely to wait another generation, as games won't be optimised for more than 8 threads. Yes, the Xbox Series X has 16 threads, but this doesn't mean developers will always optimise for 16 threads; they might just use 8. That's what I was getting at.

            Also, which VR system are you using? Mine runs at 80 Hz, many run at 90 Hz, and it's perfectly fine. Not sure where the vomiting and lag come from; they just don't exist. A 2070 Super will be fine for VR.

            So, to use an entire build's price, let's use yours.
            https://au.pcpartpicker.com/list/WmFLBZ

            OK, to make this as fair as possible: somehow it's selected a $2.8k 2080 Ti, which is insanity; you'd never, ever in your right mind spend that much. Selecting a 'normal' price for a 2080 Ti instead (I picked the Asus Strix for $2.1k) brings the total price to $5739.51.

            https://au.pcpartpicker.com/list/pXcW8M

            Swapping in a 2070 Super brings it to just under $4k ($3913.99):

            https://au.pcpartpicker.com/list/KmtwXv

            5739.51 / 3913.99 ≈ 1.466

            In other words, using the total system cost, the build costs 46.6% more for 30-35% more performance. And this is at the edge of sensible for a system cost.
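
            If anyone wants to sanity-check that maths, a couple of lines of Python will do it (the prices are simply the PCPartPicker totals quoted above):

```python
# Total system cost with each GPU (AUD, from the part lists above)
cost_2080ti = 5739.51
cost_2070s = 3913.99

extra_cost_pct = (cost_2080ti / cost_2070s - 1) * 100
print(f"{extra_cost_pct:.1f}% more expensive")  # prints: 46.6% more expensive
```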

            Please, do not recommend the 2080 Ti to people; it is a bad-value product that should be avoided. Thank you.

            • @conza: Yeah, prices change. The system was $4600 when I posted. When I checked today, the system minus GPU is up ~$200.

              Like for like:

              A 2080 Ti system is $4700 right now

              A 2070 Super system is $3650 right now

              That is a 29% increase in cost for at least a 30% increase in performance. Don't be a (mod: edited). Argue against your opponent's best argument instead and it will make your arguments unlosable.


              All you have really said is:

              An old quad core is just as capable as a current gen octacore.

              Which is wrong.

              and

              2080 Tis are poor value.

              Which is true only if you don't consider that OP is spending ~$2700 to be able to run a GPU.

              In Minecraft with ray tracing and DLSS, a 2080 Ti can barely hit 57 fps at 4K.

              In Minecraft. That was a 300 MB Java game.

              A 2080 Ti is not fit for purpose in a $5k gaming PC (because OP's other listed use cases basically require a potato).

              As an upgrade a 2070 Super can make sense, as I am sure it did for you, but not in a $5000 4K + VR gaming machine.

              • @This Guy: So we both agree now is a bad time to buy a PC, and this is essentially academic, but it applies in general when looking at high-priced GPUs: when adding a GPU to a system, we're examining whether, in a reasonably priced and specced system, there is value in upgrading the GPU to a higher-end model.

                In this case we're comparing the perfectly adequate 2070 Super, which I believe is the universal go-to when buying a high-end Nvidia GPU, on the 15th of May 2020, when perhaps no one should be buying a PC. Your argument is that it is still worth upgrading to a 2080 Ti, given it has 30% more performance and this individual's budget allows for the higher expense.

                With the parameters set, we'll start with your 2080ti system.

                I've modified it to have only 2 fans and a more reasonably priced power supply. The chosen case has an exhaust fan already, so I have included 2 premium 140mm fans for $60 instead of $120 for 3 120mm fans, and I have taken $130 off for the excessive PSU; 750W is plenty.

                So it's now a $4512 system. Using the cheapest reasonable GPU of each model, the 2080 Ti is $1900 (not recommended for such a high-power GPU; the cooler is likely too small) and the 2070 Super is $950. This brings the same system with the 2070 Super to $3562.

                2080ti - https://au.pcpartpicker.com/list/prwhrV
                2070S - https://au.pcpartpicker.com/list/bLf9L2

                So the GPU price is exactly 100% more for 30% more performance, but taking the system cost into account, $4512 / $3562 is a 26.7% increase in cost for that 30% more performance. So on your numbers, it wins, right?

                I disagree though, because the end user has still spent $950 for an additional 30% performance. I find this spending excessive and unnecessary, and still poor value.

                I think your main system is a fairly configured one; it isn't too overpriced after a couple of minor changes. But even with a cheap 2080 Ti and the entire cost only increasing by 26%, that 26% is $950, and that $950 only gives an additional 30% performance. It is not something I could recommend in good conscience.

                Make your choices. I stand by the fact that the 2080 Ti, in any configuration at any price of $1900 or more, is poor value and should be avoided.

                EDIT: Yes, I also lowered the Windows 10 OS cost to $40. You can find legitimate keys on the internet for far less than $155. I didn't research current key prices, but in my experience you can regularly find keys for approximately $40.

                • @conza:

                  I have taken off $130 for the excessive PSU, 750W is plenty.

                  I spec'ed the PSU for a 3900X (225W) and an Ampere GPU (assumed 400W), as OP looked like they were taking this and that from multiple people, and if they bought a 2070 Super they will most likely want to upgrade to the next Titan (assuming release in the next few months) or a 3080 (assuming release at the end of the year). 1000W leaves 200W available for more HDDs, a PCIe 4.0 SSD (when they drop in price next year) or even USB add-on cards if OP goes crazy with VR peripherals.

                  I have included 2 premium 140mm fans

                  I used Noctua as they don't fail, they perform as claimed and they are quiet. When included, they were $17 apiece. They are rated at 25 dBA, not 37 dBA like the junk you changed them to. From experience, I don't trust that fan's rated pressure; in my experience, non-Noctua fans fail in ways that make them insanely annoying (tick tick tick tick tick tick tick). The two included case fans will be of similar quality to the Corsairs you like. I also gave recommended positions for the 120mm fans (bottom front), which are not compatible with the 140mm fan positions on that case (top front), to give more fresh air to the GPU.

                  the cooler is likely too small [for a blower GPU cooler]

                  Not an issue if you have two high pressure fans feeding it fresh air…

                  I disagree though, because the end user has still spent $950 for an additional 30% performance. I find this spending excessive and unnecessary, and still poor value.

                  K, then spec a system with an APU. At around $800 you should be able to spec a 3400G with 1971 GFLOPS. A 3400G is ~$35 more than a 2500X (same cores, similar clocks). A 2070 Super produces 7066 GFLOPS, which is 3.6x more, but at ~$850 you are paying 24x more.

                  And a 3400G performs great at 4K, if you play decade-old games.
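
                  A quick sketch of that reductio in Python (the GFLOPS and dollar figures are just the ones quoted above, not anything I've benchmarked):

```python
apu_gflops, gpu_gflops = 1971, 7066    # 3400G vs 2070 Super (quoted figures)
apu_premium, gpu_price = 35, 850       # $ over a 2500X vs GPU street price

perf_ratio = gpu_gflops / apu_gflops   # ~3.6x the raw throughput
price_ratio = gpu_price / apu_premium  # ~24x the money
print(f"{perf_ratio:.1f}x performance for {price_ratio:.0f}x the price")
# prints: 3.6x performance for 24x the price
```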

                  Yes I also lowered the Windows 10 OS to $40

                  Those keys are usually sold against the terms of use and are technically pirated.

                  I stand by the fact the 2080ti, in any configuration at any price at $1900 or more, is poor value and should be avoided.

                  A Raspberry Pi 4B produces 50 GFLOPS for $75 (28.8 GFLOPS / 300 MHz × 500 MHz) and supports 2x HDMI 2.0. The Broadcom VideoCore VI is worth what, like 20c? $850 worth of VideoCore VIs would produce 212,500 GFLOPS, massively outperforming the 2070 Super while supporting 8500 4K monitors…
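
                  The arithmetic behind that, for anyone following along (naive clock scaling of the quoted 28.8 GFLOPS figure, then equally naive multiplication):

```python
pi_gflops = 28.8 / 300 * 500   # ~48 GFLOPS at 500 MHz, rounded to ~50 above
chips = 850 / 0.20             # ~4250 VideoCore VI chips for $850
total_gflops = chips * 50      # ~212,500 GFLOPS on paper
print(round(pi_gflops), round(chips), round(total_gflops))
# prints: 48 4250 212500
```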

                  But that is dumb, because just like the 2070 Super, the VideoCore VI is not fit for 4K gaming. OP asked us to spec a PC for a list of uses, not to tell them what we have bought or what we find is a good deal.

                  Colouring your suggestions with what you consider value is only ever going to screw over your clients. Listen to what they want, what they are using it for then spend their money to give them that experience.

                  I get you are trying to save OP money. OP would be best served waiting 6 months.

                  (Just as an aside, DLSS works by upscaling. To hit 57 fps in Minecraft RTX, the 2080 Ti has to upscale from between 720p and 1080p. Even a 2080 Ti is not fit for purpose.)

                  • @This Guy:

                    "I have taken off $130 for the excessive PSU, 750W is plenty."

                    "I spec'ed the PSU for a 3900X (225W) and an Ampere GPU (assumed 400W) as OP looked like they were taking this and that from multiple people, and if they bought a 2070 Super they will most likely want to upgrade to the next Titan (assuming release next few months) or a 3080 (assuming release end of year). 1000W leaves 200W available for more HDD's, a PCIe 4x SSD (when they drop in price next year) or even USB add on cards if OP goes crazy with VR peripherals."

                    For your information, you should probably look into how much power components actually use. 1000W would be for several HDDs, water cooling and dual GPUs. If you plug a 3900X and a 2080 Ti into this site, the load power usage is 511W, recommended 560W, so even a 650W PSU should be 'enough'. I went a bit higher at 750W because it's unlikely the system will be at full load all of the time, so it should hopefully be around 350-400W under typical load. PSU calculators are universally accepted; you should give them a try. If you do, and you find an alternative one that suggests higher power usage, I would be very interested. Here's the one I used: https://outervision.com/power-supply-calculator

                    I don't agree with upgrading every year, especially if you've bought a high-end GPU. A 2070 Super should last most people 3 years, maybe 4; you can turn settings down for some triple-A games over time, but right now it can run basically anything at basically maximum settings. Upgrading to a 3080 would be very silly, in my opinion.

                    I have included 2 premium 140mm fans

                    I used Noctua as they don't fail, they perform as claimed and are quite. When included they were $17 a piece. They are rated at 25 dB/A, not 37dB/A like the junk you changed them to. From experience I don't trust that fan's rated pressure. In my experience non Noctua fans fail in ways that make them insanely annoying (tick tick tick tick tick tick tick). The two included case fans will be of similar quality to the Corsair's you like. I also gave recommended positions for the 120mm fans (bottom front) which are not compatible with 140mm fans positions on that case (top front) to give more fresh air to the GPU.

                    Those are Corsair maglev fans. Fair enough, they jumped up in price, so at $17 each, fair enough; it just didn't make sense at $39 each. To each their own. Noctua are obviously very well regarded, and so are Corsair's maglevs, among others; no point dwelling on this.

                    Case compatibility is important; I didn't cross-reference that this time. PCPartPicker is normally good at picking that up (when not using custom parts, which I did).

                    the cooler is likely too small [for a blower GPU cooler]

                    Not an issue if you have two high pressure fans feeding it fresh air…

                    Umm, I haven't seen the data; I think we're getting confused. I meant that the GPU cooler on the MSI card is clearly not the best cooler you could get for a 2080 Ti, so my point was that it should be considered a $2.1k GPU, not a $1.9k GPU; I continued with it as a $1.9k GPU for the sake of argument. It is possible that with enough fans the ambient temperature in the case is cool enough that the GPU cooler wouldn't need to work as hard, but I don't think either of us can say for sure without building and testing the system. So it's better to spend more on the GPU's cooler to be sure it can cool itself regardless of the case and case fans the user goes with. That's my position on it.

                    I disagree though, because the end user has still spent $950 for an additional 30% performance. I find this spending excessive and unnecessary, and still poor value.

                    K, then spec a system with an APU. At around $800 you should be able to spec a 3400G with 1971 GFLOPS. A 3400G is ~$35 more than a 2500X (same cores, similar clocks). A 2070 Super produces 7066 GFLOPS, which is 3.6x more, but at ~$850 you are paying 24x more.

                    I think that's not worthy of a response.

                    So, ignoring the comments about APUs, is there anything else…

                    Colouring your suggestions with what you consider value is only ever going to screw over your clients. Listen to what they want, what they are using it for then spend their money to give them that experience.

                    So, based on experience and knowledge, recommending a good-value product from your perspective and the perspective of tech reviewers in the space is all you can do. You and I can't separate what we think value is, so it's a pointless argument. You suggest there is value in adding a 2080 Ti vs a 2070 Super; I do not see the value in it. $950 for 30% more performance isn't worth it, in my opinion; it is in yours. That's why I went through and demonstrated the differences using essentially your data.

                    I get you are trying to save OP money. OP would be best served waiting 6 months.

                    Yes, I agree they should wait 3-6 months for either Big Navi or the 30 series, or really both, not to mention Ryzen 4000 and Intel's 10th Gen (again, probably both). Revisiting this with a 3080 and a 4700X will be really interesting.

                    Lastly though, a 2070 Super isn't a budget GPU; it's an awesome GPU. It can handle modern games very well, and I think most people would buy it and be satisfied with its performance for the applications the OP sets out. So if the OP or someone else wanted to spend more on a 2080 Ti, by all means; it's a free country, and people should buy what they want. But they should also understand the cost and value, based on the opinions of others who have researched these components, knowing the FPS for the games they want to play and any other applications they intend to use it for.

                    • @conza: First, we stopped being helpful to OP after your first reply. This is purely fun for us.

                      so my point was that it should be considered a $2.1k GPU not a $1.9k GPU

                      For $200 you could buy 5 Noctua fans at their top price… You have no idea what you are doing. Further proof:

                      Noctua are obviously very well regarded, so are Corsair's Maglevs

                      Yeah, no. More:

                      For your information, you should probably look into how much power components use, 1000W would be for, several HDDs, water cooling, dual GPUs, if you plug in a 3900X and a 2080ti into this site the load power usage is 511W, recommended 560W so even a 650W PSU should be 'enough' but I went a bit higher at 750W because it's unlikely the system will be at full load all of the time so it should be hopefully around 350-400W under typical load

                      I gave you numbers. A 3900X can draw just under 225W on air. Ampere is quoted at up to 400W. X570 boards can draw 75W (rated for 140A; they can push double). Both SSDs and HDDs can get as high as 20W. That is ~740W. Yes, I round up. The PSU is critical and you cannot under-spec this part.
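
                      Totalling those worst-case figures directly (the per-component numbers are the ones quoted above, not measurements of OP's actual build):

```python
# Hypothetical worst-case power budget (W), using the figures quoted above
budget = {
    "3900X (air)": 225,
    "Ampere GPU (assumed)": 400,
    "X570 board": 75,
    "SSD": 20,
    "HDD": 20,
}
total = sum(budget.values())  # 740 W
print(f"{total} W worst case, {1000 - total} W headroom on a 1000 W unit")
```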

                      A good case fan draws ~1W. Corsair's XD5 pump only draws 30W. That's 33W. Like I said…

                      1000W leaves 200W available for more HDD's, a PCIe 4x SSD (when they drop in price next year) or even USB add on cards if OP goes crazy with VR peripherals.

                      And that is ignoring the first rule of power ratings: they are always a load of rubbish.

                      /end proof.

                      I think that's not worthy of a response.

                      Nice. Then you still try to sell the 2070 Super, which is unfit for this purpose. Pick one.

                      So based on experience and knowledge, recommending a good value product from your perspective and the perspective of tech reviewers in the space, is all you can do.

                      Pull your head out of your arse. You and I are no one to the OP. Be a decent person and help them with what they asked for. OP has more money than you. Keep working and you can buy frivolous stuff too.

                      This isn't about value, it is about a solution that is fit for purpose.

                      Lastly though, a 2070 S isn't a budget GPU, it's an awesome GPU

                      It is a poor product. It is great for 3440x1440 gaming, but it is overkill for 2560x1440 and is not fit for 4K. Its key feature, ray tracing, has to be upscaled ridiculous amounts to work at high resolution. The 2080 Ti is a poor product too, but…

                      Both your opinion of a graphics card and mine are irrelevant; we are meant to be posting here to help OP achieve their goals.

                      But you don't get that, or have a clue what you are doing, so I look forward to you continuing to ask OP to install a wind tunnel and a metronome in their house while further justifying your purchase of a 2070 Super to me and ignoring how unfit it is for the OP's original stated goals.

                      • @This Guy: OK, wrapping up; I'm going to skip several things. Firstly, I don't see value in hyperbole like comparing a 2070 Super to a Raspberry Pi, or stating it is somehow unfit for purpose when it clearly is fit, so I'll just ignore odd comments like that.

                        More case fans or a better GPU cooler: pick more fans if you like; they'll cool everything (motherboard, CPU cooler, GPU, storage drives), sure. But again, you and I haven't tested 5x Noctua fans and a blower card (or a cheaper axial-fan cooler) vs. a higher-end GPU cooler. I have, however, seen that blower cards struggle to cool big GPUs, and the further away you are from the GPU die the harder it is to keep it cool; and as we know, GPU temperature matters for getting the most out of these modern cards.

                        Why not use PSU calculators? I'm not going to dispute the individual part calculations. Minor anecdote: I just watched a video where a 3970X under a stress test was pulling 270W, so I find it a bit unbelievable that a 3900X would pull anything more than 200W. (Checks link.) So that's 216W power draw for the system with a 3900X? It says system power draw, not the draw from a clamp on one of the PSU cables or from a probe, so the processor itself has to be less than 216W? What am I missing here? Again, I just can't see how all the PSU calculators could be wrong. Anyway, I've used several of them; they seem to be fine.

                        The 2070 Super performs about 5% above a 1080 Ti; it's decent for:

                        1080p (overkill), 1440p (very high end), 1440p ultrawide (great), 4K (very good), VR (excellent; prove me wrong, I haven't seen a bad result, but if they're out there, go for it).

                        I don't see why you continually personally attack me; this is just an academic discussion. It doesn't help your argument, nor does it hold water, since I do know what I'm talking about, and you appear to, for the most part, as well; that's why the discussion is continuing(?).

                        OP has more money than me? Haha, pointless nonsense; you have no idea how much money he or I have, just what he's willing to pay. That doesn't mean you spend it all willy-nilly, so I don't see the point in comments like that. A waste of money is a waste of money whether you're putting your life savings toward it or you make $10k a second. Buy what you need; overspending is overspending.

                        • @conza:

                          you and I haven't tested 5x Noctua fans and a blower card

                          PM me your mobile# and I will send you some pictures.

                          So that's 216W power draw for the system with a 3900X? What am I missing here?

                          Kind of proving my point about why PSU calculators are rubbish. A 90+ efficiency PSU, with the chipset, storage and GPU idling, and you want me to believe they are consuming 90W?

                          Again I just can't see how all the PSU calcs could be wrong.

                          Most manufacturers play fast and loose with their rated numbers, which doesn't matter if your PC only consumes 200W. The reason I recommended SeaSonic over Corsair, even though I am currently running a Corsair PSU, is that some of their cheap models used to be built in factories known to produce poorly designed PSUs. Corsair still has some employees who make stupid decisions (like calling out a youtuber), and I won't risk someone else's money with a bad general suggestion.

                          2070 S. Performs about 5% above a 1080ti

                          The 1080 Ti is 20% faster (10609 GFLOPS vs 8218 GFLOPS) with 3GB more memory. However, its drivers are often not as optimised as the 2070 Super's. I have never heard of a 1080 Ti being recommended for more than 1440p.

                          prove me wrong

                          1080 Ti https://www.anandtech.com/bench/product/2140

                          In 4K GTA V, a seven-year-old game, it gets 51 fps, with only 40 frames on time.

                          2070 Super https://www.anandtech.com/bench/product/2516

                          In 4K GTA V it gets 47 fps, with only 27 frames on time.

                          In this cherry-picked example the 1080 Ti produces 8% more FPS and 48% less stutter.
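
                          Deriving those two percentages from the Anandtech numbers quoted above:

```python
fps = {"1080 Ti": 51, "2070 Super": 47}        # 4K GTA V average FPS
on_time = {"1080 Ti": 40, "2070 Super": 27}    # frames delivered on time

fps_gain = (fps["1080 Ti"] / fps["2070 Super"] - 1) * 100              # ~8.5%
stutter_gain = (on_time["1080 Ti"] / on_time["2070 Super"] - 1) * 100  # ~48%
print(f"{fps_gain:.1f}% more FPS, {stutter_gain:.0f}% more frames on time")
```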

                          Find your own examples to prove yourself right.

                          Don't see why you continually personally attack me

                          I am trying to teach you to listen to your client

                          since I do know what I'm talking about.

                          You don't

                          overspending is overspending

                          Just because you don't find value in something doesn't mean others won't.

                          • @This Guy: Go get an education; watch Gamers Nexus. As soon as you said I don't know what I'm talking about, you lost me. Just another internet troll.

                            • @conza: Watching a few Tech Jesus videos doesn't make you an industrial designer. I'm not one, but I am competent enough a PC builder to know that noise, reliability, airflow and use case all matter.

                              What will you tell OP when they complain their PC sounds like a jet engine?

                              How are you going to remotely diagnose that OP's GPU is thermal throttling after you removed the GPU's intake fan because 140mm fans are "better"?

                              What are you going to say to OP when they complain of a ticking noise coming from their computer a year from now?

                              Are you going to say sorry for wasting $250 of OP's money if they need to buy a completely new power supply because they decided to go all in on VR and added a ton of accessories?

                              Learn to find out what you don't know and you will be able to build amazing machines. And always respect the client's needs. But I am the troll…

  • Moving to a smaller fab process doesn't increase performance; it's the new architecture that normally accompanies it that is responsible.

    New GPUs will have lower heat output, and lower costs of production as you can get more products out per wafer.

    Nvidia already have lower heat and power consumption compared to AMD GPU even with their "old" 14nm GPU architecture

    Judging by the experience of Intel and AMD, they may also have problems with lower maximum clock speeds and lower yields. The benefits of a smaller fab process are just marketing at this point.

    • "Nvidia already have lower heat and power consumption compared to AMD GPU even with their "old" 14nm GPU architecture"

      Citation Needed

      • It is pretty common knowledge that Nvidia cards use less power than their AMD counterparts.

        https://www.guru3d.com/articles-pages/msi-radeon-rx-5700-xt-...

        Not exactly a controversial statement, it has been the case for years

      • This has been the case for at least the last few generations of cards. I use AMD myself, but they do consume more power and run hotter.

        e.g.
        https://www.tomshardware.com/features/amd-vs-nvidia-gpus
        "Prior to AMD's Navi, at least for the past six or more years, the competition between AMD and Nvidia in terms of GPU power efficiency was decidedly in favor of Nvidia. But Navi changed all that, right? Using chips built with TSMC's 7nm FinFET process and a new architecture that delivered 50% better performance per watt, it could close the gap. Except, it was so far behind that even a 50% improvement didn't fully address the efficiency deficiency."

    • Literally everything you said is wrong; I don't know where to even start.

  • Does Nvidia have open source graphics drivers yet or do you still have to use that wrapper via dkms?