• out of stock

[eBay Plus] Intel Core i9 10900KA 3.70GHz CPU (No Heatsink) $629.10 Delivered @ Computer Alliance eBay

730 votes
Coupon: PLUSJY10

Intel S1200 Core i9 10900KA 3.70GHz 10 Core CPU BX8070110900KA Marvel's Avengers Collector's Edition Packaging

Warranty: 3 Year Australian Warranty & Support

Original Coupon Deal

Related Stores

eBay Australia
Computer Alliance

Comments (closed)

  • This or AMD 3900X for productivity, CAD and photo editing?

    • +2

      If Adobe only, then the 10900K; otherwise the 3900X

    • +29

      Either. You're overthinking it.

      • OK thanks. The 3900X comes with a cooler for $750; what cooler should I get for the i9 if I go with that? It seems a good Noctua one is $150, which makes the i9 about the same price.

        • +1

          It's $110 perpetually on Newegg for the D15

    • +2

      whichever is cheaper

    • +2

      Go the AMD and you'll have the option of upgrading to a 5000-series down the track without changing motherboard.

      • +5

        Isn't it the same for Intel (Rocket Lake compatibility with Z490)?

        • +1

          Gen 11 tops out at 8 cores @ 250W, while the 5950X has 16 cores at 150W.

          3900X -> 5950X: 50% more performance.
          10900K -> 11900K: not really an upgrade for productivity loads.

          • +4

            @Cloudream: wtf, are they really going down to 8 cores from 10 for the 11th gen?

            Might put an order in for a 5900X if that's true. Intel's going downhill fast.

            • +2

              @Axelstrife: Yes, a larger core and cache design, so they had to cut 2 cores; otherwise there'd be too much heat, because it's still 14nm on the desktop platform.

          • +2

            @Cloudream: 250w for 8 cores? wtf?

            • +3

              @BargainKen: That's the power of 14nm + a larger core design. Zen 3 uses >60% less power to do the same calculation.

              • @Cloudream: They don't think very much of their consumer market. I think all the 10nm they do make goes to server parts, and some capacity is contracted out to TSMC for 7nm. Must be where the real money is.

      • +2

        Upgrade from a 3000 to a 5000 a few years later, when both have been superseded. Great advice.

        • You'll get a good return on investment; the 5000 series are some serious CPUs. CPUs don't become outdated as fast as graphics cards.

    • whichever one is cheaper

  • +3

    Damn, the 10900K has really fallen in price. It's even lower than the 5800X and the 3900X now.

    At this price, it's really good value.

    • Z590 and 11900K are just around the corner ;)

      • +1

        11900K: 2 fewer cores and slightly worse multi-core performance.

        • +2

          You are sacrificing multi-core for single-core performance.

          Leaked benchmarks are showing an 11% increase in single-core and a 12% decrease in multi-core performance when comparing the 10900K to the 11900K at identical frequencies.

          Overall, I would be willing to take that sort of trade-off. For gaming and general productivity, the 11900K would be the better CPU.

          The question is what the pricing will be like. If they end up undercutting the 5800X, it would be a no-brainer to get one. By lopping off two cores, die complexity would drop enough that yields are likely to be higher on the 11900K, so pricing may well be quite competitive.
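
          As a very rough illustration of that yield point (every number here is a hypothetical placeholder, not Intel's actual die size or defect rate), a classic Poisson yield model shows how a smaller die yields better:

          ```python
          import math

          # Poisson yield model: Y = exp(-A * D), where A is die area (cm^2)
          # and D is defect density (defects/cm^2). All figures are made up
          # purely to illustrate the direction of the effect.
          defect_density = 0.1   # assumed defects per cm^2
          area_10_core = 2.1     # hypothetical 10-core die area, cm^2
          area_8_core = 1.8      # hypothetical 8-core die area, cm^2

          yield_10 = math.exp(-area_10_core * defect_density)  # ~81.1%
          yield_8 = math.exp(-area_8_core * defect_density)    # ~83.5%

          print(f"10-core die yield: {yield_10:.1%}")
          print(f"8-core die yield:  {yield_8:.1%}")
          ```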

          • +1

            @FuRyZ: The 11% increase is compared to gen 10, which puts it just on par with the 5800X (with the curve adjustments coming in the new AGESA BIOS).

            But:
            - It requires better cooling, a beefier motherboard VRM and ~100W more power supply headroom, so the combined cost is probably the same if not higher.
            - There's no upgrade path; imagine being able to upgrade to a 5900X for ~$200 after selling the 5800X.
            - X570 can provide 3x PCIe 4.0 M.2 slots if you need them in future for productivity loads.

            It's probably a good choice if it's indeed cheaper. Let's hope Intel can release something good so AMD CPUs become cheaper ;-)

            • @Cloudream: I wouldn't say the 5800X is on par; it depends on the workload you're looking at. The 10900K is still the best CPU for gaming today. As for everything else, 100% go the Ryzens.

              • I do agree on power draw, but realistically, efficiency is not something most people care about in a high-end gaming system.
              • Upgrade paths have been a headache with Intel for a while; I think most people are simply used to that by now.
              • For real productivity systems, you wouldn't get the 11900K anyway, so I'm not sure most people will care for three 4x M.2 slots. Rocket Lake will have 20 PCIe 4.0 lanes, similar enough to the 24 lanes on Ryzen. We don't know much about how many lanes the Z590 chipset will provide or how they're divided up, so I won't speculate. Ultimately, it's not all that important; even today's highest-end graphics cards haven't been able to saturate PCIe 3.0, much less the doubled bandwidth of PCIe 4.0 (rough spec maths in the sketch after this comment). We are not at a stage where PCIe bandwidth is that big a concern.

              Like I said, it will come down to price. Intel have been squeezed into a corner; the big drop in 10900K pricing reflects that. The 11900K is at best a stalling tactic to hold onto the gaming market, where Intel still holds a minor lead. And because of that lack of direct competition, AMD will likely continue their recent price gouging, much as Intel have done for years.
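
              For the curious, the PCIe bandwidth figures are just spec maths for an x16 link with 128b/130b encoding (a back-of-envelope sketch, not a benchmark of any particular card):

              ```python
              # Theoretical PCIe x16 bandwidth: transfer rate in GT/s per lane,
              # scaled by the 128b/130b encoding used by PCIe 3.0 and 4.0.
              def x16_bandwidth_gb_s(gt_per_s: float, lanes: int = 16) -> float:
                  return gt_per_s * (128 / 130) / 8 * lanes  # bits -> bytes

              print(f"PCIe 3.0 x16: {x16_bandwidth_gb_s(8):.1f} GB/s")   # ~15.8
              print(f"PCIe 4.0 x16: {x16_bandwidth_gb_s(16):.1f} GB/s")  # ~31.5
              ```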

    • It's to justify the horrible power efficiency and buy enough time for Intel to come back with 10nm. Personally I don't like that it runs so hot and sucks so much power.

      • +2

        Personally I don't like that it runs so hot and sucks so much power.

        Ehh, I've never bought into the power argument. Calculate the time your PC is at full load and figure out how much more power this is using vs. something like a 5900X. It'll probably be like $20 per year. Also, the 10900K runs really cool, so idk where the "hot" argument comes from.

        • +1

          I've been following PC tech for ~15 years and this is the first time I can remember so many people caring about performance per watt or TDP, except maybe the Bulldozer release. Apparently CPU power consumption is a big deal when you're gaming for a couple hours a day at 50% CPU usage.

          I just don't understand tribalism in tech.

  • -5

    if only it came with a heat sink

    • I don't think a stock cooler is best for this CPU

    • Why? You'd need a better one

    • this is dumb

    • -2

      wow, so no one thinks I could have been making a joke… lol

    • -2

      I think it means you don't need one

  • +3

    Great chip, great price.

  • +3

    No heat sink? They mean no cooler?

  • +3

    Was waiting ages for an AMD 5600X. Over it, got one of these. Way better.

    • What motherboard did you get?

      • +1

        I'm going with the ASRock Z490 Taichi; because of my case the board is on display, and I like the look of it.

        • 👍

  • Does it come with the fan at least?

    • +2

      no, my advice would be a good aftermarket cooler

    • +1

      you should never use these with stock fans anyway

  • -8

    With this much spent I'd rather go with 5800X.

    • +3

      you can't get one though

      • -2

        I don't mind waiting, but each to their own.

      • +3

        The 5800X is regularly in stock.

    • Lol

    • I don't think anyone here asked

    • Sure, go the povo option lol

  • Definitely great value!

  • -2

    Do not buy, it uses too much electricity.

    • +5

      Does anybody seriously consider power consumption for general use in first world countries? Honest question.

      I'm still using a 3770 for general + gaming, although I have a pending pre-order for a 5900x.

      • +1

        Don't think so, though they would consider the thermal output for cooling requirements, especially in smaller builds.

        • +1

          Which in turn is directly related to power consumption. More power in = more heat out, to my knowledge.

        • Yeah, it's definitely form factor/case dependent. Smaller builds are more sensitive to thermal requirements.

          Looks like I triggered a few people for even asking the question. Maybe I should've asked: if you can get more or equal performance for a lower RRP, for an ATX desktop build, how much does power consumption really sway your decision?

          I'm not saying power consumption means nothing, but assuming the product can be appropriately cooled, surely there are higher priorities than power use?

          • +4

            @Timboj: Power consumption is generally not going to be someone's highest priority, but boy would I love it if my 9700K + RTX 3080 system didn't raise my room's ambient temperature by up to ~5°C.

      • +1

        Yes, of course they do.

      • +5

        Does anybody seriously consider power consumption for general use in first world countries? Honest question.

        No, it's just an argument by fanbois of each side when their side has slightly better power consumption numbers. FWIW, I own both high end Intel and AMD chips and have measured their power consumption under load. The difference is around 30-40W with the same GPU.

        This means that for every hour your PC is under load (because that's the only time you'll see the power difference), you'll draw 30-40W more; let's say 35W on average.

        Do some simple maths. If your PC is under load for 4 hours per day, that's around 140 Wh of energy per day, so 51,100 Wh per year, i.e. 51.1 kWh. The average price of electricity in VIC (similar in other states) is around 27c per kWh.

        Therefore, if your PC is under full load for 4 hours per day, the difference in your electricity bill is $13.80 per year. Even if the difference was 100W (which it isn't), it would still be less than $30 per year. FWIW, most people don't even put their PC under full load for 4 hours per day, so the difference is likely moot. The difference was practically non-existent in gaming loads.

        Hopefully we can put the power argument to sleep permanently. Again, I'm not an Intel or AMD fanboy; the point is just that discussions about power consumption are so far removed from reality that it's worth converting the difference to dollars saved or spent per year, rather than quoting some arbitrary wattage, let alone taking random shots about "too much electricity".
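
        If you want to sanity-check the maths, here it is as a tiny script (the 35W, 4 hours/day and 27c/kWh figures are the assumptions from above; tweak them for your own usage and tariff):

        ```python
        # Annual cost of the extra power a hungrier CPU draws under load,
        # using the figures assumed in the comment above.
        extra_watts = 35        # average extra draw under load (W)
        hours_per_day = 4       # time at full load per day
        price_per_kwh = 0.27    # AUD per kWh (approx. VIC average)

        daily_wh = extra_watts * hours_per_day        # 140 Wh/day
        yearly_kwh = daily_wh * 365 / 1000            # 51.1 kWh/year
        yearly_cost = yearly_kwh * price_per_kwh      # ~$13.80/year

        print(f"{yearly_kwh:.1f} kWh/year -> ${yearly_cost:.2f}/year")
        ```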

    • The Cryo cooler for this chip from Cooler Master draws another 200W.

      Gotta melt some icebergs to create one on your CPU, I guess ;D

    • Not if you use solar panels.

  • -5

    14nm++++++++++++++++++++++++++++++++++++++++?

    what a joke

    • +1

      11th Gen desktop CPUs are on 14nm too I believe

    • +1

      your joke is getting fking old now

  • I think a 20% off sale is coming up soon. If you don't need this now, wait a bit longer.

    • Why do you think this is the case? Serious question, planning to pull the trigger on something like this soon

  • +2

    Bought an i5 10400F for $210 from MSY for gaming; will see what the i7 and i9 prices are like in a year or two.

    • I did the same thing

  • +1

    Is there any actual overclocking benefit of the 10900k over the 10900F? Legit question.

    • The Silicon Lottery reports slightly better chances of getting a higher overclock on a 10900KF, but personally I wouldn't pay more than like $10 extra for that chance.

  • Which mid-range motherboard with WiFi and 4 RAM slots would the hive mind recommend? Ideally under $250. Thanks.

    Also, which 750W PSU?

    • Check the PSU and motherboard tier list posts on the Linus Tech Tips forums (they'll come up if you Google those terms)

  • -1

    Seriously, why can't Intel just include A cooler, ANY cooler in the box? AMD does it

    • +1

      Not always. The XT versions don't (e.g. 3800XT).

    • Because a stock Intel cooler would be useless for this. If anything it might cause damage to the chip, since it can't keep the 10900K under control and the user might be lulled into thinking it's OK.

      AMD's included Wraith coolers can actually keep up with their chips decently unless you push them hard.

      • +1

        Because a stock Intel cooler would be useless for this.

        Why would Intel give you a cooler that is useless?
        If they were going to include one, why wouldn't they include one that works?

    • Because if people use those coolers and get a bad experience they'd blame Intel…?

    • AMD does it

      Lolwut, get your facts right. Of the current lineup, the 5800X, 5900X and 5950X do not come with coolers…

  • Just a question: what are the main differences between this one and the i9-10900F?

    • +3

      F suffix = no integrated graphics
      K suffix = chip can be overclocked

    • +3

      F = no internal GPU
      K = unlocked multiplier
      A = Avengers :)

      • A = Avengers Box Art (for some reason they couldn't get the deal through to bundle a copy of the game with it)

    • +4

      The 10900F has the embedded graphics disabled, so you won't be able to use the motherboard's built-in HDMI port. Integrated graphics can sometimes be useful if you're trying to diagnose a fault with your GPU.

      • +1

        Do you have that many GPU faults that having embedded graphics is really a good thing?

        • +2

          Huge help when troubleshooting my last GPU swap (nightmare day which would not have ended happily if not for that). Can also be handy if you ultimately put the CPU to some unforeseen use (e.g. when my 3770 went from my main rig to a secondary box, saved me having to get a GPU). Another use I've had for it is during the set up period for a new system prior to swapping over and transplanting the dedicated GPU from the existing rig.

          So if the difference is only, say, $30-$40, I'll probably take the iGPU, and at $0 it's a no-brainer!

    • How come this particular model is not in Passmark?

      https://www.cpubenchmark.net/high_end_cpus.html

      • It's just a K with a lame box

        • You're confusing F (no iGPU) with A (Avengers box). The F CPU is supposed to run slightly cooler since the GPU on the chip is disabled.

          • @Agret:

            You're confusing F

            No I'm not.
