eBay 20% off Sale - The Usual Price Jack Has Started

Surprise surprise… the price jack has begun.

https://www.ebay.com.au/itm/Gigabyte-AORUS-GeForce-GTX-1080T…

I added this to my watch list on Saturday when it was $1190. Today, it's $1309.

Is this misleading conduct by the seller - inflating prices prior to a sale? I'm sick of this 'sale' bullshit. Is this something Consumer Affairs or the ACCC would be interested in?

Edit: seems Kogan got done for a similar practice so maybe it is something the ACCC would like to know about!

Edit 2: Here's the link to report it to ACCC: https://www.accc.gov.au/contact-us/contact-the-accc/report-a…

Edit 3: Company name: Austin Computers, ABN: 69780893412 - get reportin'!

EDIT 4: ARE YOU KIDDING ME, THEY RAISED THE PRICE AGAIN! IT'S NOW $1367.

Edit 5: As @nickxau pointed out, it's an obscene $1482 now, almost 25% higher than it was on Saturday.
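
A quick sanity check on those jumps (just arithmetic on the prices listed above; note that 20% off the final price lands almost exactly back on Saturday's price):

```python
# Percentage increase of each observed price over Saturday's $1190.
saturday = 1190
for price in (1309, 1367, 1482):
    print(f"${price}: +{price / saturday - 1:.1%} vs Saturday")

# And what the "20% off" actually delivers on the jacked price:
print(f"20% off $1482 = ${1482 * 0.8:.2f} (Saturday's price was $1190)")
```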


Mod: We have removed the deals that became unobtainable due to a price increase after being posted and merged them below (5 in total). Note that many were posted before this thread was made, so they will appear at the top of the comments section below. See original mod comment and discussion here. Please note that other unobtainable deals (6 in total), which were removed because they combined an existing sale that finished before this 20% off sale started, weren't merged (not related to a price increase, and never possible/available).

Update 19 Jan 11:40am: The eBay deal posting guidelines have been updated to reflect a new requirement: deals requiring coupon codes can only be posted on the day the coupon becomes live.


Comments

  • Merged from LG 55" OLED TV - OLED55B7T $1671.40 Delivered @ Videopro eBay

    Lowest price ever for this TV if the price doesn't get jacked up before the sale starts.

    Seems there are only 15 left; hopefully they add more stock.

    Original 20% off Selected Stores @ eBay deal

    Mod 15/1: Price has increased to $2055.40 after the discount code (regular price before shipping was $1993, now $2473). Therefore this deal will be removed at 16/1 10am, when the sale starts, as an unobtainable deal, unless the price reverts (highly unlikely).
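
    For anyone checking the mod's numbers: both delivered prices are consistent with 20% off the item price plus a flat shipping charge of about $77 (the shipping figure is inferred from the two quoted totals, not stated in the listing).

    ```python
    # Back out the delivered prices: 20% off the item price plus flat shipping.
    # The ~$77 shipping figure is inferred from the quoted totals, not stated.
    shipping = 1671.40 - 1993 * 0.8   # = 77.00
    for item_price in (1993, 2473):
        print(f"${item_price} item -> ${item_price * 0.8 + shipping:.2f} delivered")
    ```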

    • +19

      Good price but it will most likely be jacked prior to the sale knowing Videopro.

      • +1

        It may be possible to click commit to buy, and then in three days add the coupon at checkout and pay then.

        • +1

          Unfortunately not possible as it requires immediate PayPal payment to check out.

          • @pw2002au: Add to cart?

          • +4

            @pw2002au: @Stix: does nothing … if they put up the price, you just get a 'seller has revised the listing' message.

    • +9

      LG Australia clearing them out before the B8, C8, E8, G8 and W8 launch in March/April with the new Alpha 9 processor (Alpha 7 on the B8)?

      https://www.youtube.com/watch?v=ih9gXQCpWnQ
      https://www.youtube.com/watch?v=eBpL4AqLfgg

      A great deal if Videopro don't raise the price.

      • Is the processor the only upgrade? Hoping for HDMI 2.1.

        • See my post shortly below this one for more info; the short answer is there's no HDMI 2.1, but there is 120 Hz support.

          • @jasswolf: Is it really getting 4k/120hz support? Because the current model already supports 1080P/120hz.

          • +1

            @jasswolf: @G3ck0:
            The only way TVs can display 4K at 120Hz is through HDMI 2.1. No TVs were going to get it this year, and we'll be lucky to see it next year - probably only the high end and flagships, by the looks of it. The max HDMI 2.0a/b can carry is 60Hz (60fps) at 4K. Jasswolf is either saying 120Hz for 1080p or has no idea what he's talking about, as you need a connection to the TV that can actually carry that huge amount of bandwidth. It doesn't have a DisplayPort 1.4 connection, so there is no way of getting 4K/120Hz in any scenario.
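
            A rough bandwidth check backs this up (a sketch only: it assumes uncompressed 8-bit RGB and ignores blanking overhead, which adds more on top):

            ```python
            # Raw video bandwidth vs. what each HDMI generation can carry.
            def video_gbps(width, height, hz, bits_per_pixel=24):
                return width * height * hz * bits_per_pixel / 1e9

            print(f"4K @ 60Hz:  ~{video_gbps(3840, 2160, 60):.1f} Gbps raw")
            print(f"4K @ 120Hz: ~{video_gbps(3840, 2160, 120):.1f} Gbps raw")
            print("HDMI 2.0 data rate: ~14.4 Gbps (18 Gbps link)")
            print("HDMI 2.1 data rate: ~42.7 Gbps (48 Gbps link)")
            ```

            4K/60 squeezes under HDMI 2.0's limit; 4K/120 is well over it.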

          • @jasswolf: @Monstalova:

            Actually, it (referring to A7 processor) does support 4k at 120hz via HDMI, but not if you also have HDR. It's either HDR or HFR via the HDMI port. But it DOES support 4k + HDR + HFR for streaming.

        • You are a year too early.
          We won't see 2.1 on devices till 2019 (or maybe late 2018).

          • @systmworks: OT: Denon is offering to swap out the board in their new Denon X8500H when it launches next month. You'll send it in to your local service center, where you pay for the HDMI 2.1 board upgrade:

            https://www.whathifi.com/news/denon-launches-x8500h-132-chan…

            Saves buying a new AV Receiver. YMMV.

          • @systmworks: Yeah, I half expected to hear that. I'm mainly excited for its frame-syncing capabilities. Not having screen tearing in console games would be fantastic, since it is so rife there.

            Similarly, I've been waiting for DisplayPort 1.3 to be put into PC monitors since it was approved, but it's been almost a year, I believe. These things take a long time.

    • +1

      What's the sound quality like on this or do you still need a sound bar?

      • +20

        If you're spending $1,600 on a nice 55" OLED, get some proper speakers.

      • Agree with above - if you are buying the best picture technology - don't stick with the worst sound quality (built in speakers).

        Nothing beats a true 5.1 (or better, like 7.2.4) surround speaker system which can be bought for a few hundred dollars (entry level - less than some high end sound bars) - but a good soundbar is much better than any TV speakers.

        • +2

          Better to start with quality 2.0 speakers then work your way up slowly.

          • +2

            @haru: When it comes to audio, go with the best thing you can afford at the time.

            Unlike TVs, speakers aren't obsolete every few years. Good speakers will last you decades. You'll just need to periodically upgrade your receiver (if you want to utilise new connections and other technologies).

          • +2

            @haru: @O15: Agreed.. my 6 B&W speakers (2 floorstand fronts, centre, rears, sub) are around 18 years old now and still sound/work like new.
            The front 3 have kevlar cones so they may in fact be bulletproof.. :)

            I've only considered replacing them for a cosmetic change from plain black to walnut or cherry wood - but it would cost me thousands.

          • @haru: @systmworks:
            Have you looked into having timber enclosures built for them? Something to slide the existing unit, as is, into.
            I've seen this done and it can look quite good.

          • @haru: @O15: no I have not thought of that.. I guess you need some sort of thin buffer so the speaker boxes dont vibrate against the enclosure.
            Do you know of any website links ? If not I can try Google..

          • @haru: @systmworks:
            Haven't seen any, on the web, sorry. I would recommend talking to local carpenters.
            My parents had a wooden cabinet made for their stereo about 30 years ago. It's reasonably tall, as it houses several separate components. Back then everything was its own unit: the amplifier, tape deck, etc. They also had two wooden floor-standing boxes to sit the pretty ordinary-looking (but great-sounding) 9" speakers into lol. Each box has a square speaker mesh insert which slots into the front. It's quite an impressive sight.

            Knowing what they've paid for some of their custom wooden furniture, in recent years, it probably wouldn't be cheap.
            However, if you like the sound of your existing speakers and are serious about changing the aesthetic, I'm sure it would be much cheaper than new high-end speakers.

          • +1

            @haru: @systmworks:

            My B&W DM220's are 30 years old and still going strong. Best value hi-fi I've ever bought.

          • +2

            @haru: @O15: rocking a set of Logitech z5500 for the past 10± years

    • Seems like an amazing price!

    • +13

      Fairly confident this price will hold, but for those wondering about the 2018 models, they're iterative updates (again) that will come with an updated version of WebOS that supports Google Assistant (Alexa via download).

      They'll support 120Hz video, but will have no means to retrieve such content other than WiFi/USB/Streaming (ie. no HDMI 2.1, at least in the firmware configuration at launch).

      The biggest changes appear to be the panel tweaks, beefed up image processing (though less so in the B8 models), and increased economies of scale for 65" panels.

      So it's probably a case of waiting until the 2018 models drop a bit from launch prices (yet to be confirmed, but it's hard to see them starting far north of $2000) and picking up a C8, unless you're a PC gamer who foresees having the hardware for 4k/120 fps in the next 3-6 years, in which case you'll want to wait for HDMI 2.1 and (presumably) VRR.

      My personal recommendation is to pick up an interim mid-range 4k LCD and wait it out for either new OLED panel technology or MicroLED tech in 2019-2020 as there seems to be a whole host of format wars for picture quality and refresh rates that need to settle down a little so the hardware specs can be ironed out. In particular, modular MicroLED panels look like the best way to future proof if you can position yourself to buy 2-3 years from now.

      • +23

        Or wait forever.

        • +1

          Hahah I suppose.

          What I'm saying is that MicroLED is about as cream of the crop as most people are ever going to want from a TV screen, and modular panels that connect up with each other will allow people to upgrade to 8k over time, at which point you're not even going to be able to spot the pixels on screen even if you press your face against it, at least in the 100" panels people will have stuck to their walls.

          At that point, we'll have pretty much reached a saturation point for today's type of TV, because image processing and panel brightness/luminance aside, it will be virtually indistinguishable to the human eye.

          We seem to be reaching that point in a lot of standard use cases for audio-visual technology today, with the outlier being 3d/vr/ar graphics.

          • -3

            @jasswolf: Actually, no GPU at this stage can handle 4K @ 120Hz. Even if you go SLI, many games don't support it.
            I think the only feasible part of your discussion at this stage is HDMI 2.1. I believe panels this year will be released with HDMI 2.0b; however, I reckon HDMI 2.1 will be released at the end of the year, and that will be a massive game changer!
            Cheers

          • @jasswolf: @vinni9284: I wasn't aware that most people bought televisions only to replace them in two years, so thanks for the clarification.

            This year's Geforce 2080 (or equivalently named model) will push past 4k/60fps, and the refresh will likely be a 4k/120 fps card.

          • @jasswolf: @jasswolf:
            You're welcome!

            I think TVs are becoming consumables, like shopping LOL.
            I mean, this TV (its previous versions) was $3500 a couple of years ago. Now less than $1600!
            You buy one every two years and flog off the old one on Gumtree/eBay.
            Many do it. Have a look at the classifieds.

            Edit: I reckon the 1180 (not 2080) will still struggle… It will only play 4K games @ 60Hz on max settings.
            Nvidia/AMD are too smart … they are not going to market a GPU with 10 times the performance of the previous gen in one go. They want you to continuously spend on their iterations to get there LOL

            By the time you talk about this new tech, 8K panels will be available.
            Then GPUs will lag behind TVs/monitors as usual.

            Cheers

          • @jasswolf: @vinni9284: I was joking.

          • +1

            @jasswolf: @jasswolf: people replace phones every 2 years or less, and prices are similar to this.

          • +1

            @jasswolf: @jasswolf:

            LOL .. OK … sorry; it's hard to tell a joke from text :-)

          • @jasswolf:

            they are not going to market a GPU 10 times the performance of previous gen in one go.

            Possibly because they don't market what they can't make. Unless… are you aware of some secret conspiracy between AMD/Nvidia to deliberately hold back R&D outputs?

          • @jasswolf: @wxyz234: smartphone sales are actually slowing because the last 3 years of phones have managed to satisfy what most people want from the form factor.

            The challenges for phone manufacturers going forward are to dramatically increase battery life for high-performance tasks, to introduce new form factors that increase productivity on the device (without cannibalising the market for their other device types), and to deliver improved GPU and CPU performance to facilitate all of this.

            Otherwise, people will start hanging onto phones for 3+ years, especially given the current plan prices vs. going SIM only until your phone dies/slows.

          • +1

            @jasswolf: @jasswolf:

            Yeah, I bought a Note 8 recently and while it's a nice phone, it's not a huge improvement over my old LG G4.

            I think I am gonna hold on to the Note 8 for 3 years before I get another.

          • @jasswolf: @haru: The 1080 Ti is a legitimate 4K/60fps card on the highest graphical settings, and it's been a long while since an Nvidia card didn't improve in raw processing by about 35-50% over its direct predecessor, with about a 20-25% gain from the old Ti to the initial superseding model.

            Ergo, the next xx80 will be at least a 4k/60fps card, while the xx80Ti will push towards 4k/120fps. Given the way people have been raving about the Volta and the recently released Titan chip, expectations might even be exceeded. Monitor manufacturers are currently scrambling to try and produce affordable 1440p/144Hz and 4k/120Hz screens so they can stay relevant in the upgrade cycle.
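
            Taking those per-generation figures at face value, the extrapolation is simple compounding (a sketch; the 60 fps baseline and the uplift range are the rough claims above, not benchmarks):

            ```python
            # How many generational uplifts to get from ~60 fps to 120 fps at 4K.
            baseline_fps = 60.0   # assumed 1080 Ti baseline at 4K max settings
            for uplift in (0.35, 0.50):
                fps, gens = baseline_fps, 0
                while fps < 120:
                    fps *= 1 + uplift
                    gens += 1
                print(f"+{uplift:.0%}/gen: ~{fps:.0f} fps after {gens} generation(s)")
            ```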

          • -2

            @jasswolf: @haru:

            You have to read the nested messages above to understand what I meant by my comment.
            The previous poster mentioned GTX2080, which is several years away.

            Have a read about Moore's Law, which I believe still holds:
            https://en.wikipedia.org/wiki/Moore%27s_law

            I am not saying that they have technology today that is light-years away; however, I am confident that they could produce a GPU today that would smash the 1080 Ti. But they are a business and will get us to spend continuously.

            Cheers

          • +2

            @jasswolf: @vinni9284: Moore himself has stated that this will have halted entirely by 2025 due to the physical constraints of the technology, but it's important to note that this was regarding a doubling of transistors on a circuit and not pure performance, especially in today's computing experience where we have to synchronise multi-core processors effectively.

            But for those of us that are buying from the retail shelf, it's already long slowed down. Intel's improvements in instructions per clock have been about 2-3% per generation for the better part of a decade, with only the number of cores and threads shifting upward.

            I've got an i5-760k from 2010 still here at home, and today's i7-8700k is one of the first CPUs that genuinely doubles it in single-core and quad-core performance (it thrashes it in multi-core, but we're talking 4 threads versus 12 here).

            Miniaturisation has improved, but cost and performance have been moving hand in hand for a long time, so we the consumer don't see much of that beyond GPUs, RAM capacity and increased power efficiency.

            They're currently struggling to produce 7nm chips, so without a radical change in chip design or a completely new technology emerging (and for the love of this discussion thread, do not suggest quantum computing), the handbrake is well and truly on unless you have millions of dollars, an air-conditioned warehouse and liquid nitrogen to play with.
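
            To put those 2-3% IPC bumps in perspective, they compound to surprisingly little over a decade (a quick sketch using the rough figures above):

            ```python
            # Cumulative single-core gain from small per-generation IPC bumps.
            gens = 8   # roughly "the better part of a decade" of yearly steps
            for rate in (0.02, 0.03):
                total = (1 + rate) ** gens - 1
                print(f"{rate:.0%}/gen over {gens} gens: ~{total:.0%} cumulative")
            ```

            Which is why most of the real-world doubling has come from extra cores and clocks, not IPC.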

          • @jasswolf:

            upgrade to 8k over time, at which point you're not even going to be able to spot the pixels on screen even if you press your face against it

            That's overkill and a waste of processing power. Generating that kind of resolution will require a lot of compromise from your gaming rig. And for what? You can't even see the difference.

            From the distance I sit, I can't even resolve individual pixels on a 1080p screen.

          • @jasswolf: @lostn: the gaming rig doesn't exist yet, nor does the texture support from the game developers.

            What I'm talking about is 8k starting to trickle into the mix in 2020, and consumers presumably buying televisions to last them 4+ years. I've also clearly stated that 8k is arguably the pinnacle of pretty much any use case you can come up with for a TV.

            At that point people are likely using multi-view, custom widgets and that sort of thing, maybe even a screenshare with users receiving different audio via wireless headsets. Don't get bogged down with the idea of someone standing 1 metre away from a wall, or someone using an 8k monitor (which would likely have to be at least 50" to present such a resolution optimally).

          • +1

            @jasswolf: @jasswolf: I can only see 8k being justified if it's a giant TV, or a head mounted display (e.g. VR).

            When your VR display is so sharp it has no screen-door effect and you can't see individual pixels, you've gone as far as you'll need to go in terms of resolution.

            I don't think 8k makes a lot of sense as a 55" TV unless you're sitting as far away from it as you would a 24" monitor.

          • @jasswolf: @lostn: It doesn't, but I'm talking about 50-75" modular 4k panels becoming a 100+" 8k TV, at least in the near future, and thus the indistinguishable pixels/near-life realism part is something of an exaggeration. :)

          • +1

            @jasswolf: @vinni9284:
            It's all a big conspiracy. The TRUTH IS OUT THERE!!!

          • @jasswolf: @jasswolf: Is that even possible? My phone battery starts suffering around the 1.5-year mark (as does my wife's), so I figured it's built-in obsolescence. Do phones still last 3+ years?

          • @jasswolf: @Ramrunner: the best way to maintain battery life is, when the phone is new, to let the OS profile your usage by going through a full charge before recharging. After doing that for a week or so (don't stress if you need to plug in at times for work, etc.), try to cap charging at 90% of the battery capacity to keep it lasting longer.

            There's a slew of apps that can help with that on iOS and Android, so have a google around for a more in-depth guide to familiarise yourself.

            Failing that being useful, just get a 3rd party battery replacement done, which shouldn't cost more than $50-$60 in total. Do some prior research on what kind of 3rd party battery options are out there and if they're OK and your repairer has them, use them.

            Otherwise, most repairers will have genuine batteries available to them. There's no planned obsolescence at work; it's just the nature of lithium-ion batteries.

          • +1

            @jasswolf: @Ramrunner: Normally if that happens to me, I hard reset the phone. This gives me the opportunity to really think through the apps I really need, and whether or not I really need all those files (pics, videos etc) on my phone as well, which definitely doesn't help in keeping the phone fast. These days I use Files Go by Google to get rid of duplicate files, and just keep my files etc in check.

        • Too soon

        • Too soon

        • +2

          Been waiting since 2008, what's another year or two?

      • Though if you need a top of the line tv today. This is a great deal!

        • -3

          Yup, but what I'm saying is that top of the line today vs top of the line in 3 years is going to be as dramatic as this year vs 10 years ago.

      • +2

        I’m a gamer with no money

        • -1

          Don't worry, you'll be able to rent a gaming box from the cloud by then, so just save for the fancy screen

          • @jasswolf: That’s a bit drastic.

            I’m afraid you are falling victim to the curve/3D marketing hype.

            They need to sell tvs.

            How much better of a picture do you want from top of the line 4K TVs?

            I think the differences will be so subtle you won’t know unless you’ve been sold marketing crap.

          • @jasswolf: @random101: Better raw picture quality? Not much.

            Better image processing? Absolutely.

            Better frame rates? Absolutely.

            Better running costs? Absolutely.

            The ability to affordably buy at least a 65" panel so I can properly appreciate 4k without sitting 1.5 m from the screen? Absolutely.

            The ability to upgrade my TV instead of completely replacing it? You bet.

        • Mo' money mo' problems.

      • +1

        But I want OLED NOW.

        • And you'll want microLED next year!!!

          • +6

            @jasswolf: Not at early-adopter prices.

          • @jasswolf: You don't say… ;)

          • +1

            @jasswolf: Might as well wait for nanoLED a couple years after if we're going to use this logic.

          • +1

            @jasswolf: @jasswolf:
            Why shut down his comment? I totally agree. MicroLED doesn't seem to have a great pixel density at the moment. What Samsung showed was a 146" 4k display. Don't you think they would've shown off a <100" 4k display if they could manage it, seeing as it was at a consumer electronics show? No doubt they will get smaller over time, but also wouldn't put it past Samsung to market it as something like nanoLED (as ArthurT85 mentioned) when it gets small enough. And if that's the case, I'd definitely wait for nanoLED, cause the 30ppi microLED display Samsung showed off doesn't cut it
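
            (For reference, the ~30 ppi figure falls straight out of the geometry of a 146" 4K panel:)

            ```python
            # Pixel density of a 146-inch 4K (3840x2160) panel.
            import math
            diagonal_px = math.hypot(3840, 2160)    # ~4406 px across the diagonal
            print(f"~{diagonal_px / 146:.0f} ppi")  # ~30 ppi
            ```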

          • @jasswolf: @mrdavedave: MicroLED is a technology, not a size. There are 6000 ppi MicroLED screens in prototype stages at some companies (not very useful, of course, but they exist).

          • @jasswolf: @mrdavedave:

            cause the 30ppi microLED display Samsung showed off doesn't cut it

            It depends how far you're sitting from it. If it's a 146" screen you're probably going to be sitting very far. There is a recommended viewing distance guide based on size and resolution of the display.
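
            One common rule of thumb says pixels blend together once they subtend less than about one arcminute for a 20/20 eye. A quick sketch of what that implies (the arcminute threshold is the assumption here; it's just one of several viewing-distance guides):

            ```python
            # Distance beyond which a 20/20 eye stops resolving individual pixels,
            # assuming a ~1 arcminute per pixel threshold (small-angle approximation).
            import math

            def blend_distance_m(diagonal_in, width_px=3840, height_px=2160):
                pixel_pitch_in = diagonal_in / math.hypot(width_px, height_px)
                one_arcmin_rad = math.radians(1 / 60)
                return pixel_pitch_in / one_arcmin_rad * 0.0254   # inches -> metres

            for size in (55, 146):
                print(f'{size}" 4K: pixels blend beyond ~{blend_distance_m(size):.1f} m')
            ```

            So a 146" 4K panel looks pixel-free from roughly 3 m back, which is about where you'd sit with a screen that size anyway.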

          • +1

            @jasswolf: This is an absolutely amazing TV for a crazy cheap price. You won't find better than this for video games. MicroLED will be 3+ times this price for a few years.

          • @jasswolf: MicroLED doesn't seem anywhere near as important as OLED is over LCD. With perfect blacks, HDR and high brightness there's not much more to gain.

          • @jasswolf: @CarbonTwelve: lifespan, elimination of burn-in (not as big a problem as some people claim, but still a long-term issue), energy usage and long-term manufacturing costs.

            So aside from peak brightness the picture quality is in the ball park, yes, but MicroLED is pretty much the way forward from OLED unless someone comes up with some extremely novel manufacturing techniques.

          • @jasswolf: @MrFunSocks: only with current console tech: not the 2019-2020 offerings, nor PCs. Seems kind of silly to buy a TV for that purpose only to want to replace it in 3 years.

            As I hinted at in the original comment, something like a 55N7 if it drops below $800 is a more comfortable buy as the kinds of TV panels I'm talking about are something you won't replace for 10 years, in theory.

          • @jasswolf: @jasswolf: 3 years is a long time, and this sort of money isn't bank breaking.

            Also you're setting yourself up for some massive disappointment if you think you'll get a 55"+ microLED TV for this price any time soon. I'd say it'll be longer than 3 years. Even then, OLED will still more than hold its own, since microLED is pretty much just aiming to replicate OLED but without the organic. microLED won't have blacker blacks than OLED or faster response times than OLED. It's just not organic. Have a read:

            https://www.digitaltrends.com/home-theater/microled-vs-oled-…

          • @jasswolf: @MrFunSocks: feel free to read the comment directly above my comment, where I stated what the difference is.

            Thanks for your estimates and enjoy your TV.

          • @jasswolf: @jasswolf: So yeah, like I said, pretty much no difference between MicroLED and OLED in terms of real-world use. Max brightness can be higher, and burn-in can be a potential problem on OLED; those are really the only differences. The difference is you can get an OLED now for $1600 instead of a MicroLED for 3x that in 3 years' time. Seems like a pretty easy decision to me.

            With technology you can always say "just wait x and you'll get a better y", that's how technology works. For now, OLED is the best available and affordable, and will be for the next few years.

            I will enjoy my TV too, thanks. Been enjoying it for a few weeks, it's incredible. Enjoy your MicroLED in 3 years, when I'm sure I'll enjoy it too because if they provide any significant advantage I'll buy one too :). In the meantime I'll enjoy the best TV technology around.

          • @jasswolf: @jasswolf: I agree that it seems to be the way forward, but the point I was making was that nobody who buys an OLED now is going to be looking at a MicroLED in 1/2/3 years time and wishing they had one of those instead.

          • @jasswolf: @CarbonTwelve: for someone who isn't settling for a 55" when they're sitting far back enough to warrant a 65" unit, and who only wants 60 fps performance? I agree, but I still worry about the blue OLED lifespan, and the image processing on the LG isn't as good as it could be.

      • +2

        Thanks for the info. I thought waiting for HDMI 2.1 would be worthwhile, but hadn't heard of microled and the plasma is kicking along fine.

      • +4

        MicroLED being available in two years is way too optimistic.

        • +1

          Samsung have announced they are launching an ultra-expensive MicroLED panel this year.

          https://www.theverge.com/circuitbreaker/2018/1/7/16861790/sa…

          • +1

            @jasswolf: Sony launched it in 2012

          • @jasswolf: @noise36: They demoed it then as a concept for a consumer device, yes, but Samsung have said Spring 2018 for what they demoed at CES this week.

            LG's faithful investment (and subsequent domination) in OLED is the reason why this has taken so long to surface as a consumer option.

      • +1

        That's exactly what I've done, gone for the Sony X9000E (though this price for the OLED kinda makes me regret not waiting now) and I'm gonna wait a few years til OLED technology gets better. I'm mainly on my PC anyways, I bought the TV to watch movies and Netflix just whenever in my room.

        But there's also a part of me that wants to somehow return or sell the (unopened) Sony TV and get this one (if they don't jack the price). Not sure if it's worth the effort though, the Sony is still a fine TV.

        • +1

          The X9000E is a high-end LCD panel… see if you can return it and grab this if you must have that level of image quality now. Otherwise, have a look at the HiSense N7, or wait for the H9E Plus that will come with Android TV and Dolby Vision.

          • +1

            @jasswolf: I agree it's on the higher end in terms of price, but it's still considered a mid range by most people in terms of performance compared to other TVs out there. I bought the X900E for almost the same price as what this might be, and I'm pretty much purely using it for movies and such, so I would prefer the OLED. I'm not sure on the return policy though, so I might have to wait for it to sell.

            If the discounted price is actually $1600 then I will do what I can to return and get it.

        • -1

          I have heard from owners in a previous post that the X9000E is better than the OLED for gaming. One poster said that they returned their C7 for this and couldn't be happier. I would keep the X9000E IMO.
          The holy grail for me TBH is the 49" X9000E, which is insanely difficult to find, and Sony wants a ridiculous amount of money for it.
          Cheers

          • +2

            @vinni9284: Funny you say that, since input lag is important for many games and Sony TVs have the worst input lag out of all the name brands. Also, OLEDs have a much faster response time than LCD, which helps with motion in gaming. I'm not really sure why anyone would think a Sony LCD is better than an LG OLED for gaming, that is unless they own a Sony.

          • @vinni9284: @Ryballs: Agreed. The 2016 and 2017 LG TVs are supposed to be outstanding with game mode turned on, and I believe both have decent HDR support with this as well.

          • @vinni9284: @Ryballs:

            I am going by the previous discussion TBH; however, I have the B7 and love it!
            No issues with gaming or burn-in/retention.

            I would still love to get the 49" X9000E for the other room, as OLEDs don't come in that size.

          • +1

            @vinni9284: @vinni9284:

            Yeah I find many people assume Sony makes the best TVs for gaming since they also make gaming consoles. Unfortunately that's simply not the case.

            Sony TVs do arguably have the best image processing, but that doesn't really benefit gaming, since the majority of processing is disabled while in game mode.

          • @vinni9284: Dreaming!

            The Sony TVs are great and bright, but the LG OLEDs are better TVs for gaming and everything else.

            https://www.rtings.com/tv/tools/compare/sony-x900e-vs-lg-b7a…

          • @vinni9284: @noise36:

            Well, I have the B7 and not the X9000E, and loving it for sure.
            Going by the testing results from your link, the performance difference between the two is not night & day. The Sony is not too bad for gaming in comparison.

            I was in a previous post a while ago when the OLEDs were worth $2800+ and the Sony was worth ~$2000, so the Sony was worth considering at the time.

            Now with the price of the B7, it's a no brainer!

            Cheers

      • +1

        You sure the 120Hz isn't for black frame insertion?

    • +1

      Sweet price. Just for anyone that is impatient, Video Pro can be really slow to process and send orders.
