• out of stock

Sony Bravia KD43X7000E 43" 4K HDR LED TV - $639 Delivered @ homeonlinesuperstore eBay

1020
PREZZY

Good tv for the price (this just replaced a Hisense N6 that I bought and had issues with), especially for anyone that’s looking to get a 4K HDR TV for their PS4 Pro or Xbox One X without spending too much.

This is well and truly the cheapest price it has been so far.

Full specs here
www.sony.com.au/electronics/televisions/x7000e-series/specif...

49 inch model available for $793.60 www.ebay.com.au/itm/Sony-Bravia-49-KD49X7000E-4K-HDR-TV/2226... - thanks Noddy

Original 20% off at selected sellers eBay Post

Related Stores

eBay Australia
eBay Australia homeonlinesuperstore

Comments (closed)

  •  

    Outstanding price

  • +5 votes

    the 49 Inch is $800

  •  

    Crazy good price. Paid $920 for mine only a couple of months ago.

  •  

    50hz? How does it go with sports / fast motion?

    • -2 votes

      Standard TV broadcasts are only 25 fps, sometimes 50 fps. Higher than 50 Hz will make literally no difference on a TV.

      •  

        Except if you wanted to use it for games, makes a huge difference.

      • +9 votes

        120 Hz TVs do make a difference; you just don't know the logic behind it.

        Source media is typically encoded at 24 or 30fps.

        50 Hz is not evenly divisible by either 24 or 30 fps; this means that frames are either lost (dropped) or inserted (doubled up), and this makes the result feel unnatural.

        If you had a 60 Hz TV, then 30 fps plays just fine (evenly divisible), so each frame is shown for 2 refreshes. But a 24 fps source needs 6 frames inserted somewhere (interpolation).

        BUT at 120 Hz (and 240 Hz), both 30 and 24 fps sources divide evenly and produce smoother, more natural video.

        240 Hz TVs have benefits with 3D movies. I won't bother boring anyone even more; a Google search will yield a more educated explanation.

        So… the eyes do not perceive the increased refresh rate, but they do perceive the difference between an interpolated and non-interpolated result.
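        The divisibility argument above can be checked with a few lines of Python; a rough sketch (the refresh rates and frame rates are just the ones mentioned in this thread):

        ```python
        # For each (panel refresh rate, source frame rate) pair, check whether
        # each source frame can be held for a whole number of panel refreshes.
        # A non-integer result means frames must be dropped, repeated unevenly,
        # or interpolated.
        refresh_rates = [50, 60, 120, 240]
        frame_rates = [24, 25, 30]

        for hz in refresh_rates:
            for fps in frame_rates:
                repeats = hz / fps
                verdict = "even" if repeats.is_integer() else "uneven: pulldown/interpolation needed"
                print(f"{hz:>3} Hz / {fps} fps -> {repeats:>5.2f} refreshes per frame ({verdict})")
        ```

        Running this shows exactly the pattern claimed: 50 Hz divides evenly by none of the common source rates, 60 Hz only by 30 fps, while 120 Hz and 240 Hz divide evenly by both 24 and 30 fps.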

        • -1 vote

          Except most TVs that advertise higher Hz are marketing gimmicks… at least they were when I was looking into it 12 months ago.

        •  

          @Xizor: I don't understand. Which higher hz sets were these?

          How did you establish that it was a marketing gimmick?

          There is no question about it, 120 and 240hz TVs solve a real problem, no gimmick there.

        •  

          @iDroid: Scroll down to "ah, marketing". I'm not saying it applies to this TV, and I agree REAL refresh rates matter. That said, I am using a Hisense 65M7000, which punches well above its weight despite that. But real refresh rates and marketing gimmicks are 2 different things, and I saw a lot of sets that were marketing gimmicks. I don't know this Sony; I'm just saying not everything is as it seems, research is key. https://www.cnet.com/news/ultra-hd-4k-tv-refresh-rates/

        • +1 vote

          @iDroid: Turn it off and your picture will be better. It is a gimmick and gives what is called the "soap opera effect" (google it). The TV tries to smooth motion by "making up" extra in-between frames between the real frames. It makes everything look more like a video recording vs a film.

          iDroid is mixing this up with matching frame rates to refresh rate, where extra repeated frames are added, causing judder. Our television transmissions are 50 Hz interlaced (which the TV/player should convert to 25 Hz progressive frames). Media from Europe and Australia will be 25p, motion pictures are 24p (really 23.976p, but either works in the digital age), and US content is 30p (really 29.97). Your TV should be able to switch to these different scan rates automatically, and you will not see duplicated frames etc. The 120 Hz/240 Hz setting is separate and will function at various scan rates to match the various frame rates of the source. Your media player must also support changing its output scan rate to match the media rate. Blu-ray players, Kodi, MediaPortal, etc. should all support this. The end result is that with the 120 Hz setting off you will get the material as intended, in the original format, with no visual issues.

        •  

          @Major Mess: Not sure if you're agreeing or disagreeing with me.

          The only thing I'm saying is that both 120Hz and 240Hz are not gimmicks, they solve a real problem (even if that does not apply to our terrestrial transmissions).

          I don't know anyone who watches terrestrial transmissions - so I kind of discount that - especially since the source is so nasty, nothing your TV will do can help much :)

          Other frequencies? No comment unless I research the rationale behind them (which I have not done).

        •  

          @Major Mess:
          They do give a soap opera effect, but you can't call it a gimmick. Some people (myself included) much prefer watching a movie when it's well interpolated.

          You can't call it a gimmick, just an option that some people choose to use.

          Good TVs like my Sony and my previous LG let you choose the strength of the interpolation so you can get a balance between artifacts and smooth motion.

        •  

          @iDroid: Disagreeing. You are mixing two issues and making them one. 120/240 Hz has nothing to do with the source and display frame rate, and will adjust rates based on the source so as to still function. 120/240 is a separate feature that attempts to add frames based on the source frames before and after.

        •  

          @Major Mess: Not sure I am. See: https://thenextgalaxy.com/difference-between-120hz-and-240hz...

          To quote:

          The Role of Hertz in Television

          This is where we get to 120 vs 240hz. Hertz is what we use to measure how quickly your television can change from one picture to another. So if you are watching a film shot in 24 fps on a screen with only 1hz, you can only display one of those 24 frames every second, which isn’t very good. If you had a 24hz TV, everything would be perfect because your screen would be refreshing at the exact same rate as the frames.

          Except what happens when you watch a movie filmed in 30 fps? Now your 24hz TV is missing 6 frames for every second of film. But upgrading to a 30hz TV isn’t going to solve your problems because then you won’t be able to watch 24 fps movies. To explain why not, we’re going to move past 30hz to 60hz TVs.

          60hz is perfect for watching 30 fps movies. The screen is refreshing 60 times per second, meaning it is displaying each frame twice since 30 divides evenly into 60. But 24 doesn’t, it goes into 60 2.5 times. Interpolation is a process which fixes this problem by inserting copies of 6 frames into the film to bring it up to 30 fps. This kind of works, but the action gets funny when it is manipulated like this. It no longer looks natural.

          The solution to this problem is 120 and 240 hertz TVs. The reason these two refresh rates are perfect is that both of them are evenly divisible by both 24 and 30. This means that with either a 120 or a 240 hertz TV, you can watch both 24 and 30 fps movies without a problem.

        •  

          @iDroid: The problem with that article is that the author doesn't realise TVs can sync to 23.976 Hz, 24 Hz, 25 Hz, 48 Hz, 50 Hz, 60 Hz. The interpolation sits on top of this.

          So if watching a PAL progressive source, the TV will be syncing at 25 or 50 Hz. Turning on interpolation, it will then run at 100 or 200 Hz. So at 100 Hz the TV is interpolating an extra 3 made-up frames per source frame, based on the actual previous and next source frames.
          If watching 24p, the TV will sync at 24 or 48 Hz. Turning on interpolation, it will run at 96 or 192 Hz.

          These are two different functions.
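          The two-step model described above (sync to the source rate, then multiply for interpolation) can be sketched in Python; the function names here are made up purely for illustration:

          ```python
          def display_rate(source_fps, interpolation_factor=1):
              """Illustrative model of the claim above: the panel first syncs to
              the source frame rate, then motion interpolation multiplies that
              base rate, filling the gaps with generated frames."""
              return source_fps * interpolation_factor

          def generated_frames_per_source_frame(source_fps, panel_hz):
              """How many in-between frames must be generated per source frame
              when interpolating up to panel_hz (assumes panel_hz is an exact
              multiple of source_fps, as in the examples above)."""
              assert panel_hz % source_fps == 0
              return panel_hz // source_fps - 1

          # Figures from the comment above:
          print(display_rate(25, 4))                       # PAL 25p, 4x -> 100
          print(display_rate(24, 4))                       # 24p film, 4x -> 96
          print(generated_frames_per_source_frame(25, 100))  # 3 made-up frames
          print(generated_frames_per_source_frame(24, 96))   # 3 made-up frames
          ```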

        •  

          @Major Mess: Do you have some references you can cite? Every bit of research I do suggests otherwise.

          I'd like to see some evidence of which TVs dynamically change the screen's refresh rate based on the source material. I don't think I've ever seen or heard of such thing.

          I'm totally aware of the difference between interpolation and refresh rates, let's just ignore the interpolation for the sake of this discussion.

        •  

          @iDroid: Pretty much all TVs do it. A few of the cheaper brands may have missed out on 24p, but I cannot quote any; they all support 25, 30, 50 and 60 Hz. You are making an argument from ignorance. That is, you admit you don't have the knowledge, so you claim it must be so. This is a theist method of argument. Are you a Christian by any chance? (If so, remember your obligation under 1 Peter 3:15.)

        •  

          @Major Mess: Here are the relevant specs from Sony's page for this model.
          PICTURE (PROCESSING)
          Video signal support: 4096x2160p(24,50,60Hz), 3840x2160p(24,25,30,50,60Hz), 1080p(30,50,60Hz), 1080/24p, 1080i(50,60Hz), 720p(30,50,60Hz), 720/24p, 576p, 576i, 480p, 480i

          Notice all of the rates quoted. The TV will run at those rates when supplied the particular source fps over the HDMI input.

          It is the same with monitors. Have a look at your specs and you will most likely see high refresh rate inputs quoted for resolutions below your native panel resolution. Have a search on refresh rate overclocking. In my case I have a Dell UP2716D that is quoted at 60 Hz, but happily syncs to 80 Hz (which introduces judder artefacts when playing videos, due to the division problem you mentioned earlier).

          Have a bit of a read about Android boxes and custom firmware to get Kodi to auto-switch refresh rate. A lot of this information is discussed around that issue. Any of the home theatre forums should also have the info you are after.

        •  

          @Major Mess: Hmm, "Video signal support" != "dynamically changing screen refresh rate"

          Without more information I'd expect that means "It will work", and does not imply "screen refresh rate matches video source"

          Supporting a video signal at a given resolution and fps only means it will accept that signal and output something that resembles it on the screen. Any other assumptions are just that.

          Any other references to back up your thoughts?

          I can find at least 20 references that support what I'm saying without even trying. I've tried hard to find some evidence of what you're claiming, and it has yielded nothing.

        •  

          @Major Mess: Furthermore, the technologies that exist for dynamic refresh rates (e.g. FreeSync and G-Sync) match the video card's refresh rate, not the source material (https://en.wikipedia.org/wiki/Refresh_rate#Dynamic_refresh_r...).

          Some setups could possibly change the video card refresh rate based on source material, but that's not a typical configuration and would cause all sorts of headaches in many situations.

        •  

          @iDroid: What does FreeSync/G-Sync have to do with what we are talking about?

          I don't understand why broadcast fps would have nothing to do with the subject. A lot of media is available that was originally a broadcast: TV series etc.

          Here is how it works. I go to my media player and select a movie which is at 24p. The player sees the 24 fps and switches the output to 24p. The TV sees the 24 fps data and switches to 24 Hz. No extra frames needed, as we have the same display rate as the source signal.

          I now play a UK TV show. The player sees the 25 fps (or perhaps 50 fields interlaced, which is deinterlaced to 25 fps anyway) and outputs a 25 fps (or 50 fps with doubled frames) signal over HDMI. The TV sees the signal and adjusts to 25 Hz. Again, no requirement for any extra frames.

          etc, etc…..

          A bigger problem is audio and getting 6 channel working for non-bit-streaming sources.

          These are not "thoughts". That again is theist reasoning. So out with it: are you a Christian?

        •  

          @Major Mess: I get what you're saying, I'm claiming that it's not correct.

          As I said before, the only reference I can find to "switches to 24Hz" (which would be a "Dynamic Refresh Rate") is both G-Sync and FreeSync.

          Show me any reference that backs up your claim. I've offered 2 references that back up my claims.

          Take a look at that wiki page I linked above, it explains things pretty clearly and it explicitly contradicts what you're saying.

          You can say the same thing to me as many times as you like and I'll remain unconvinced unless you can either disprove my claims or prove yours. But so far what you're saying as far as I can see is pure conjecture.

          Just to make sure I'm clear - I understand exactly what you're saying, I just don't agree and neither does any reference material I can find.

        •  

          @Major Mess:

          These are not "thoughts". That again is theist reasoning. So out with it. Are you a christian?

          Err, no, not a Christian. I base my life on logic and reasoning. Not sure why that'd be relevant, though, or even what you're trying to say?

          And unless you can cite some reference, for me, they are "your thoughts" until proven differently.

        •  

          @iDroid: I am referring to your argument from ignorance. That is the theist thinking: "I can't think of an explanation, therefore God."

          You seem to believe the panels only run at one fixed frequency and the TV's software converts sources to that. Why do you believe this? What would be the point?

          There are still people who think component video is the same as RGBS, just because of the colour of the connectors.

          It is not my obligation to endlessly argue reality, and while I am happy to help, this is getting a bit ridiculous and I won't be replying anymore.

          EDIT:
          Here is a Google search where getting refresh rate changing has been an issue.
          https://www.google.com.au/search?dcr=0&source=hp&ei=u04FWvXL...

        • +1 vote

          @Major Mess: Home time guys. Thread's done lol.

        •  

          @Major Mess: I've linked a number of articles that support what I'm saying - that is not "I can't think of an explanation, therefore God" it's "Everything I read about this subject contradicts what you're saying" I've even linked to supporting articles.

          You seem to believe the panels only run at one fixed frequency and the TV's software converts sources to that. Why do you believe this? What would be the point?

          Maybe because it's cheaper to have the hardware include one clock and do the rest in software? Regardless of why, I'm still yet to see any evidence of what you're claiming.

          And of course I can think of an explanation - I've never said what you're claiming is impossible or I can't explain it. I've VERY clearly stated that I can not find any evidence and funnily enough you still have not provided any. If you're so correct, then it should be really simple to find some supporting evidence.

          Arrogantly saying "It's like this, just believe me, you're a theist because you don't agree, and therefore you are mistaken because of the way theist think" is not a strong argument.

          I base my understanding on reference material, experimentation, etc. Not the unsupported claims of a random forum post.

        •  

          @iDroid: Looks like you replied while I was editing above. Maybe you could go on the LibreELEC forums and let them know they are wasting their time working on software for a function that display devices don't support. Maybe you could also help the Kodi guys. They are all volunteers, and you stopping them wasting their time would be a good idea. There is also MediaPortal, who again do it for free, and it seems silly that you'd let them continue with the impossible.

        •  

          @Major Mess: I've looked over those forum threads and again absolutely none of them say anything about the hardware panel behaviour and the panel frequency.

          They're all about presenting the signal at different frequencies - you're jumping to the conclusion that the panel then changes to that frequency. OK, fine, believe that. I'll believe it when, and only when, I have some evidence. But right now every bit of reference material I find contradicts it - where is the reference material to support what you're saying?

          Let's face it, that's a pretty significant thing for a panel to do - something end users would be interested in. It's a MUCH bigger selling point than "240hz super motion oooooo eeerrr", yet nothing I can find supports this. You'd think marketing departments would be splashing this everywhere, but it's not even in the technical reference.

        •  

          @iDroid: What would be the point of changing the input frequency to get rid of Judder if the TV is going to deal with it as best it can, anyway?

        •  

          @Major Mess:

          Why would one need to ever do that since:

          Notice all of the rates quoted. The TV will run at those rates when supplied the particular source fps over the HDMI input.

          ??

    •  

      Has a 50 Hz mode, but it also operates at 60 Hz, and says as much in the specs.

  •  

    Great price, really hanging for a 75 inch though!

  •  

    Are you able to turn off the smooth motion effect, or whatever it's called, on these TVs?

  • +2 votes

    That's a very good price.

    One thing to keep in mind with a TV like this is that you're not going to be getting much benefit from HDR; the display simply doesn't get bright enough to really take advantage of content mastered for 1000+ cd/m^2. It'll display HDR content, but that's about it. It is reasonably colour accurate though, at least if you're looking at standard colour gamuts.

    •  

      Thanks for this info, much appreciated. What minimum screen size would you say is needed to get the benefits of HDR?

      •  

        It's not so much about screen size as it is about peak brightness, wide colour gamut support, local dimming, things like that. It varies wildly from TV to TV, so you'd have to check for reviews of the specific products you're interested in.

        Generally speaking, more expensive TVs have better screens or more features that enable a better HDR experience, such as full array local dimming, or "quantum dots", or OLED panels. Those TVs also generally tend to be larger, but a larger TV is not always a better TV.

    •  

      Gears of War 4 has a cool feature where you can split the display so that one half displays HDR and the other half doesn’t. I played around with this for about 10 minutes, and the difference was quite significant between HDR and SDR.

      •  

        Microsoft also just released a 4K HDR + spatial audio demo for the Xbox called "Insects", which allows you to toggle between 4K, HDR, and spatial audio and really compare the differences.

        Good to see from them, it can be hard to know what to look for otherwise.

  •  

    Would like to get a 65" for my new Xbox One X but don't know much about TVs. Anything decent in this current eBay sale for under $2K? The Hisense 65N7 looks popular but has been cheaper before. My current TV is a Hisense K700 series, which I think has a good picture, but no HDR.

  •  

    The PS4 Pro can't play 4K Ultra HD movies. If you want to get the most out of the screen, get a standalone player or an Xbox One X (Scorpio).

  • +1 vote

    The Hisense N7 is about twice this price, but probably a much better cheap TV. Apparently the X7000 panel is pretty cheap and nasty for a Sony. Decent Samsungs have been on special lately too.

  •  

    Anyone using as monitor for photos/videos editing?

  • +1 vote

    Is it good price for the TV or good tv for the price?

  • +1 vote

    If it's 50 Hz, then isn't it bad for gaming, e.g. consoles?

  •  

    So does anyone own this TV that can share their experience? It's damned near impossible to find a review!

    •  

      I own the 55" X7000D version, as do quite a few other OzBargainers. Extremely happy with the panel. We watch lots of Netflix and the 4K looks great. The OS for the 7000D version is Android and is fairly zippy and the apps available are great (Netflix, Stan, AnimeLab, network television etc.). We got the 55" for about $1080 via VideoPro. Apparently there's a different OS for the 7000E version. Would highly recommend the TV as an entry-level 4K television.

      •  

        That's a much better deal

        •  

          Yes, I've been following the 55" 7000E closely and it always seems to be about $1250-$1300, not really nearing the $1080-$1100 that the 7000D hit earlier in the year/late last year. I guess we got lucky with some Christmas/New Year sales, although perhaps the premium is justified as it is a newer model. That said, I suspect VideoPro and/or Sony will have some nice deals in Dec/Jan (like last/this year), so it might be worth holding out for those.

    •  

      I did a lot of digging, and I’m 95% sure the US model is the X720E. There’s a lot more info out there for this one.

  •  

    Does this have HDR10, or another format? Asking because the Xbox One X only does HDR10.

  •  

    how do these compare with the X800Es that went on sale for $800 from sonystore over Xmas last year

    •  

      The only difference I could see was that the 800 is an Android TV, whereas this one isn't. You can compare the two models on the Sony website for specs.

  •  

    Does this have Android?

  •  

    Has anyone who bought one had theirs posted yet? Taking a good while…

    •  

      I cancelled my order. The company hadn't got back to me at all when I asked about dispatch times over a week ago. It doesn't bode well if there are any issues at all with the TV.
