Image quality on new Apple TV 4K

The new beta is good so far. My only issue is TVE feeds that are sent at 29.97 instead of 59.94; they take on that 3:2 pulldown look. CNN is a good example.
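
Just to illustrate what that cadence actually is (a toy sketch of my own, nothing to do with how Channels or the ATV schedules frames): film-rate content ends up being held for alternating 3 and 2 display refreshes, while a clean 29.97 feed on a 59.94 Hz output is a uniform 2:2 doubling.

```python
from itertools import groupby

def repeat_pattern(source_rate: int, display_rate: int, refreshes: int = 20) -> list[int]:
    """How many consecutive display refreshes each source frame is held for.
    Rates are given as 1000x numerators (23976, 29970, 59940) so the math stays exact."""
    frame_per_refresh = [r * source_rate // display_rate for r in range(refreshes)]
    return [len(list(g)) for _, g in groupby(frame_per_refresh)]

print(repeat_pattern(23976, 59940))  # film cadence -> [3, 2, 3, 2, ...] judder
print(repeat_pattern(29970, 59940))  # native 29.97 -> [2, 2, 2, ...] even frame doubling
```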

I’m still seeing the issue with live TV on my Apple TV. It only occurs when I change the video driver to experimental.

Changing the deinterlacing mode doesn’t seem to affect it, just the experimental video driver. When I turn it back to default, the problem disappears.

Seems to be gone for me. Running all experimental modes now and using match content.

So for me it is MUCH better. I didn't realize how much I was relying on my TV's post-processing to clean up the picture. I turned all of that stuff off and switched to the experimental video driver. Now TVE doesn't look like crap anymore; it's actually watchable.
I haven't looked for the microstutter issue yet; it takes longer for that to show, but in my quick testing I didn't see it. (That could have been my TV's post-processing getting in the way.)

I noticed that the combination of settings is important. The experimental deinterlacer along with the experimental driver results in the ATV putting out 29.97 on 29.97 sources. But if you turn off the experimental driver, then stats for nerds says that 29.97 is going out as 59.94. So it's a matter of whether you want the ATV or your TV to do the conversion.
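
For anyone wondering why the same source can leave the box at either rate, here's my simplified picture of it (I don't know what the experimental deinterlacer actually does internally): an interlaced 29.97 signal carries 59.94 fields per second, so a field-rate ("bob"-style) deinterlacer puts out 59.94 progressive frames, while a frame-rate ("weave"-style) one pairs the fields back up and stays at 29.97.

```python
# Toy model of the two common deinterlacing approaches (my simplification,
# not the actual Channels/tvOS implementation).

def bob(fields: list[str]) -> list[str]:
    """Field-rate deinterlace: every field becomes its own frame -> 59.94p out."""
    return [f"frame({field})" for field in fields]

def weave(fields: list[str]) -> list[str]:
    """Frame-rate deinterlace: pair top+bottom fields into one frame -> 29.97p out."""
    return [f"frame({top}+{bottom})" for top, bottom in zip(fields[0::2], fields[1::2])]

fields = [f"field{i}" for i in range(6)]   # three interlaced frames' worth of fields
print(len(bob(fields)), "frames out")      # 6 -> output runs at the field rate (59.94)
print(len(weave(fields)), "frames out")    # 3 -> output stays at the frame rate (29.97)
```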

So, I just got the new ATV 4K and added Channels and an HDHomeRun 4K OTA today. What should the ideal settings be in the Channels app's settings? Appreciate the help!

My advice:
For channels:
Run the TestFlight build
Set the deinterlacer and video driver to experimental

For ATV:
Set to 4K SDR
Match frame rate
Match dynamic range

For HDHR:
Check your signal strength on the HDHR; they tend not to like hot signals. I'm 12 miles from the towers and the signal reads 100%. I added 9 dB of attenuation and it now reads 97% on most channels. (One way to check tuner status from a computer is sketched below.)

If you are far from your towers, then you probably don't have to fool with attenuation.
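
Here's the quick check I mentioned above. It assumes you have SiliconDust's hdhomerun_config command-line tool installed on a machine on the same network; the fields in the output (ss = signal strength, snq = signal quality) are what my tuner reports, so yours may look a little different.

```python
import subprocess

DEVICE_ID = "FFFFFFFF"   # wildcard: first HDHomeRun found on the network
TUNER = "/tuner0/status"

status = subprocess.run(
    ["hdhomerun_config", DEVICE_ID, "get", TUNER],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(status)  # something like "ch=8vsb:605000000 lock=8vsb ss=97 snq=82 seq=100 ..."

# If ss pins at 100 on every channel and you're close to the towers, that's when
# a few dB of attenuation can help.
```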

Thank you! Appreciate the guidance. Now going to set up my Harmony Elite Hub remote according to the tips from Channels.

Also, you may want to turn off motion smoothing on your TV; on my Samsung TV it caused logos/text to flicker, particularly on interlaced channels. It is a trade-off, though: some content may look better with motion smoothing, particularly 30 fps TVE streams.

Yeah, I also found that on my Sony Bravia I had to turn off sharpness as well as all motion smoothing.
With the latest TestFlight build and the experimental deinterlacer and video driver, even TVE looks awesome now.
The only issue I have is with some of the standard-def subchannels, where motion is a little stuttered. The only thing I watch them for is old westerns, so it doesn't bother me.

This is not true. If it looks washed out, then it is a setup/calibration issue. The Dolby Vision processing taking place in the AppleTV (and other certified Dolby devices) takes into account the various video modes like SDR, HDR, DV and HLG, and was designed to remap these signals appropriately so they display properly; if set up well, it can look even better than standard SDR, HDR, etc.

This is why so many of these devices default to full-time DV processing. The manufacturers and Dolby already know this to be the case, but they are just not good at getting the word out to the public and touting this very good video processing, to say the least! I firmly believe this is why Sony forces DV on at all times in their newer UHD Blu-ray line, like the X700/X800M2/X1100 players, when you click DV on in the menus. Sony are picture quality and video processing experts, and this is widely known from their Reality Creation processing using their X1 line of chips. I doubt they would force the DV processing if they didn't know it was superior to theirs.

There are even videos on Dolby's own YouTube channel speaking of and comparing regular SDR to SDR mapped using their proprietary Dolby algorithms and processing. The increase in image quality is easily seen in their examples. I will have to dig up some links but I am sure anyone can search their channel and find them.

In fact, many video processor owners and calibrators, like Kris Deering when setting up and calibrating the Lumagen Radiance Pro for his clients, set it up to map ALL sources to an HDR mode to avoid things like JVC projectors re-syncing when the video standard changes from HDR to SDR or back again. This takes forever on JVC projectors and is quite an annoyance, plus it is said to slightly increase the image quality.

All it is doing is remapping the Rec.709 SDR color points into HDR's BT.2020 color gamut, and converting the power gamma 2.2 into HDR's absolute PQ "gamma". This is very easily done, and there are even calibration modes within SpectraCal's CalMAN calibration software to test and set up this very thing (Rec.709 and DCI-P3 color gamuts mapped into a BT.2020 gamut). UHD Blu-rays are actually mastered at DCI-P3 color points for the most part, which are then mapped into the BT.2020 color container. This is done for future-proofing purposes, so that when more equipment and video sources can make full use of BT.2020, it is there ready for them to utilize and already within the existing specs.
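
To make "all it is doing" concrete, here is a back-of-the-napkin sketch of that remap for a single pixel. It uses the published BT.709-to-BT.2020 primaries matrix (ITU-R BT.2087) and the SMPTE ST 2084 PQ curve; the choice of 100 nits for SDR reference white is my own assumption, and the real Dolby processing does considerably more than this.

```python
import numpy as np

SDR_WHITE_NITS = 100.0  # assumed SDR reference white

def sdr_to_linear_nits(rgb_709: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Gamma-2.2 Rec.709 SDR code values -> absolute linear light in nits."""
    return (rgb_709 ** gamma) * SDR_WHITE_NITS

# BT.709 -> BT.2020 primaries conversion in linear light (ITU-R BT.2087 matrix).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def pq_encode(nits: np.ndarray) -> np.ndarray:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute nits -> 0..1 PQ code value."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

pixel_709 = np.array([0.75, 0.75, 0.75])             # 75% gray in gamma-2.2 Rec.709
linear_nits = M_709_TO_2020 @ sdr_to_linear_nits(pixel_709)
print(pq_encode(linear_nits))                        # the PQ-coded BT.2020 value the display receives
```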

I am not sure where you get the idea that it can damage your TV. If the TV is designed to play back HDR content, then sending it SDR remapped to Dolby Vision (or even forced HDR10) should not damage the TV in any way. The ONLY thing I can see happening is that some TVs, and especially projectors, display in a brighter mode when receiving HDR and DV, so this will decrease your light source's lifetime, especially, as I said, with something like a projector and its lamp, LED or laser light source, which would have to be changed more often if you take this approach. Is this what you mean by "damage your TV", @Maddox?

Of course, there is also nothing wrong with setting the AppleTV to SDR mode and checking the Match Frame Rate and Match Dynamic Range boxes either, but I just wanted to clear up the posted thought that using a setting to force DV or even HDR10 is inferior and could somehow damage your TV!

HDR/DV all-the-time mode causes a lot of unneeded mode switching, delays channel changes, and degrades the user experience when using Channels. Also, it does not look as good for SDR content, and the menu navigation is more washed out. Try it. It is true on multiple Vizio and Samsung 4K TVs I have experimented with. If you think your Sony is different, then post an example, possibly with pictures. We are just giving our solution for the best setting to use for the best picture plus the best user experience when using the Channels app. When set to 4K SDR with auto mode switching set in the ATV settings, true HDR/DV content will automatically switch anyway.

EDIT: I did an example with pictures a couple years ago. It is here: Poor video quality/posterization on Apple TV Channels app

OLEDs will absolutely shorten their comparative lifespan running HDR 100% of the time, and it will also hasten the (admittedly rare) burn-in.

LCDs: probably not a big deal, but I like to avoid manipulating the signal path if I can.

Every reputable article or review you can find out there will tell you never to run your system full-time in Dolby Vision or HDR. Not only is it driving your screen harder and brighter than it needs to, but upconverting SDR content to HDR results in incorrect mapping of video levels and color gamut. You may think it looks more vivid, but it's totally inaccurate video. It's like running your TV in Store Demo mode all the time: wow, it's bright and looks so impressive, but you soon realize that people are not that red and there is no detail in bright objects like clouds.

Watch this. https://www.youtube.com/watch?v=pNgwkK1iyzg

I am not sure where you are getting this from, but keeping it at HDR for all sources actually decreases mode switching because now it doesn't have to change between SDR and HDR/DV modes when it receives each one. It will just stay in HDR/DV. It will only change when it sees a frame rate change, if you have that option set (Match Frame Rate) in the AppleTV.

I have tried it and tested it on numerous occasions, and I still do and use it to this day. As I said, if it is washed out, then that means you are sending this HDR/DV signal into your TV and the TV is not mapping it correctly (if it actually goes into its HDR mode and does what it is supposed to do), or it is simply not going into HDR BT.2020 mode when it sees this signal. It is not the AppleTV's fault; it is your TV's fault. I see this exact anomaly you're speaking of on your Samsung and Vizio on my Hisense HDTV, but when I do this on my LG OLED 65C8 the image is amazing, because it maps and presents the image as it is supposed to. In fact, I just confirmed this again last night while watching the amazing series "The Chosen", which is only in HD, with my AppleTV set to output DV to my LG OLED. It was incredible, and you would've been hard-pressed to tell it wasn't mastered in DV. My wife even commented more than once how great it looked and that it appeared "more real" to her, and she is one to NOT notice image quality and upgrades I may make, even where the improvements are obvious to me.

I do not have a Sony TV, so I'm not sure where you got that from either. Even so, do you really think posting pictures from a phone, compressed, uploaded to the interwebs and then viewed on any number of thousands of displays could possibly show the nuances and image quality differences?

I agree, and I said as much, but I don't consider that "damage"; I consider it what it is: shortening the lifespan. If that is considered "damage", then every time you play HDR or DV on your HDR/DV-capable set you'd have to say you're damaging it. :roll_eyes:

I already addressed the lifespan loss if you do this, so it is a personal choice whether you feel the increased image quality is worth the loss of lifetime usage. But I find that if your display or projector uses any sort of dynamic system, as my BenQ LK990 and Samsung LSP9T do, then it only hits its peak for a very short period of time. The average picture level of HDR, and of most normal video, is within the same parameters as SDR video (100 nits to maybe a max of 200). The levels above that are more for specular highlights, etc.

If you are seeing incorrect mapping of video levels and color gamut, then as mentioned many times now, it would be the fault of your TV or something else in the chain. I know for a fact that the Dolby Vision algorithm and processing can and does map SDR Rec709 and DCI-P3 colors into the BT2020 color gamut correctly, as long as the settings and parameters are adhered to and applied properly.

As mentioned also, SpectraCal's CalMAN video calibration software and all the others have test and calibration workflows to check and calibrate this very thing. Why do you think that is? Just about ALL UHD Blu-rays are mastered to DCI-P3 levels, as that is pretty much the peak of our displays' and systems' capabilities with current technology, BUT those color gamut points are still mapped and packaged within a BT.2020 color container! If these smaller color gamuts were not mapped correctly, as you and others are stating, then these too would not map right and would look wrong, as you are trying to say is the case with the SDR Rec.709 color gamut. This is simply not true.
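
Just to put a number on the "smaller gamut inside the larger container" idea, here is a trivial geometry check of my own using the published CIE 1931 xy chromaticities: all three Rec.709 primaries fall inside both the DCI-P3 (D65) and BT.2020 triangles, which is why a Rec.709 picture can be carried in those containers with room to spare. (Rec.709 blue happens to be the same point as P3 blue, so it lands exactly on that boundary.)

```python
# Toy geometry check using published CIE 1931 xy chromaticity coordinates.

REC709 = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}
P3_D65 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside(p, tri):
    a, b, c = tri
    s = [cross(a, b, p), cross(b, c, p), cross(c, a, p)]
    return all(v >= 0 for v in s) or all(v <= 0 for v in s)  # boundary counts as inside

for name, xy in REC709.items():
    print(f"Rec.709 {name} inside P3: {inside(xy, P3_D65)}, inside BT.2020: {inside(xy, BT2020)}")
# All six checks print True.
```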

Just a side note, I am Imaging Science Foundation (ISF) Certified (see my avatar) and have been for many, many years, going back to the founders Kevin Miller and Joel Silver. I have been in the audio/video and broadcast TV and radio realm for 33+ years. I have installed and calibrated consumer, commercial and professional displays and projectors since 1996 going back to the old "3 gun" CRT projectors which required many hours and even days to align and finally calibrate correctly. I also helped design, field, test, market, install, calibrate and support CES and CEDIA award winning video processors known as the TAW Rock, Rock+ and Rock Pro. This is just to show and point out that I am not an amateur just guessing and "calibrating by eye" or being wowed by showroom bright shiny colors and blown out images. I am usually the one in Best Buy trying to teach them what good video is, haha!

I am also the one who discovered the now prevalent HDFury Dolby Vision/LLDV solution that allows Dolby Vision source video to display on otherwise "HDR10 only" displays and projectors, which also allows for dynamic tone mapping and amazing HDR10-to-LLDV/DV imaging otherwise only gotten with a good flat panel, a JVC projector or an ultra-expensive video processor such as the Lumagen Radiance Pro, MadVR PC or MadVR Labs Envy. So I know what peak white clipping in clouds and other scenarios is, and no, I DO NOT have it, nor any of the other anomalies you speak of, like red faces, because my system is set up correctly and calibrated as it should be.

Keep in mind, Vincent's video that you posted is now almost 4 years old. The Dolby Vision algorithms and processing have improved a lot since then, and so has the source video feeding them. I was easily able to see the improvement when I installed and set up the new AppleTV that recently came out, as compared to the prior generation. Another trusted and technical YouTube poster, Andrew Poole of Home Theatre Engineering in Australia, also posted a similar video, but he has since contacted me regarding my LLDV solution and has posted a video in regard to that. We are hoping to do a video together showing the merits of this.

I’m sorry, but you are just wrong. I work for a major TV network and I know EXACTLY what our content should look like, what the levels should be and how the colors map on our scopes. SDR content mapped to HDR is just plain wrong, out of spec, and in no way matches the feeds we are sending out. The colors are false, and the video levels are no longer what is coming out of our studios. Again, brighter and more saturated is just that, but not accurate or a true representation of what was transmitted or streamed. But if you like how it looks, go ahead. Me, I want what I see at home to match the feed we send to the satellite.

I’m sorry, but I am not wrong. You’re kind of making my point for me.

What piece of equipment, software, etc. are you using to properly map said SDR to HDR for your scopes? Are you just shoving an SDR signal into an HDR scope or equipment? Are you using some sort of processing or algorithms to properly map that SDR into that HDR scope, as it should be?

It sounds like your system is designed end to end for SDR, so just throwing that into HDR without proper mapping is of course going to result in what you’re seeing. Dolby actually has broadcast solutions that you can feed ANY signal, be it SDR, HDR10 or HLG, and they output it PROPERLY MAPPED AND PROCESSED to present as good as or better than what was input.

This is exactly what I’m saying is happening to those who see it as wrong colors, washed out, or whatever else you want to call it. It is because somewhere in their video chain the signal isn’t being mapped properly, so it just results in SDR crammed or forced into something that is expecting HDR. If it’s done RIGHT, it is mapped properly into the HDR/DV realm FIRST, then presented to the display already mapped to what an HDR signal would give.

I ask again, why would CalMAN have these calibration options were it not for this very scenario? Why would Dolby have this function built into its algorithms and processing? Why would top ISF and THX calibrators like Kris Deering and many others set up high-end video processors to map their Rec.709 SDR input signals to HDR outputs if it resulted in the poor and incorrect images you portray here?

Because the average viewer with standard off-the-shelf TVs and streaming devices, using the presets that gear comes with, will not see a proper picture if they follow your advice. And they are the majority of people out there. For you to make these statements is misleading. Only the very, very few who have perfectly calibrated sets, streaming boxes using the latest codecs, and no equipment such as AVRs with their own conversions in the chain will “possibly” see a proper image. You are leading the rest into using modes that will result in an improper picture, possible damage to their OLED screens and shortened lifespans for their TVs. So why risk all that? Watch SDR as SDR and HDR as HDR, knowing that even with an uncalibrated set it’s going to reasonably match what the service is sending.

This is simply not true. You’re making excuses for manufacturers that short-change customers and produce inferior, out-of-spec equipment. If these sets and pieces of equipment met the specs as written, and properly mapped the way they’re supposed to according to those specs, then people would not see what you describe. It is because of poorly implemented engineering that people see what is being reported.

Plus, I’m not the one who said one way or the other was “wrong”. I was just pointing out that if done properly, this was NOT wrong, but neither is keeping it all native SDR, HDR, DV, as I pointed out.

We have some smart folks in the forums.
