Image quality on new Apple TV 4K

To help us help you, is there an explicit place in the “show stats” on Apple TV that indicates whether the content is interlaced or not? There are a couple of stats that may indicate this but I’m not 100% sure. Would be nice to be able to tell easily. Sorry if I’m missing something…

If you go to your recordings on the server website (i.e., DVR > Manage at your IP:8089), it will say under Options > View Details whether a recording is interlaced or progressive.

Right, I was just hoping to be able to tell from “show stats” as well to help troubleshoot quickly.

I've been seeing slight glitching on camera shots that are pretty still. It's really noticeable since there isn't much motion. If I go back and replay, I don't see it again.

I'm uploading a new TestFlight beta now with more improvements to the experimental video driver.

The new beta is good so far. My only issue is TVE feeds that are sent at 29.97 instead of 59.94: they take on that 3:2 pulldown look. CNN is a good example.

I’m still seeing the issue on live TV on my Apple TV. It only occurs when I change the video driver to experimental.

Changing the deinterlacing mode doesn’t seem to affect it, just the experimental video driver. When I turn it back to default, the problem disappears.

Seems to be gone for me. Running all experimental modes now and using match content.

So for me it is MUCH better. I didn't realize how much I was relying on my TV's post-processing to clean up the picture. I turned all of that off and switched to the experimental video driver. Now TVE doesn't look like crap anymore; it's actually watchable.
I haven't looked for the microstutter issue yet; it takes longer to show up, but in my quick testing I didn't see it. (That could have been my TV's post-processing getting in the way.)

I noticed that the combination of settings is important. The experimental deinterlacer along with the experimental driver results in the ATV putting out 29.97 on 29.97 sources. But if you turn off the experimental driver, then the stats for nerds say that 29.97 is going out as 59.94. So it's a matter of whether you want the ATV or your TV to do the conversion.
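A toy sketch of the difference described above (this is not Channels' or tvOS's actual pipeline, just an illustration): either the 29.97 cadence is passed through untouched so your TV does the conversion, or each frame is repeated so the TV simply receives a 59.94 signal.

```python
# Illustration of the two output paths: passthrough vs. frame doubling.
# `frames` stands in for decoded 29.97fps video frames.

def frames_out(frames, double=False):
    """Return the output frame sequence; double=True repeats each frame (29.97 -> 59.94)."""
    if not double:
        return list(frames)          # passthrough: the TV sees the native cadence
    out = []
    for f in frames:
        out.extend([f, f])           # each source frame is shown twice
    return out

src = ["A", "B", "C"]                                    # 29.97fps source
assert frames_out(src) == ["A", "B", "C"]                # TV does the conversion
assert frames_out(src, double=True) == ["A", "A", "B", "B", "C", "C"]  # ATV doubles
```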

So, I just got the new ATV4K and added Channels and an HDHomeRun 4K OTA today. What should the ideal settings be in the Channels app? Appreciate the help!

My advice:
For channels:
Run the TestFlight build
Set the deinterlacer and video driver to experimental

For ATV:
Set to 4k SDR
Match frame rate
Match dynamic range

For HDHR:
Check your signal strength on the HDHR; they tend not to like hot signals. I'm 12 miles from the towers and the signal read 100%. I added 9 dB of attenuation and it now reads 97% on most channels.

If you are far away from your towers then you probably don't have to fool with attenuation.
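For anyone wondering what a pad like that actually does to the signal, here is a quick sketch of the decibel math (assuming an ideal resistive attenuator; real devices vary):

```python
# Convert an attenuation figure in dB to the fraction of signal power that passes.

def attenuation_power_ratio(db: float) -> float:
    """Fraction of input power surviving `db` decibels of attenuation."""
    return 10 ** (-db / 10)

# The 9 dB pad mentioned above passes only about 12.6% of the input power,
# which is often enough to tame an overdriven ("hot") tuner front end.
print(f"9 dB pad passes {attenuation_power_ratio(9):.1%} of input power")
```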

Thank you! Appreciate the guidance. Now I'm going to set up my Harmony Elite Hub remote according to the tips from Channels.

Also, you may want to turn off motion smoothing on your TV; on my Samsung TV it caused logos/text to flicker, particularly on interlaced channels. Though it is a trade-off: some content may look better with motion smoothing, particularly 30fps TVE streams.

Yeah, I also found on my Sony Bravia that I had to turn off sharpness as well as all motion smoothing.
With the latest TestFlight build and the experimental deinterlacer and video driver, even TVE looks awesome now.
The only issue I have is with some of the standard-def subchannels: motion is a little stuttered. The only thing I watch them for is old westerns, so it doesn't bother me.

This is not true. If it looks washed out, then it is a setup/calibration issue. The Dolby Vision processing taking place in the AppleTV (and other certified Dolby devices) takes into account the various video modes like SDR, HDR, DV and HLG, and was designed to remap these signals appropriately so they display properly; if set up well, it can look even better than standard SDR, HDR, etc.

This is why so many of these devices default to full-time DV processing. The manufacturers and Dolby already know this to be the case; they are just not good at getting the word out to the public and touting this very good video processing, to say the least! I firmly believe this is why Sony forces DV on at all times in their newer UHD Blu-ray line, like the X700/X800M2/X1100 players, once you enable DV in the menus. Sony are picture-quality and video-processing experts, as is widely known from their Reality Creation processing using their X1 line of chips. I doubt they would force the DV processing if they didn't know it was superior to their own.

There are even videos on Dolby's own YouTube channel comparing regular SDR to SDR mapped using Dolby's proprietary algorithms and processing. The increase in image quality is easily seen in their examples. I will have to dig up some links, but I am sure anyone can search their channel and find them.

In fact, many video-processor owners and calibrators, like Kris Deering when setting up and calibrating the Lumagen Radiance Pro for his clients, set it up to map ALL sources to an HDR mode to keep displays like JVC projectors from re-syncing whenever the video standard changes from HDR to SDR or back again. That re-sync takes forever on JVC projectors and is quite an annoyance, plus the mapping is said to slightly increase image quality.

All it is doing is remapping the Rec.709 SDR color points into HDR's BT.2020 color gamut, and converting the power gamma 2.2 into HDR's absolute PQ "gamma". This is very easily done, and there are even calibration workflows within SpectraCal's CalMAN calibration software to test and set up this very thing (Rec.709 and DCI-P3 color gamuts mapped into a BT.2020 container). UHD Blu-rays are actually mastered at DCI-P3 color points for the most part, which are then mapped into the BT.2020 color container. This is done for future-proofing, so that when more equipment and video sources can take full advantage of BT.2020, it is already there and within the existing specs.
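To make the two remapping steps concrete, here is a simplified sketch (not Dolby's actual algorithm): linear Rec.709 RGB converted to BT.2020 primaries using the standard coefficients from ITU-R BT.2087, and a display-light value in nits encoded to a SMPTE ST 2084 (PQ) signal level.

```python
# Rec.709 -> BT.2020 gamut conversion matrix (linear light), per ITU-R BT.2087.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_bt2020(rgb):
    """Convert one linear-light Rec.709 RGB triple into BT.2020 primaries."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in M_709_TO_2020)

def pq_encode(nits):
    """Inverse EOTF of SMPTE ST 2084: absolute luminance (cd/m^2) -> PQ signal in 0..1."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Pure Rec.709 red lands safely inside BT.2020 (all components non-negative),
# which is why the smaller gamut fits losslessly in the larger container.
red_2020 = rec709_to_bt2020((1.0, 0.0, 0.0))
# SDR reference white (100 nits) lands at roughly 0.508 on the PQ scale.
code = pq_encode(100)
```

Note that each matrix row sums to 1, so white maps to white: the conversion widens the container without shifting neutral colors.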

I am not sure where you get that it can damage your TV. If the TV is designed to play back HDR content, then sending it SDR remapped to Dolby Vision (or even forced HDR10) should not damage the TV in any way. The ONLY thing I can see happening is that some TVs, and especially projectors, display in a brighter mode when receiving HDR and DV, so this will decrease your light source's lifetime, particularly with something like a projector and its lamp, LED or laser light source, which would have to be replaced more often if you take this approach. Is this what you mean by "damage your TV", @Maddox?

Of course, there is also nothing wrong with setting the AppleTV to SDR mode and clicking Match Frame and Dynamic Range boxes either, but I just wanted to clear up the posted thought that using a setting to force DV or even HDR10 is inferior and could somehow damage your TV!

HDR/DV all-the-time mode causes a lot of unneeded mode switching, delays channel changes, and degrades the user experience when using Channels. Also, it does not look as good for SDR content, and the menu navigation is more washed out. Try it. It is true on multiple Vizio and Samsung 4K TVs I have experimented with. If you think your Sony is different, then post an example, possibly with pictures. We are just giving our solution for the best picture plus the best user experience when using the Channels app. When set to 4K SDR with auto mode switching enabled in the ATV settings, true HDR/DV content will automatically switch anyway.

EDIT: I did an example with pictures a couple years ago. It is here: Poor video quality/posterization on Apple TV Channels app

OLEDs will absolutely shorten their comparative lifespan running HDR 100% of the time, and it also hastens the (admittedly rare) burn-in.

LCDs - probably not a big deal, but I like to avoid manipulating the signal path if I can.

Every reputable article or review you can find out there will tell you never to run your system full-time in Dolby Vision or HDR. Not only is it driving your screen harder and brighter than it needs to, but upconverting SDR content to HDR results in incorrect mapping of video levels and color gamut. You may think it looks more vivid, but it's totally inaccurate video. It's like running your TV in Store Demo mode all the time: wow, it's bright and looks so impressive, but you soon realize that people are not that red and there is no detail in bright objects like clouds.

Watch this. https://www.youtube.com/watch?v=pNgwkK1iyzg

I am not sure where you are getting this from, but keeping it at HDR for all sources actually decreases mode switching, because the display no longer has to change between SDR and HDR/DV modes as each one comes in. It will just stay in HDR/DV. It will only change when it sees a frame-rate change, if you have Match Frame Rate set on the AppleTV.

I have tried and tested it on numerous occasions, and I still use it to this day. As I said, if it is washed out, that means you are sending this HDR/DV signal into your TV and the TV is not mapping it correctly: either it goes into its HDR mode but doesn't do what it is supposed to, or it simply never enters HDR BT.2020 mode when it sees the signal. It is not the AppleTV's fault; it is your TV's fault. I see this exact anomaly you're speaking of on your Samsung and Vizio on my Hisense HDTV, but when I do this on my LG OLED 65C8 the image is amazing, because it maps and presents the image as it is supposed to. In fact, I confirmed this again last night while watching the amazing series "The Chosen", which is only in HD, with my AppleTV set to output DV to my LG OLED. It was incredible, and you would have been hard pressed to tell it wasn't mastered in DV. My wife even commented more than once how great it looked and that it appeared "more real" to her, and she is one NOT to notice image quality or the upgrades I make, even when the improvements are obvious to me.

I do not have a Sony TV, so I'm not sure where you got that from either. Even so, do you really think posting pictures from a phone, compressed, uploaded to the interwebs, and then viewed on any of thousands of different displays could possibly show the nuances and image-quality differences?

I agree, and I said as much, but I don't consider that "damage"; I consider it what it is: shortening the lifespan. If that is considered "damage", then every time you play HDR or DV on your HDR/DV-capable set you'd have to say you're damaging it. :roll_eyes:

I already addressed the lifespan loss if you do this, so it is a personal choice whether you feel the increased image quality is worth the loss of usable lifetime. But I find that if your display or projector uses any sort of dynamic system, as my BenQ LK990 and Samsung LSP9T do, it only hits its peak for very short periods. The average picture level of HDR, and of most normal video, falls within the same parameters as SDR video (100 nits to maybe a max of 200). The levels above that are mostly for specular highlights, etc.

If you are seeing incorrect mapping of video levels and color gamut, then, as mentioned many times now, it would be the fault of your TV or something else in the chain. I know for a fact that the Dolby Vision algorithm and processing can and does map SDR Rec.709 and DCI-P3 colors into the BT.2020 color gamut correctly, as long as the settings and parameters are adhered to and applied properly.

As also mentioned, SpectraCal's CalMAN video-calibration software and all the others have test and calibration workflows to check and calibrate this very thing. Why do you think that is? Just about ALL UHD Blu-rays are mastered to DCI-P3 levels, as that is pretty much the peak of our displays' and systems' capabilities with current technology, BUT those color points are still mapped and packaged within a BT.2020 color container! If these smaller gamuts were not mapped correctly, as you and others are stating, then they too would map wrong and look wrong, as you claim is the case with the SDR Rec.709 gamut. This is simply not true.

Just a side note: I am Imaging Science Foundation (ISF) certified (see my avatar) and have been for many, many years, going back to the founders Kevin Miller and Joel Silver. I have been in the audio/video and broadcast TV and radio realm for 33+ years. I have installed and calibrated consumer, commercial and professional displays and projectors since 1996, going back to the old "3-gun" CRT projectors, which required many hours and even days to align and finally calibrate correctly. I also helped design, field, test, market, install, calibrate and support the CES and CEDIA award-winning video processors known as the TAW Rock, Rock+ and Rock Pro. This is just to point out that I am not an amateur guessing, "calibrating by eye", or being wowed by showroom-bright shiny colors and blown-out images. I am usually the one in Best Buy trying to teach them what good video is, haha!

I am also the one who discovered the now-prevalent HDFury Dolby Vision/LLDV solution, which allows Dolby Vision source video to display on otherwise "HDR10-only" displays and projectors, and which also allows for dynamic tone mapping and amazing HDR10-to-LLDV/DV imaging otherwise only obtainable with a good flat panel, a JVC projector, or an ultra-expensive video processor such as the Lumagen Radiance Pro, madVR PC or madVR Labs Envy. So I know what peak-white clipping in clouds and other scenarios is, and no, I DO NOT have it, nor any of the other anomalies you speak of, like red faces, because my system is set up correctly and calibrated as it should be.

Keep in mind, Vincent's video you posted is now almost four years old. The Dolby Vision algorithms and processing have improved a lot since then, and so has the source video feeding them. I could easily see the improvement when I installed and set up the new AppleTV that recently came out, compared to the prior generation. Another trusted and technical YouTuber, Andrew Poole of Home Theatre Engineering in Australia, also posted a similar video, but he has since contacted me regarding my LLDV solution and has posted a video about that as well. We are hoping to do a video together showing the merits of this.