Image quality on new Apple TV 4K

A few observations regarding Apple TV video problems:

  1. It seems the problems I was having (flickering on interlaced channels, intermittent judder on TVE) are not specific to the new Apple TV model, as I’ve reproduced them on the Apple TV HD on the same Samsung 4K TV.

  2. The flickering on interlaced channels seems to be caused by my Samsung TV’s Auto Motion Plus setting. I had it turned on but down to 0 to properly play 24p, but it seems even that is enough to cause issues. After turning it off, interlaced video looks as good as my Chromecast with hardware deinterlacing.

  3. The judder issue seems to happen intermittently, on TVE channels only, without using any experimental video drivers or deinterlacer (the latter won’t even work on an Apple TV HD). It does seem like it may be better on the experimental video driver with the last update, though as this only happens intermittently it may be a bit soon to say.

Just thought some here might be interested. The Chromecast still seems to win on video quality with my TV, since it can run with Auto Motion Plus enabled (which definitely improves 30fps TVE) without causing issues for interlaced content, but this at least solves some of the problems. I still have to figure out whether I'll get another 4K Apple TV or a Shield for my 4K TV (I'd like a device with more power than the Chromecast)…

I had been running the experimental deinterlacer for a while but yesterday I was seeing slight hitching. This time I didn’t see frequent frame drops in the stats like before, but I switched to the linear deinterlacer and the problem went away.

Of course it’s also possible I’m losing my mind and it’s all in my head. But I don’t think so.

I've been seeing the hiccup glitches recently too. Trying to figure out what's going on.

To help us help you, is there an explicit place in the “show stats” on Apple TV that indicates whether the content is interlaced or not? There are a couple of stats that may indicate this but I’m not 100% sure. Would be nice to be able to tell easily. Sorry if I’m missing something…

If you go to your recordings on the server website (i.e. DVR > Manage at your IP:8089), it will say under Options > View Details whether a recording is interlaced or progressive.
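If you have direct access to the recording file (e.g. on the DVR machine), ffprobe can also tell you. A minimal sketch, assuming ffprobe is installed; the script and paths here are just illustrative, not part of Channels:

```python
# Report whether a video file is interlaced, using ffprobe's field_order.
# "progressive" means progressive; values like "tt", "bb", "tb", "bt" mean
# interlaced. Some sources don't populate the field at all.
import subprocess
import sys

def field_order(path: str) -> str:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=field_order",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip() or "unknown"

if __name__ == "__main__":
    print(f"field_order={field_order(sys.argv[1])}")
```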

Right, I was just hoping to be able to tell from “show stats” as well to help troubleshoot quickly.

I've been seeing the slight glitching on camera shots that were pretty still. It's really noticeable since there isn't much motion. I can go back and replay and then I don't see it again.

I'm uploading a new TestFlight beta now with more improvements to the experimental video driver.

New beta good so far. My only issue is TVE feeds that are sent at 29.97 instead of 59.94. They take on that 3:2 pulldown look. CNN is a good example.
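For anyone wondering why 29.97 feeds pick up that look: film content shot at 23.976 fps is telecined into 29.97 by holding alternate frames for 3 fields and then 2, so motion advances unevenly. A toy illustration of the cadence (my own sketch, not how the app itself processes it):

```python
# 3:2 pulldown sketch: 23.976 fps film carried in a 29.97 fps (59.94 fields/s)
# signal. Alternate film frames are held for 3 fields instead of 2, which is
# the uneven motion ("judder") you notice on panning shots.
HOLD_PATTERN = [3, 2]  # the "3:2" cadence

def pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * HOLD_PATTERN[i % 2])
    return fields

if __name__ == "__main__":
    # 4 film frames -> 10 fields -> 5 interlaced video frames
    print(pulldown(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```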

I’m still seeing the issue on live tv on my Apple TV. It only occurs when I change the video driver to experimental.

Changing the Deinterlacing mode doesn't seem to affect it, just the experimental video driver. When I turn it back to default, the problem disappears.

Seems to be gone for me. Running all experimental modes now and using match content.

So for me it is MUCH better. I didn't realize how much I was relying on my TV's post processing to clean up the picture. I turned all of that stuff off and switched to the experimental video driver. Now TVE doesn't look like crap anymore, it's actually watchable.
I haven't looked for the microstutter issue yet; it takes longer for that to show, but in my quick testing I didn't see it. (That could have been my TV's post-processing getting in the way.)

I noticed that the combination of settings is important. Experimental deinterlace along with the experimental driver results in the ATV putting out 29.97 on 29.97 sources. But if you turn off the experimental driver, then the stats for nerds say that 29.97 is going out as 59.94. So it's a matter of whether you want the ATV or your TV to do the conversion.

So, I just got the new ATV4K and added Channels and an HDHomeRun 4K for OTA today. What should the ideal settings be in the Channels app settings? Appreciate the help!

My advice:
For Channels:
Run the TestFlight build
Set the deinterlacer and video driver to experimental

For ATV:
Set to 4K SDR
Match frame rate
Match dynamic range

For HDHR:
Check your signal strength on the HDHR; they tend not to like hot signals. I'm 12 miles from the towers and the signal reads 100%. I added 9 dB of attenuation and it now reads 97% on most channels (quick math on what 9 dB means below).

If you are far away from your towers then you probably don't have to fool with attenuation.
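For context, what 9 dB of attenuation does to the signal is plain power-ratio math, nothing specific to the HDHR; a quick sketch:

```python
# Attenuation in dB -> fraction of signal power that gets through.
# 9 dB cuts the power to roughly 1/8 of the original.
def db_to_power_ratio(db: float) -> float:
    return 10 ** (-db / 10)

if __name__ == "__main__":
    for db in (3, 6, 9, 12):
        ratio = db_to_power_ratio(db)
        print(f"{db:>2} dB -> {ratio:.3f}x power ({(1 - ratio) * 100:.0f}% reduction)")
```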

Thank you! Appreciate the guidance. Now going to set up my Harmony Elite Hub remote according to the tips from Channels.

Also, you may want to turn off motion smoothing on your TV - on my Samsung TV it caused logos/text to flicker, particularly on interlaced channels. It is a trade-off, though - some content may look better with motion smoothing, particularly 30fps TVE streams.

Yeah, I also found on my Sony Bravia that I had to turn off sharpness as well as all motion smoothing.
With the latest TestFlight and the experimental deinterlacer and video driver, even TVE looks awesome now.
The only issue I have is with some of the standard-def subchannels, where motion is a little stuttered. The only thing I watch them for is old westerns, so it doesn't bother me.

This is not true. If it looks washed out, then it is a setup/calibration issue. The Dolby Vision processing taking place in the AppleTV (and other certified Dolby devices) takes into account the various video modes like SDR, HDR, DV, and HLG, and was designed to remap these signals appropriately so they display properly; if set up well, it can look even better than standard SDR, HDR, etc.

This is why so many of these devices default to full-time DV processing. The manufacturers and Dolby already know this to be the case, but they are just not good at getting the word out to the public and touting this very good video processing, to say the least! I firmly believe this is why Sony has DV forced on at all times in their newer UHD Blu-ray line, like the X700/X800M2/X1100 players, when you click DV on in the menus. Sony are picture quality and video processing experts, as is widely known from their Reality Creation processing and their X1 line of chips. I doubt they would force the DV processing if they didn't know it was superior to their own.

There are even videos on Dolby's own YouTube channel comparing regular SDR to SDR mapped using their proprietary Dolby algorithms and processing. The increase in image quality is easily seen in their examples. I will have to dig up some links, but I am sure anyone can search their channel and find them.

In fact, many video processor owners and calibrators, like Kris Deering when setting up and calibrating the Lumagen Radiance Pro for his clients, set it up to map ALL sources to an HDR mode to keep things like JVC projectors from re-syncing when the video standard changes from HDR to SDR or back again. That re-sync takes forever on JVC projectors and is quite an annoyance, and it is also said to slightly increase image quality.

All it is doing is remapping the Rec.709 SDR color points into HDR's BT.2020 color gamut, and converting the power gamma 2.2 into HDR's absolute PQ "gamma". This is very easily done, and there are even calibration modes within SpectraCal's CalMAN calibration software to test and set up this very thing (Rec.709 and DCI-P3 color gamuts mapped into a BT.2020 gamut). UHD Blu-rays are actually mastered at DCI-P3 color points for the most part, which are then mapped into the BT.2020 color container. This is done for future-proofing purposes, so that when more equipment and video sources can make full use of BT.2020, it is already there within the existing specs, ready for them to utilize.
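To make the gamma side of that remap concrete, here is a simplified, luminance-only sketch: take an SDR code value, treat it as power gamma 2.2 with a 100-nit reference white (my assumption for illustration), and re-encode the resulting light level with the PQ (SMPTE ST 2084) curve. The Rec.709-to-BT.2020 primary conversion that a real DV/HDR pipeline also performs is left out:

```python
# Simplified SDR -> PQ mapping, luminance only.
# Assumptions: SDR is pure power gamma 2.2 with reference white at 100 nits.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def sdr_to_nits(code: float, peak_nits: float = 100.0, gamma: float = 2.2) -> float:
    """Relative SDR code value (0..1) -> absolute luminance in cd/m^2."""
    return peak_nits * (code ** gamma)

def pq_encode(nits: float) -> float:
    """Absolute luminance -> PQ signal value (0..1), per SMPTE ST 2084."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

if __name__ == "__main__":
    for code in (0.1, 0.5, 1.0):
        nits = sdr_to_nits(code)
        print(f"SDR {code:.1f} -> {nits:6.2f} nits -> PQ {pq_encode(nits):.3f}")
```

The point is that the conversion is deterministic: the same light levels the SDR signal asked for are simply described in PQ/BT.2020 terms.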

I am not sure where you get that it can damage your TV? If the TV is designed to play back HDR content, then sending it SDR remapped to Dolby Vision (or even forced HDR10) should not damage the TV in any way. The ONLY thing I can see happening is that some TVs, and especially projectors, display in a brighter mode when receiving HDR and DV, so this will decrease your light source's lifetime; as I said, with something like a projector, its lamp, LED, or laser light source would have to be changed more often if you take this approach. Is this what you mean by "damage your TV", @Maddox?

Of course, there is also nothing wrong with setting the AppleTV to SDR mode and checking the Match Frame Rate and Match Dynamic Range boxes either, but I just wanted to clear up the posted thought that using a setting to force DV or even HDR10 is inferior and could somehow damage your TV!

HDR/DV all-the-time mode causes a lot of unneeded mode switching, delays channel changes, and degrades the user experience when using Channels. Also, it does not look as good for SDR content, and the menu navigation is more washed out. Try it. It is true on multiple Vizio and Samsung 4K TVs I have experimented with. If you think your Sony is different, then post an example, possibly with pictures. We are just giving our solution for the best settings to use for the best picture and the best user experience when using the Channels app. When set to 4K SDR with auto mode switching set in the ATV settings, true HDR/DV content will automatically switch anyway.

EDIT: I did an example with pictures a couple years ago. It is here: Poor video quality/posterization on Apple TV Channels app