Auto framerate + yadif x2 deinterlacing

I know there has been some discussion in here about auto framerate and possible yadif x2 deinterlacing on the ATV 4K. I know you can turn SDR mode switching on and the Channels app will change the framerate based on the material. Unfortunately, it does not always work, and you have to turn it back on in Settings for it to work again. Is there any solution to this?
It would be great if it were possible to choose deinterlacing methods other than blend and linear. In linear mode, motion is smooth, but logos flicker. In blend mode there is no flickering, but motion looks ghosted because of the 25 fps output. Any plans to implement other deinterlacing methods?
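For anyone who wants to see the trade-off on a desktop first: ffmpeg's pp filter has close equivalents of both modes, and yadif in 2x mode shows what full-motion output looks like. These filters are stand-ins for whatever Channels does internally, and the input filename is hypothetical.

```shell
# Hypothetical input; any 1080i MPEG-2 capture will do.
IN=recording-1080i.mpg

# Blend: averages the two fields. No flicker, but ghosting on motion,
# and the output stays at the frame rate (e.g. 25 fps).
ffplay -vf "pp=lb" "$IN"

# Linear interpolation: keeps one field and interpolates the other.
# Smooth motion, but thin lines and logos flicker.
ffplay -vf "pp=li" "$IN"

# yadif in send_field (2x) mode: one output frame per field, 50/60 fps.
ffplay -vf "yadif=mode=send_field" "$IN"
```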

We have experimented with other deinterlacing methods, but the ATV hardware is not fast enough to run them. YADIF is optimized for Intel chips and is not fast enough on ARM chips.

This is probably a bit of a stretch, but any chance you could implement more advanced deinterlacing methods server-side, as an option for Channels DVR users whose computers are fast enough to handle them?
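As a rough sketch of what such a server-side pass could look like (Channels DVR has no such option today, and the filenames are hypothetical), a single ffmpeg invocation can deinterlace a recording to full-motion progressive video once, so clients never have to:

```shell
# Deinterlace a 1080i recording to 50p/60p once on the server.
# send_field doubles the frame rate; deint=interlaced leaves any
# frames already flagged progressive untouched.
ffmpeg -i recording-1080i.mpg \
  -vf "yadif=mode=send_field:parity=auto:deint=interlaced" \
  -c:v libx264 -preset veryfast -crf 20 \
  -c:a copy \
  recording-deinterlaced.mkv
```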


Following back up on this: I know you mentioned that the Apple TV isn't fast enough, but I recently found out that MrMC actually implemented yadif 2x on the Apple TV 4K, and it looks fantastic. It streams 1080i MPEG-2 video from my HDHomeRun at full-motion 60 fps, with no jagged edges or shaking text/logos. Any chance you could look into this again? I have no desire to switch to MrMC because the Channels UI/experience as a whole blows it out of the water, but as it stands right now, MrMC is significantly better at deinterlacing content.


I think another interesting option here would be to have the DVR server post-process recordings to detect progressive frames in the interlaced sequence (similar to the Telecide filter from AviSynth), so that client devices can use that information to show full resolution for the applicable portions of recordings. Like commercial detection, this could be done entirely on the server (avoiding realtime work on the client device), and it would degrade gracefully if processing is not complete (you can still watch recordings before processing finishes).
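ffmpeg's idet filter does this kind of classification already, so a server-side pass along these lines could be prototyped with it (filename hypothetical):

```shell
# idet examines each frame and reports how many look interlaced (TFF/BFF)
# vs. progressive; the summary is printed to stderr at the end of the run.
# "-f null -" discards the decoded output, so this is analysis only.
ffmpeg -i recording-1080i.mpg -vf idet -an -f null - 2>&1 | grep Parsed_idet
```

A recording with a high progressive count in that report (e.g. telecined film inside a 1080i stream) would be a candidate for showing at full resolution.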

Out of curiosity, I did a little more digging into this and came across the following on the MrMC forums:

The best deinterlaced picture output quality for Broadcast TV is going to be found on the Apple TV 4K, Intel hardware, RPi's and the AMLogic S9XX boxes running LibreELEC Kodi Krypton.


TIP time: with MrMC on the Apple TV 4K, when TV is streaming make sure you go into:
Settings > Video > Deinterlaced Method > YADIF2x
Then "Set as default for all Media" down the bottom.

The Shield with MrMC will likely struggle with that setting and 1080i H264 TV, due to having an older slower CPU package vs the ATV 4K.

I'd guess this means that yadif 2x is feasible on the Apple TV 4K (assuming Channels doesn't need much more CPU headroom than MrMC does), maybe on the SHIELD (except for MPEG-4 1080i recordings), and possibly on the fastest Amazon Fire TV devices (assuming Geekbench results are representative of the capability needed here).

When I tried yadif 2x on the ATV 4K with MPEG-2 1080i, there was a lot of stuttering and a lot of dropped frames.

Has anyone tried playing one of the HDHomeRun 1080i recordings in MrMC with yadif 2x? I would be surprised if it works.

I wonder if bwdif would be fast enough?

I tested bwdif too, and it's just as slow.

For what it's worth, it appears that at least in Comcast markets, this will soon be moot, as they are in the process of converting local channels to 720p MPEG-4:

(Comcast has already been providing all non-locally-originated HD channels in 720p MPEG-4 for a while now, in every market.)