I’ve done a ton of searching on this subject, and maybe I’m using the incorrect terminology, but I’m hoping someone can advise.
In short, I have an HDDVR and everything 1080i or 720p looks great. No issues.
I also have TVE and anything 720p looks outstanding @ 60fps.
My issue is with TVE content that is presumably 1080i: it comes into Channels @ 29.97fps but isn’t being treated as interlaced. Because of that, it doesn’t appear the deinterlace functionality is being triggered, and I’m left with terrible judder that my eye cannot unsee.
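For what it’s worth, one way to check how the stream is actually flagged (just a sketch; this assumes ffprobe is installed, and the IP/channel are the same placeholders as in my URL below):

ffprobe -v error -select_streams v:0 -show_entries stream=height,r_frame_rate,field_order -of default=noprint_wrappers=1 "http://X.X.X.X:8089/devices/ANY/channels/6000/stream.mpg?codec=copy&format=ts"

If field_order comes back as progressive instead of tt/bb, that would be consistent with the stream not being flagged interlaced, which would explain why no deinterlacing kicks in.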
Maybe this doesn’t affect everyone equally, but I’ve noticed that if I push the feed through VLC (http://X.X.X.X:8089/devices/ANY/channels/6000/stream.mpg?codec=copy&format=ts) and force-enable deinterlacing (linear), the experience is much better.
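In case anyone wants to reproduce that, this is roughly the command-line equivalent of what I did in the VLC GUI (IP is a placeholder, of course):

vlc "http://X.X.X.X:8089/devices/ANY/channels/6000/stream.mpg?codec=copy&format=ts" --deinterlace=1 --deinterlace-mode=linear

--deinterlace=1 forces deinterlacing on instead of letting VLC auto-detect, and linear is the mode I tested with.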
Is there a way to manually tell Channels DVR to turn on the deinterlace filter?
Thanks!