We need Channels to match framerate

I have noticed motion issues with AppleTV/AndroidTV/FireTV on prime time content, both OTA and DVR'd. The mindset that everything is 60fps needs to go away. OTA is now broadcasting at native frame rates, and TVE content is broadcast at both 60fps and 30fps.

Below are a couple of images taken from OTA content via FireTV X-Ray.

CW playing at 1080i @48fps and Fox playing at 720p @24fps.

If you click View Details on the web UI for those recordings, what does it show for the fps?

It doesn't show the actual frame rate being sent. From the above images, the file details said 59.97fps for the 24fps image and 29.97fps for the 48fps image.

Those details are not as accurate as monitoring the actual content.

AFAIK USA ATSC broadcasts are only transmitted at 29.97fps or 59.94fps. My best guess is that either your TV or FTV is detecting the frame rate incorrectly.

You can try loading those recordings into various video software on a PC. I'm pretty sure they'll all say 29.97fps.

This may be true, but imported movies need to be matched IMO. Even Apple, after 10+ years, finally woke up and smelled the coffee.

Yeah, the devices are always wrong, but it reads other programs just fine. It is only wrong while watching prime time content. Sigh.

Try using MediaInfo on one of the recordings to see what it says the frame rate is.

Also try playing one with VLC and view Tools -> Codec Information to see the frame rate.

Movies & TV Performance shows 60FPS while playing.
VLC and MediaInfo show the packaged codec at 29.97, but neither shows live statistics. The streaming stats in VLC don't show the FPS.

I tried FRAPS, but it doesn't appear to work on videos.

Not sure what you expect. Video encoded at 29.97fps (actually 30/1.001) will be displayed during playback at 29.97 frames per second (actually 30/1.001), or a frame every 0.03336666... seconds.
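To make that timing concrete, here is a small sketch (plain Python, using exact ratios rather than the rounded 29.97 figure) of the arithmetic above:

```python
from fractions import Fraction

# "29.97fps" is really the exact ratio 30000/1001 (i.e. 30/1.001).
ntsc_fps = Fraction(30000, 1001)

# Frame period: how long each frame stays on screen.
frame_period = 1 / ntsc_fps   # = 1001/30000 seconds

print(float(ntsc_fps))        # ≈ 29.97002997 fps
print(float(frame_period))    # ≈ 0.0333666... seconds per frame
```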

What do I expect? I expect a video that is packaged at 30 not to play at 48fps on the FireTV, and a video that is packaged at 60 not to play at 24fps on the FireTV. Nor do I expect both to play at 60fps in the Windows Movies & TV player.

Here is tonight's recording on CW. The CW OTA tower sends a 1080i signal, but the file details say progressive! The tower also sends only a stereo audio signal, yet the details show a 5.1 track.

Could you try switching the FireTV to the software decoder and see if you get any better results?

It's hilarious watching Last Man Standing on CW: the show has the motion glitches, but the commercials are smooth as silk.

It’s a bummer the Sony motion smoothing screws up like this.

Yeah, not happy!

I'm a video engineer and just noticed this thread, and would like to clarify a few things. In the USA, all over-the-air or cable broadcasts are 59.94p progressive or 59.94i interlaced. The interlaced content is typically encoded at a 29.97 frame rate, but that is not the rate at which the content needs to be displayed. Inside each 29.97fps frame, you have an odd and an even field. Each field was originally recorded at the 59.94 rate and needs to be displayed at that rate to look correct.

If you simply play the original full frames at 29.97fps, you will see horrible judder/stutter because each field will stay on the screen longer than it should. When playing interlaced content, the player needs to set the HDMI format to either 59.94i or 59.94p (if onboard deinterlacing to progressive 59.94p is available on the playback device, like FireTV, AppleTV, etc.). Players that support proper deinterlacing will double the effective encoded frame rate by interpolating the missing field and then displaying the frames at the intended 59.94 rate.
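The field/frame relationship described above can be sketched in a few lines of Python (a toy model to show the rates and the line split, not any real deinterlacer):

```python
from fractions import Fraction

# Interlaced NTSC: frames are encoded at ~29.97fps, but each frame
# interleaves two fields captured ~1/59.94s apart.
frame_rate = Fraction(30000, 1001)   # ≈ 29.97 frames/sec
field_rate = frame_rate * 2          # = 60000/1001 ≈ 59.94 fields/sec

# Toy 4-line frame: even lines belong to one field, odd lines to the other.
frame = ["even0", "odd0", "even1", "odd1"]
field_a = frame[0::2]                # lines captured at time t
field_b = frame[1::2]                # lines captured at t + 1/59.94

# A bob-style deinterlacer turns each field into its own output frame
# (interpolating the missing lines), so output runs at the field rate.
print(float(field_rate))             # ≈ 59.94005994
print(field_a, field_b)              # ['even0', 'even1'] ['odd0', 'odd1']
```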


I have my Nvidia Shield set to 60fps and don't see any issues with any fps content.
I don't think the human eye can see the difference between 59.94 and 60fps. I have tried switching the Shield's output to 59.94 and cannot see any difference.
Perhaps the video processing/scalers etc. are compensating somehow.

There is also streaming content to consider. The frame rate reported on those is either 30fps or 60fps for certain things. So setting your output frame rate to match NTSC OTA TV then messes with streaming video, I guess.

Thank you, yes, this is what I see. I have to turn off Motionflow on my TV, which pulls the 24fps out of the 60fps stream to restore the frame rate it was filmed at.

Ideally you want the HDMI format to change to match the content being played. So for your streaming sources like Netflix it should switch to 23.976 for most of their programming. Playing a 59.940 source at 60.0 would produce a repeated frame every 16 or so seconds. Some people are bothered by this while some are not. Best place to notice it would be things like smooth camera pans or scrolling news tickers at the bottom of some stations.
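The "repeated frame every 16 or so seconds" figure falls out of simple rate arithmetic; here is a quick check in Python:

```python
from fractions import Fraction

source_rate = Fraction(60000, 1001)   # ≈ 59.94fps content
display_rate = Fraction(60, 1)        # fixed 60.0Hz HDMI output

# The display consumes slightly more frames per second than the source
# supplies, so one frame must be repeated periodically to stay in sync.
surplus = display_rate - source_rate  # = 60/1001 extra frames per second
seconds_per_repeat = 1 / surplus      # = 1001/60 seconds

print(float(seconds_per_repeat))      # ≈ 16.683 seconds between repeats
```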

Once you enable TV motion settings that do processing like frame interpolation or inverse telecine to recover 23.976 from a 59.94 source, all bets are off. None of those algorithms are perfect, so you may see occasional glitches; it varies by manufacturer. But if the source device is not sending the right data, then you're not going to get the best results regardless of the television. Garbage in, garbage out.


I am not using a TV but a computer monitor, so I have no "smooth motion" features to use.
On the TVs I have used that on, it is trash: super flickery, and it gives me eye strain and a headache in just a couple of minutes. No idea how folks can stand that.

I've also seen 140 and 240Hz "gaming" monitors. They also look terrible to me.