Better dynamic bitrate/quality setting? Maybe per channel?

I recently did a throughput test of my remote server streaming to 6 local web client windows. Server did fine, but looking at the stats of the individual channels got me thinking.

The 6 channels all had the following bitrates (in Mbps):

4.2
6.2
8.2
10.3
11.4
12.5

My web player server quality is set to 10, and I have my clients enforced to 8. I have both the horsepower and bandwidth at both the remote (server) and local side to handle even more streams, transcoded or in original format.

So with the web player at 10 Mbps, I am in theory needlessly upscaling 3 of the channels and needlessly compressing the other 3.

I avoided 'Original' when setting up channels because, back when I was using Plex, some of my local channels had audio sync issues. I think it was a combination of the video or audio codecs a channel was using and certain clients not handling them well. Setting Plex to transcode everything fixed it, which is why I landed on my current settings.

So I guess the first question is: was that problem unique to Plex, or has anyone seen the same issues requiring transcoding on specific channels with Channels DVR? I don't remember which channels they were, so I figured I would ask before testing every channel on every client to see if it is safe to switch to the original option. Plex's live TV functionality is borderline broken for remote use, and it isn't much better on the same LAN, so the problems could have been with their implementation. I also wonder whether there is a benefit to having the server upscale 480i and 720p content to 1080p so the client or TV doesn't have to?

In a perfect world, it would be awesome to select transcode settings per channel... something like an 'always transcode this channel' or 'always transcode this channel to X bitrate' setting. With the second option, client streaming limits could still override and downgrade the bitrate, but transcoding would be forced regardless.

The next best thing would be that, when setting a specific bitrate limit, content under that bitrate could optionally be passed through in its original form. That doesn't help if certain low-bitrate channels need to be transcoded to fix audio sync issues, but it would still be a good upgrade over the existing behavior, where streams get transcoded to higher bitrates than the original content.

I think you may be overthinking this due to your past experiences with Plex. I share my DVR with two family members who stream remotely and “Original” works just fine as long as you have adequate upload bandwidth to serve the streams. No audio sync issues ever.

I do agree the web player transcode options should be updated. Sometimes my local CBS TVE stream is slightly above 10Mbps and gets needlessly transcoded down to 10Mbps instead of being remuxed. One workaround if you’re watching on the same network as your DVR is to access the direct stream URL and play in VLC to bypass the web player transcoding options.

In terms of upscaling… I could be totally off on this but I don’t think your server upscales any stream even if it’s being transcoded. I assume that’s handled by the client and even the weakest clients shouldn’t have any issues upscaling a lower resolution stream to your TV’s native resolution. One of the devs can confirm this or correct me.

I’ll give it a try then and see if I get any complaints. That’s what was keeping me from just making the change and testing, especially with more people home for the holidays. Clients are all remote from the server.

With upscaling I was thinking quality, not the ability to do it. Certain devices upscale to 4K using better algorithms (I'm thinking of NVIDIA's AI upscaling), so the picture looks better.

That makes sense. I've heard good things about NVIDIA's upscaling, although I'm running mostly Apple TVs right now. Again, I don't know much about transcoding and upscaling since I run "Original" on everything, but I can at least tell you that skipping transcoding has caused zero issues for me over the last 1.5 years.

Are the channels you're referring to OTA channels? If so, web browsers are unable to play the MPEG-2 codec being broadcast, and we have to transcode it to H.264 to be compliant with the HLS spec (and so the browser can decode it).
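
That OTA behavior can be sketched as a simple codec check. This is an illustrative toy, not the actual server code; the helper name and the browser-safe codec set are assumptions for the example.

```python
# Sketch of the decision described above: a source codec that browsers
# can't decode natively (like OTA MPEG-2) must be transcoded to H.264
# before it can be served over HLS. Names here are hypothetical.

BROWSER_SAFE_VIDEO = {"h264"}  # web browsers generally cannot decode MPEG-2


def needs_transcode_for_browser(video_codec: str) -> bool:
    """True when the source codec must be converted to H.264 for web playback."""
    return video_codec.lower() not in BROWSER_SAFE_VIDEO


print(needs_transcode_for_browser("mpeg2video"))  # OTA broadcast source -> True
print(needs_transcode_for_browser("h264"))        # already browser-safe -> False
```

This is also why the web-player throughput test behaves differently from the native apps: in a browser, OTA streams always hit the transcoder regardless of bitrate settings.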

As @alai said, you shouldn't see any sync issues using Original from our apps. If you have the bandwidth to support it (or are streaming from home), it will give you the quickest experience.

This is how our apps work. For any setting other than "Original" you are setting the maximum bitrate. If the content is already H.264 and is under that bitrate, it won't transcode it at all. If your connection can't handle the maximum bitrate you have specified, it will lower it automatically based on what the current network conditions can handle.
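
The logic described above could be modeled roughly like this. It is a toy sketch with hypothetical function and parameter names, not the app's real code, and it simplifies the adaptive part (which reacts to live network conditions) down to a fixed ceiling.

```python
# Toy model of the behavior described above: any setting other than
# "Original" is a bitrate ceiling, not a target. H.264 content already
# under the ceiling passes through untouched.

def plan_stream(source_codec: str, source_mbps: float, max_mbps: float):
    """Return ('remux', rate) when the source can pass through untouched,
    or ('transcode', ceiling) when it must be re-encoded under the cap."""
    if source_codec == "h264" and source_mbps <= max_mbps:
        return ("remux", source_mbps)   # already compliant and under the cap
    # The ceiling is a maximum; with VBR encoding, actual usage can be far lower.
    return ("transcode", max_mbps)


print(plan_stream("h264", 6.2, 10.0))        # ('remux', 6.2)
print(plan_stream("h264", 12.5, 10.0))       # ('transcode', 10.0)
print(plan_stream("mpeg2video", 4.2, 10.0))  # ('transcode', 10.0) -- codec forces it
```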

@alai is correct here too. We don't upscale the resolution for any content.

Changed the setting to original earlier today and have not gotten any complaints so far. I only tested a couple of channels after I did.

The other answers clear things up - my throughput testing was done using the web player (easiest way to open 6 streams) and this is all OTA content, so that is why I was seeing lower bitrate content being converted up to the 10 Mbps setting.

Does deinterlacing still happen when not transcoding? Or does it rely on deinterlacing happening on the client side (or on the TV the client is feeding) at that point? I'm thinking deinterlacing is part of transcoding and wouldn't happen otherwise.

Deinterlacing can happen during transcoding, or at the client.

The encoders also use variable bitrates, so they should not produce 10 Mbps unless there's 10 Mbps worth of video content. If it's just a black screen, it will compress much better and not use 10 Mbps for that part of the video.
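
A toy illustration of that capped variable-bitrate idea, with made-up per-segment numbers: each segment spends only what the picture needs, up to the cap, so a mostly simple stream averages well under the 10 Mbps label.

```python
# Illustrative VBR-under-a-cap model. The segment figures are invented;
# the point is that the cap only bites on complex scenes.

CAP_MBPS = 10.0


def segment_bitrate(needed_mbps: float, cap: float = CAP_MBPS) -> float:
    """Spend what the scene needs, never exceeding the configured ceiling."""
    return min(needed_mbps, cap)


# Mbps each scene "wants": near-black frames compress to almost nothing,
# while a busy action scene would exceed the cap and gets clamped.
segments = [0.4, 2.1, 12.8, 6.5, 0.3]
encoded = [segment_bitrate(s) for s in segments]
print(encoded)                      # [0.4, 2.1, 10.0, 6.5, 0.3]
print(sum(encoded) / len(encoded))  # average ~3.86 Mbps, far below the cap
```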

I didn't think of it that way - so when all of these say 10 Mbps, it's a maximum, and for live TV it won't approach that because of compression and because in some cases the source doesn't have nearly that much data. With recorded shows, will it potentially buffer further ahead and max out at 10 Mbps because it has future data to work from?