I just added a Quadro M2000 to my Windows server (which is remote from my network) and switched from software to hardware transcoding.
Now I can have more streams transcoding than I have upload bandwidth for on the server.
Next, I decided to switch the deinterlacer from Blend to Linear. Usage on the GPU's video engine increased slightly (roughly 10%), but there was still plenty of headroom.
All testing, however, was done using web-browser player windows. When I tried my FireTVs (also remote from the server), most of the channels would not play back; I just get a blank screen with the buffering spinner. The status page in Channels says the transcoder is running.
The only channel I tested that actually played was a lower-quality subchannel of a local news station. According to Wikipedia (I am not sure how to find out what format the content actually is as it comes from the HDHR), their main channel (which exhibits the issue) is 720p, whereas the working subchannel is 480i.
A cursory lookup of the other non-working channels indicates they are also in 720p.
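Since I couldn't find a way to check the format directly, here is a sketch of how it could be verified instead of relying on Wikipedia: point ffprobe at the tuner's HTTP stream (HDHomeRun tuners serve channels at a URL like `http://<hdhr-ip>:5004/auto/v7.1`; the IP and channel number below are placeholders) and read the resolution and `field_order` out of the JSON it prints. The helper function is my own illustration, not anything from Channels:

```python
def describe_video(probe_json: dict) -> str:
    """Summarize the first video stream from `ffprobe -print_format json -show_streams` output."""
    video = next(s for s in probe_json["streams"] if s.get("codec_type") == "video")
    width, height = video["width"], video["height"]
    # ffprobe reports field_order "progressive" for progressive content;
    # values like "tt"/"bb"/"tb"/"bt" indicate interlaced field orders.
    interlaced = video.get("field_order", "unknown") not in ("progressive", "unknown")
    return f"{width}x{height}{'i' if interlaced else 'p'}"

# To capture the JSON from a live tuner (hypothetical IP and channel,
# adjust for your HDHomeRun and lineup):
#   ffprobe -v quiet -print_format json -show_streams http://192.168.1.100:5004/auto/v7.1
```

Feeding that JSON through the helper should print something like `1280x720p` for the broken channels and `704x480i` for the working subchannel, if the Wikipedia listings are right.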
So why would setting the deinterlacer to Linear ONLY affect channels serving progressive content, and not affect the interlaced ones?
After switching back to Blend, everything played back fine.
Are there any known issues with the FireTV and the deinterlacer? I would think the stream sent to the client would be the same end result regardless; it is just how the content gets processed along the way that differs?