I've been using my HDHR with Windows Media Center on Windows 7 for quite a while now. I recently discovered Channels and I'm currently giving it a try. I really like the interface and the fact that I can consolidate all my TV watching on my Apple TV instead of switching between it and my Windows 7 box depending on what I want to watch. Right away, however, I've noticed a difference in video quality.
I've noticed that all channels seem to have video quality degradation when viewed through Channels on my Apple TV vs through Windows Media Center. I've set up a test environment where I record content in Windows Media Center and then watch the same content in Channels on my Apple TV. When viewing small text on channels that broadcast in 1080i, the text flickers and flashes through Channels on ATV. When I switch back to WMC and watch the same content, the text is rock solid and clear, with no flicker. Another example: when I watch football, the "scoreboard" at the bottom of the screen flickers pretty badly in Channels, but in WMC it is solid and very sharp. Again, this issue seems most prevalent with 1080i signals.
Using the same test environment with 720p signals, I've noticed pretty much no flickering, but things like logos and the football "scoreboard" seem softer and definitely not as sharp as content viewed in WMC. The first picture is from WMC while the second is from Channels on ATV.
You can see there is more noise around "Saints" on the picture from Channels on ATV.
I guess I'm just wondering what is causing this video degradation. Is it Channels or Apple TV? It's obviously not my HDHR or the signal quality, because I'm viewing the same signal/channel in WMC and the quality is really good. Is the video stream from the HDHR being changed at all when it's processed by Channels? Is it due to the hardware differences between WMC running on my computer and the Apple TV? I would like to make Channels work for me, so I'm definitely open to suggestions, but I don't think I want to sacrifice video quality to do so. Thanks much!