Anyone had success with an indoor antenna?

That sounds like 1080i. I get that all the time on my CBS channel, never on Fox. I don't think the pixelation is uncommon; I think it has more to do with the broadcaster's settings. Fox broadcasts in 720p, and football never pixelates there like it does on the 1080i stream from CBS.

100% untrue. Amplifiers should be used only after calculating your starting signal level less the loss for the entire environment. Antenna placement in the attic (unless you need to run an additional 100 ft of RG-6 because you live in a mansion) is not going to increase signal loss much at all. Even after calculating the loss for the RG-6 and your splitters, an amp may still not be necessary. I have an attic antenna that feeds a 4-way splitter. I lose about 2 dB to the RG-6, then 7 dB to the 4-way splitter. I then have a 10 dB pad, so my total loss is 19 dB.

It may sound a little crazy that I would add a 10 dB attenuator, but my signal is so strong that I was overdriving my HDHRs.
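The arithmetic above is just addition of losses in dB. Here's a minimal sketch of that budget; the antenna level and the "comfortable" target window are illustrative assumptions, not published HDHomeRun specs:

```python
# Rough link-budget sketch for the attic-antenna setup described above.
# Losses in dB subtract directly from the signal level in dBmV.

COAX_LOSS_DB = 2.0      # short attic RG-6 run, per the post
SPLITTER_LOSS_DB = 7.0  # typical 4-way splitter insertion loss
PAD_DB = 10.0           # fixed attenuator

def level_at_tuner(antenna_dbmv, pad_db=0.0):
    """Signal level at the tuner after cable, splitter, and optional pad."""
    return antenna_dbmv - COAX_LOSS_DB - SPLITTER_LOSS_DB - pad_db

# Hypothetical strong off-air level at the antenna terminals (assumed):
antenna = 25.0  # dBmV
print(level_at_tuner(antenna))          # 16.0 dBmV -- possibly hot enough to overdrive
print(level_at_tuner(antenna, PAD_DB))  # 6.0 dBmV -- after the 10 dB pad
```

Run the numbers first; whether you need an amp, a pad, or neither falls straight out of this subtraction.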

Your results may vary. When my antenna was in the attic (now on the roof), I was running 100 feet of high quality RG-6. I could see a BIG difference in reception with the amplifier compared to a straight run without the amplifier. Calculations are fine, actual results are better.

100 ft of RG-6 is going to drop you about 5 dB. You probably have loss elsewhere as well (splitters, etc.), and the loss from the cable tipped you over the edge. Math doesn't lie :slightly_smiling_face:. The point is that adding an amp should never be the rule of thumb. It can often cause more issues than it helps, depending on the situation.
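Cable loss scales roughly linearly with length, which is why a 100-110 ft run matters while a 14 ft run doesn't. A quick sketch, assuming a ballpark figure of about 5.6 dB per 100 ft for RG-6 at UHF frequencies (the exact number depends on the cable and channel):

```python
# RG-6 attenuation is roughly proportional to length at a given frequency.
# 5.6 dB per 100 ft is an assumed UHF ballpark, not a spec for any one cable.

LOSS_PER_100FT_DB = 5.6

def cable_loss_db(length_ft):
    """Approximate RG-6 loss in dB for a run of the given length in feet."""
    return LOSS_PER_100FT_DB * length_ft / 100.0

for ft in (14, 100, 110):
    print(f"{ft:>3} ft run: ~{cable_loss_db(ft):.1f} dB loss")
```

A 14 ft attic run loses well under 1 dB; the 110 ft run discussed above loses around 6 dB, which can be the difference for a marginal signal.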

Also, it's usually best to buy bulk RG-6 and terminate it yourself with compression fittings if you find that you have a huge excess of cable coiled up.

No splitters, bulk RG6 with compression fittings, no extra cable coiled up. It might be a little longer than 100 feet. Maybe 110 feet.

I am already on the edge. I am 47 miles from the transmitters, and my attic is about 35 to 40 feet above sea level.

I have been using an antenna amp for many years in two different locations, and have always found that it improves reception.

Bingo... a 5-6 dB drop on the RG-6 plus the distance = you need an amp. Not everyone does when adding an attic antenna; it's not a one-size-fits-all approach.

From my attic to my HD tuner is about 14 feet. No amp needed. Plus, I'm roughly 10 and 14 miles from the TV towers. I added a preamp when first testing out an antenna, and it dropped my TV signals to 70% and caused some odd pixelation. It then dawned on me that I had way too hot of a signal. So I removed it, reception went from 70% to 90%, and my pixelation issues went away. My other signals are pegged at 100%.

350 Mile Antenna LOL

I purchased this one about 4 years ago, put it in the attic to grab stations about 25-30 miles away:
It seemed to work pretty well, but it had trouble with a specific band of channels (Channel 10 out of Milwaukee). After about 2 years, it started to get kind of flaky, and after reading a bit, it appears the powered portion of it is super cheap and tends to fail.

I contacted Silicondust, and they said antennas like that will have trouble with those specific channels - more prone to interference (someone else here can probably explain why better than I can). They recommended this one:
and I purchased this booster that seemed to be good from reviews:

I put the antenna in the attic with the same mount from the old one (the new one didn't come with a mount), with a 25' run of coax, split it with a cheap splitter, and feed two 4-tuner HDHRs with it. Never did need that booster. It's been rock solid, even through storms summer and winter. In my scenario, the antenna does point through the roof slope, so at times through a foot or two of snow.

So all of a sudden yesterday, the pixelation went away completely on the WTAE 1.0 signal. They must have made some sort of change/fix on their end that resolved it for me.

I use a Mohu Arc with a Kitztech preamp. Amazing results.