De-interlacing quality regression?

I recently moved from a plasma TV to an OLED one, so my Apple TV (4K, 4th gen) is now driving a 4K display instead of a full HD one. I noticed that 1080i TV channels are not de-interlaced very well; something a bit odd appears to be going on, as the video stream seems to alternate between a nicer de-interlacing process and a cruder one that looks like simple line doubling.

The Channels app has “Deinterlacing mode” set to “Linear (60fps)”, and the Apple TV is set to output 4K SDR.

I suspect this isn’t new, just more visible on a 4K display than on an HD one, which is why it’s bothering me more now.

How does it look when you change to the Experimental deinterlacer?

Logos and text look very clean with Experimental. I’ll watch more to see whether there are any perceptible drawbacks, but at first sight it seems like a winner.

Thanks for the suggestion!