FastChannels - FAST Channels aggregator/manager

Just pulled v1.7.0, but I had an issue with 1.6.0 last night.
LG Channels are the problem:


2026-03-18 05:56:54,354 INFO app.worker: FastChannels worker v1.7.0 starting
2026-03-18 05:56:54,358 INFO app.worker: [lg-channels] Scrape job started
2026-03-18 05:56:56,900 ERROR app.scrapers.base: [lg-channels] GET https://api.lgchannels.com/api/v1.0/schedulelist failed: 500 Server Error:  for url: https://api.lgchannels.com/api/v1.0/schedulelist
2026-03-18 05:56:56,901 ERROR app.scrapers.lg_channels: [lg-channels] schedulelist request failed
2026-03-18 05:56:59,101 ERROR app.scrapers.base: [lg-channels] GET https://api.lgchannels.com/api/v1.0/schedulelist failed: 500 Server Error:  for url: https://api.lgchannels.com/api/v1.0/schedulelist
2026-03-18 05:56:59,101 ERROR app.scrapers.lg_channels: [lg-channels] schedulelist request failed
2026-03-18 05:56:59,278 INFO app.worker: [lg-channels] Scrape complete — 0 channels, 0 programs (4.9s)

If I use US only, then this is the log:

2026-03-18 06:00:36,671 INFO app.worker: [lg-channels] Scrape job started
2026-03-18 06:00:37,482 INFO app.scrapers.lg_channels: [lg-channels] 191 channels fetched
2026-03-18 06:00:37,491 INFO app.scrapers.lg_channels: [lg-channels] 1921 EPG entries fetched
2026-03-18 06:00:38,158 INFO app.worker: [lg-channels] Scrape complete — 191 channels, 1921 programs (1.5s)
2026-03-18 06:00:38,160 INFO app.routes.images: [images] pre-warm starting: 191 URLs — 191 already fresh, 0 to fetch (8 workers)
2026-03-18 06:00:38,161 INFO app.routes.images: [images] pre-warm done: 0 cached, 191 already fresh, 0 failed (of 191 total)

Got ya.. I suspect they're just geo-locked to US only. I'll play around with their API. If it is geo-locked, I'll probably have to remove the options to add other countries.

Edit → yep, their API rejects anything other than US. I'll hardcode US in the next version and remove the config for other regions.
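Purely for illustration, the planned fix could be as simple as ignoring the configured region and always sending US, since the API returns HTTP 500 for anything else. The function and constant names below are hypothetical, not actual FastChannels code:

```python
# Hypothetical sketch: force the LG Channels region to "US" regardless of
# what the user configured, since the API rejects everything else.
LG_SUPPORTED_REGIONS = {"US"}

def resolve_lg_region(configured: str) -> str:
    """Return the region code to send to the LG Channels API."""
    region = configured.strip().upper()
    if region not in LG_SUPPORTED_REGIONS:
        # Fall back to US instead of letting the API respond with a 500
        return "US"
    return region
```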

I think it would be useful to show when the last audit was done for each source.

I ran stream audit on several sources, but I don't see any flags on the Channels page.


With the jgomez docker for Plex there was a dedicated Gracenote feed. What's the correct way to get that in FastChannels too?

It takes literally 1 minute to load the Dashboard page. Is that normal, or do I have a problem?


Probably this stupid DB locking issue I can't seem to figure out. I assume you're on 1.7? Anything in the logs?

docker logs fastchannels -f

No, sorry, still on version 1.2. I tried to upgrade earlier in Portainer but that failed.

I will try again.

Works for me in Portainer
Stack

services:
  fastchannels:
    # GitHub home for this project: https://github.com/kineticman/FastChannels
    # Docker container home for this project: https://github.com/kineticman/FastChannels/pkgs/container/fastchannels
    image: ghcr.io/kineticman/fastchannels:latest
    container_name: fastchannels
    environment:
      # Set the timezone to your local timezone
      - TZ=America/Los_Angeles
    network_mode: "bridge"
    ports:
      - "5523:5523"
    restart: unless-stopped
    volumes:
      - db_data:/data
volumes:
  db_data:


Yes, that's what I'm doing. I have upgraded it this way before but today I'm getting error 500.
Not my lucky day.

maybe go nuclear and delete container/image and start fresh?

@KineticMan This is a pipe dream/wish... I understand you're busy with getting it all working and then the possible database switch... so this is a "put it out there" to see if this is even possible, at some time in the future...
Would it be possible to do a manual add of either an external stream or playlist?
I ask, as I had previously asked if you could add the LG_UK playlist, as there are 2 channels that I'm interested in getting... "autentic history" and "autentic_travel"...
I had also previously listed the source for the streams... so if the LG API is IP-locked, and I could either add the individual stream or even the playlist I found them on, I would appreciate it...

Actually, I just found this... would it be possible to see if you can scrape/add these:
https://www.autentic.com/117/Channels.htm#autentic_travel
https://www.autentic.com/117/Channels.htm#autentic_history
There are other listed channels so I don't know if you can get them all in one go?

Found the problem: I recently ran Watchtower to upgrade everything in my Docker setup, and that left behind old images that were taking up disk space. I cleared those up and the upgrade succeeded. :sweat_smile:

There's a prune option in Watchtower, BTW. I recommend turning that on; otherwise that leftover residue eats up unnecessary disk space:

WATCHTOWER_CLEANUP=true
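For reference, a minimal Watchtower service with cleanup enabled might look something like this (the container name and lack of a schedule here are assumptions; adjust to your setup):

```yaml
services:
  watchtower:
    image: containrrr/watchtower:latest
    container_name: watchtower
    environment:
      # Remove old images after updating containers
      - WATCHTOWER_CLEANUP=true
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    restart: unless-stopped
```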


Yea. I can only assume there needs to be some optimization done for this. I first ran this in Docker on a spare Pi 4B; that worked OK, but the CPU couldn't really keep up with it, especially on the first run. It maxed the CPU out for about 30 minutes before the web UI would even load anything.
Now I'm running it in a Hyper-V VM on an Intel i9-12900K with 8 cores assigned. It still maxes out the CPU for a bit, and the web UI takes a couple of minutes to come up on first load.

Question about the EPG-only sources Amazon Prime Free Channels & Sling Freestream:

I disabled these, as I'm not sure how their guide data could be applied to other sources if we're not sure the DUP channels are an exact duplicate. Does it try to match by channel name, or by the actual guide data (title, episode title, season and episode, original air date, etc.)?
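The thread doesn't confirm how FastChannels actually matches duplicates, but one plausible heuristic is normalized channel-name matching. A purely illustrative sketch (all function names here are hypothetical, not FastChannels code):

```python
import re

def normalize_name(name: str) -> str:
    """Lowercase and strip punctuation/whitespace so, e.g.,
    'ABC News Live' and 'abc-news live' compare equal.
    Hypothetical helper for illustration only."""
    return re.sub(r"[^a-z0-9]", "", name.lower())

def match_epg_to_channels(epg_channels, stream_channels):
    """Pair EPG-only channel names with stream channel names
    by normalized name; unmatched entries map to None."""
    by_name = {normalize_name(c): c for c in stream_channels}
    return {c: by_name.get(normalize_name(c)) for c in epg_channels}
```

Name matching is cheap but brittle (branding differs between providers), which is presumably why matching on actual guide data (titles, air dates) comes up as the alternative.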

I don't think it's my software; it's very lightweight. It's maybe a little heavy on initial setup, and opening Playwright on some scrapes can be heavy, but you have something else going on, sorry.

TBD on guide enrichment... it was an idea I had when designing it, but it's not well implemented. Ignore it for now.


Is it possible to backup all the settings by backing up the fastchannels.db file?
Or is there some way to export/import the settings?

I just don't want to have to start from scratch if something goes wrong. I've invested too much time in configuring sources, channels, and feeds already.
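If the settings do live in a single SQLite file, copying it while the app is writing can produce a corrupt backup; SQLite's online backup API avoids that. A hedged sketch (the path `/data/fastchannels.db` is an assumption based on the `db_data:/data` volume mapping in the compose file above):

```python
import sqlite3

def backup_settings(src_path: str, dest_path: str) -> None:
    """Safely copy a SQLite database using the online backup API
    (sqlite3.Connection.backup, available since Python 3.7),
    which is safe even if another process has the file open."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    with dest:
        src.backup(dest)
    dest.close()
    src.close()

# Example (paths are assumptions, not confirmed by the project docs):
# backup_settings("/data/fastchannels.db", "/data/fastchannels-backup.db")
```

Alternatively, stopping the container first and copying the whole `/data` volume should also give a consistent snapshot.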

I disagree. Other solutions take maybe 20 secs or so on the same hardware to fully initialize and only peak the CPU for a very brief time. They also take up less than 200MB of RAM; this pushes over 1GB easily (so says Portainer's info). My issue with the Pi may have been RAM-limited, as it was the 1GB model and there was the overhead of the Linux OS on it as well. But dirt-poor performance on the 12th-gen i9 is not good. I'm not going to load it up on my older mini PC that is my main Docker setup, a 2c/4t 7th-gen Intel i7 NUC.

This software, I understand, is not the same as the single-provider FAST dockers out there... this one has multiple sources, so I get that it has a much heavier workload and queue of things to do.
But having to wait several minutes just to bring up the admin UI, even on first run, seems a bit clunky.
Even Portainer hangs on "deploying stack" for a while as things fire up.

I'm just saying there may be a more refined way to do it that's worth considering at some point, to lessen the initial load and workflow.
Maybe don't have it go all-out guns blazing on first startup. Give it a fast initial startup that just gets the admin UI up, then give the user options there to trigger a full initial scrape of everything, or a custom initialization: let the user pick which sources to enable/disable first, then run the initialization scrapes.

I'd also suggest adding user-adjustable scrape intervals at some point. As I've previously mentioned, some of these sources need to be refreshed very often (I have all my current sources, via other methods, set to 1 hour in Channels). Some FAST sources only have a few hours of guide data, and a 1-day rescrape is not going to cut it.

On guide data: in my earlier comparisons, as I mentioned already, I'm seeing differences and some minor issues overall compared to the already-mature options in place (for the ones I use: Pluto, Plex, Samsung). Xumo is still pretty much unusable. I see comments that guide data is still being worked on, which is good.

Again, this is just my initial quick experience with your software.
It really is impressive what you've gotten done so far.
Looking forward to seeing it refined and the bugs worked out.