FastChannels - FAST Channels aggregator/manager

That's how I do it with Channels currently. Since it's one source, I have to filter out what I want from the list of all channels. Same as with TVE or HDHR: there's one master list of channels per source, and I have to go in, disable the channels I don't want, and mark as Fav the ones I want to see.

I can't really use this mass dump of multiple sources when a given channel is put into multiple categories that vary slightly in name. A certain kids channel (even animated shows that aren't for kids) might show up in Kids, or Family, or Kids and Family, or even Anime in some cases.

Also, I do like to just browse the master list, see what's available, and add channels as I see fit or feel like adding. The sources also add channels with different shows from time to time.

So what I want may change, or some sources have a channel others don't, which I don't even know about until I look at that source's master list.

What I do know 100% is that I do NOT ever want to see Sports, News, or non-English channels.
So, just saying, a way to globally disable channels from all sources that fall into those categories (or any others a user may want to disable) would be very welcome.
Or maybe it can already be done, just in a process or way I'm not understanding.

Sure, I can build a feed of the categories I want, but the result is cluttered with things I don't want to see and have to sort through. It looks like I can just leave the categories I don't want unchecked, but the resulting feed is still far too long, and I then have to go in and trim it, because each included category is full of hundreds of channels I don't want.
That's multiple layers and steps to filter the END product, versus doing it once at the source coming IN.
That's my logic, and how my brain works.
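In the meantime, filtering "at the source IN" could also be done as a small post-filter over the generated M3U before the player sees it. A minimal sketch, assuming the common two-line `#EXTINF ... / <stream URL>` playlist layout; the excluded category names are just examples, not FastChannels internals:

```python
import re

# Categories to drop globally (examples from the post above)
EXCLUDED = {"Sports", "News"}

def filter_m3u(text: str, excluded: set) -> str:
    """Return the playlist with entries whose group-title is excluded removed."""
    lines = text.splitlines()
    out, i = [], 0
    while i < len(lines):
        line = lines[i]
        if line.startswith("#EXTINF"):
            m = re.search(r'group-title="([^"]*)"', line)
            if m and m.group(1) in excluded:
                i += 2  # skip the #EXTINF line and its stream URL
                continue
        out.append(line)
        i += 1
    return "\n".join(out)
```

This does once, globally, what the per-feed category checkboxes do many times over.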

Again, just my initial input from a quick test of this. I know it's still a WIP and in active development.
Just trying to figure it out and see if/how I can use it in my workflow, to either replace the setup I have in place or supplement it.

The search ability IS very useful, though. I can search for a specific channel, see it in the table, and see which sources have it. That's very useful for comparing which source is "best".

Oh, and I'd hope you add an import/export way to back up and restore user settings and all of that.
I assume once things are ready for a more stable release.

The other issue is Channels' 750-channel limit. SLM gets around that through multiple custom M3U channel sources. I'm currently running this through SLM for that, but it would be nice to go direct.
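The SLM-style workaround amounts to splitting one master list into several playlists, each under the client's cap. A trivial sketch; the 750 figure comes from the post above, everything else is illustrative:

```python
# Split a channel list into playlist-sized chunks so each output M3U
# stays under a per-source client limit (750 per the discussion above).
def chunk(channels: list, size: int = 750):
    for start in range(0, len(channels), size):
        yield channels[start:start + size]
```

Each chunk would then be rendered as its own M3U and added as a separate custom source.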

I agree. However, it seems that we may not be getting the Gracenote channels identified properly.

I'm trying to set up two sources for providers that have both Gracenote and non-Gracenote (no-EPG) channels. My previous setup uses the jgomez177 containers for both Tubi and Plex, with Gracenote and no-EPG sources set up for each of those providers.

Unfortunately, I'm not seeing a correlation between my old channel lists and the new ones I am attempting to set up. For instance:

Tubi Gracenote channels: jgomez 132 / FAST 32
Tubi no-EPG channels: jgomez 35 / FAST 142

Plex Gracenote channels: jgomez 389 / FAST 0
Plex no-EPG channels: jgomez 292 / FAST 678

Am I missing something or approaching this the wrong way?

Make multiple custom feeds.

Y’all gotta stop thinking in terms of one master list. I know it’s completely different from what we’ve done before.

@KineticMan I, for one, like this new FastChannels setup, and I'm slowly setting it up to my liking...
I'm not using this with any TV/phone apps; my setup is that all my TVs have a computer connected, and I use Windows IPTV apps or containerized IPTV players to watch what I want...

Now, for my request... Let's say I want to set up 5 custom feeds: Animals and Nature, Travel and Food, Music and Ambiance, Comedy, and Space and History.
I know I can set them up via the Feeds tab and go through the categories to pick the corresponding ones. Then I'd have the 5 custom feeds to put in my IPTV players...

Question: Would it be possible to go to the Channels tab, go through the entire list of (currently ~4,000) channels, click to select multiple channels, and have a button that says "Add to (custom) Feed", where a list of the available Feeds pops up for you to pick the feed to add them to?

Part of my reasoning is this: take Travel and Food.
In the categories, there are 2 to pick from: Travel, and Travel & Lifestyle...
I don't want the "Lifestyle" part included in my custom feed... so if I could see those channels separately in the Channels tab, I could add my own listings without including whatever is in Lifestyle...
Hope this makes sense...
Edit 2: So when I go to the Channels tab, I see I can even narrow it down by picking a Category to look through, so I can pick the channels I want to add... even easier now...
Edit 3: I see that from the Channels tab, if I click to show the Category, I can disable channels so they don't appear in my custom feed. However, I don't want to disable the channels, as I may want to make another custom feed that includes the other channels...


Getting exceptions in the log, and no way to paste the full log (too big).
It started when it was doing its scheduled scrape for Stirr; here is that one:

log
2026-03-17T22:58:05.783101830Z 2026-03-17 15:58:05,782 INFO __main__: [scheduler] Enqueued stirr (interval=360m, age=361m)
2026-03-17T22:58:05.928286366Z 2026-03-17 15:58:05,927 INFO app.worker: FastChannels worker v1.4.0 starting
2026-03-17T22:58:05.938928075Z 2026-03-17 15:58:05,938 INFO app.worker: [stirr] Scrape job started
2026-03-17T22:58:06.693193684Z 2026-03-17 15:58:06,692 INFO app.scrapers.stirr: [stirr] 156 channels fetched
2026-03-17T22:58:08.048153727Z 2026-03-17 15:58:08,047 INFO __main__: [worker] pruned 1441 expired EPG entries
2026-03-17T22:58:42.332610930Z 2026-03-17 15:58:42,332 INFO app.scrapers.stirr: [stirr] 33015 total programs fetched
2026-03-17T22:58:57.577309182Z 2026-03-17 15:58:57,418 ERROR app.worker: [stirr] Scrape failed after 51.5s
2026-03-17T22:58:57.577905847Z Traceback (most recent call last):
2026-03-17T22:58:57.578045751Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1936, in _exec_single_context
2026-03-17T22:58:57.578126324Z     self.dialect.do_executemany(
2026-03-17T22:58:57.578183297Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 949, in do_executemany
2026-03-17T22:58:57.578246430Z     cursor.executemany(statement, parameters)
2026-03-17T22:58:57.578294419Z sqlite3.OperationalError: database is locked
2026-03-17T22:58:57.578353046Z 
2026-03-17T22:58:57.578401354Z The above exception was the direct cause of the following exception:
2026-03-17T22:58:57.578469307Z 
2026-03-17T22:58:57.578508306Z Traceback (most recent call last):
2026-03-17T22:58:57.578559655Z   File "/app/app/worker.py", line 126, in run_scraper
2026-03-17T22:58:57.578616579Z     _upsert_channels(source, channels)
2026-03-17T22:58:57.578686779Z   File "/app/app/worker.py", line 603, in _upsert_channels
2026-03-17T22:58:57.578750400Z     db.session.flush()
2026-03-17T22:58:57.578792923Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/scoping.py", line 924, in flush
2026-03-17T22:58:57.578854472Z     return self._proxied.flush(objects=objects)
2026-03-17T22:58:57.578907991Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.578946495Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4331, in flush
2026-03-17T22:58:57.579004611Z     self._flush(objects)
2026-03-17T22:58:57.579058619Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4466, in _flush
2026-03-17T22:58:57.579106605Z     with util.safe_reraise():
2026-03-17T22:58:57.579157135Z          ^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.579208296Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py", line 121, in __exit__
2026-03-17T22:58:57.579259080Z     raise exc_value.with_traceback(exc_tb)
2026-03-17T22:58:57.579312064Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4427, in _flush
2026-03-17T22:58:57.579372653Z     flush_context.execute()
2026-03-17T22:58:57.579428965Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/unitofwork.py", line 466, in execute
2026-03-17T22:58:57.579478291Z     rec.execute(self)
2026-03-17T22:58:57.579530374Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/unitofwork.py", line 642, in execute
2026-03-17T22:58:57.579590852Z     util.preloaded.orm_persistence.save_obj(
2026-03-17T22:58:57.579636464Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/persistence.py", line 85, in save_obj
2026-03-17T22:58:57.579705760Z     _emit_update_statements(
2026-03-17T22:58:57.579754262Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/persistence.py", line 912, in _emit_update_statements
2026-03-17T22:58:57.579803012Z     c = connection.execute(
2026-03-17T22:58:57.579918713Z         ^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.579964841Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
2026-03-17T22:58:57.580013867Z     return meth(
2026-03-17T22:58:57.580054160Z            ^^^^^
2026-03-17T22:58:57.580095965Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 527, in _execute_on_connection
2026-03-17T22:58:57.580146016Z     return connection._execute_clauseelement(
2026-03-17T22:58:57.580191125Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.580228894Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1641, in _execute_clauseelement
2026-03-17T22:58:57.580276329Z     ret = self._execute_context(
2026-03-17T22:58:57.580320845Z           ^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.580360447Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
2026-03-17T22:58:57.580407919Z     return self._exec_single_context(
2026-03-17T22:58:57.580447525Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.580484454Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
2026-03-17T22:58:57.580531640Z     self._handle_dbapi_exception(
2026-03-17T22:58:57.580574813Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2363, in _handle_dbapi_exception
2026-03-17T22:58:57.580622363Z     raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
2026-03-17T22:58:57.580677714Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1936, in _exec_single_context
2026-03-17T22:58:57.580724983Z     self.dialect.do_executemany(
2026-03-17T22:58:57.580764432Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 949, in do_executemany
2026-03-17T22:58:57.580816269Z     cursor.executemany(statement, parameters)
2026-03-17T22:58:57.580857302Z sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) database is locked
2026-03-17T22:58:57.580966039Z [SQL: UPDATE channels SET name=?, slug=?, updated_at=? WHERE channels.id = ?]
2026-03-17T22:58:57.581045071Z [parameters: [('News On 6 Tulsa OK', 'news-on-6-tulsa-ok', '2026-03-17 22:58:42.400222', 4141), ('News 9 Oklahoma City OK', 'news-9-oklahoma-city-ok', '2026-03-17 22:58:42.400241', 4142), ('ABC 5 Minneapolis-St. Paul MN', 'abc-5-minneapolis-st.-paul-mn', '2026-03-17 22:58:42.400245', 4143), ('KOB 4 Albuquerque NM', 'kob-4-albuquerque-nm', '2026-03-17 22:58:42.400247', 4144), ('NewsChannel 13 Albany NY', 'newschannel-13-albany-ny', '2026-03-17 22:58:42.400250', 4145), ('News10NBC Rochester NY', 'news10nbc-rochester-ny', '2026-03-17 22:58:42.400253', 4146), ('WDIO ABC News Duluth MN', 'wdio-abc-news-duluth-mn', '2026-03-17 22:58:42.400255', 4147), ('ABC 6 NEWS Minnesota & Iowa', 'abc-6-news-minnesota-&-iowa', '2026-03-17 22:58:42.400258', 4148)  ... displaying 10 of 34 total bound parameter sets ...  ('KSL-TV -5 Salt Lake City UT', 'ksl-tv--5-salt-lake-city-ut', '2026-03-17 22:58:42.400315', 4174), ('FOX 11 Green Bay WI 2', 'fox-11-green-bay-wi-2', '2026-03-17 22:58:42.400317', 4175)]]
2026-03-17T22:58:57.581196731Z (Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-03-17T22:58:57.624681799Z 2026-03-17 15:58:57,624 ERROR rq.worker: [Job bed81a6f-691c-47ec-a57a-6a0065345c12]: exception raised while executing (app.worker.run_scraper)
2026-03-17T22:58:57.624873323Z Traceback (most recent call last):
2026-03-17T22:58:57.624920825Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1936, in _exec_single_context
2026-03-17T22:58:57.625027424Z     self.dialect.do_executemany(
2026-03-17T22:58:57.625069369Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 949, in do_executemany
2026-03-17T22:58:57.625117779Z     cursor.executemany(statement, parameters)
2026-03-17T22:58:57.625160121Z sqlite3.OperationalError: database is locked
2026-03-17T22:58:57.625202118Z 
2026-03-17T22:58:57.625239087Z The above exception was the direct cause of the following exception:
2026-03-17T22:58:57.625284182Z 
2026-03-17T22:58:57.625320360Z Traceback (most recent call last):
2026-03-17T22:58:57.625360283Z   File "/app/app/worker.py", line 126, in run_scraper
2026-03-17T22:58:57.625403435Z     _upsert_channels(source, channels)
2026-03-17T22:58:57.625447466Z   File "/app/app/worker.py", line 603, in _upsert_channels
2026-03-17T22:58:57.625490839Z     db.session.flush()
2026-03-17T22:58:57.625533191Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/scoping.py", line 924, in flush
2026-03-17T22:58:57.625581253Z     return self._proxied.flush(objects=objects)
2026-03-17T22:58:57.625622250Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.625674634Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4331, in flush
2026-03-17T22:58:57.625722428Z     self._flush(objects)
2026-03-17T22:58:57.625766794Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4466, in _flush
2026-03-17T22:58:57.625813574Z     with util.safe_reraise():
2026-03-17T22:58:57.625852731Z          ^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.625889770Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/util/langhelpers.py", line 121, in __exit__
2026-03-17T22:58:57.625938246Z     raise exc_value.with_traceback(exc_tb)
2026-03-17T22:58:57.625980730Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 4427, in _flush
2026-03-17T22:58:57.626027300Z     flush_context.execute()
2026-03-17T22:58:57.626065983Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/unitofwork.py", line 466, in execute
2026-03-17T22:58:57.626152162Z     rec.execute(self)
2026-03-17T22:58:57.626192332Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/unitofwork.py", line 642, in execute
2026-03-17T22:58:57.626238488Z     util.preloaded.orm_persistence.save_obj(
2026-03-17T22:58:57.626279530Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/persistence.py", line 85, in save_obj
2026-03-17T22:58:57.626326927Z     _emit_update_statements(
2026-03-17T22:58:57.626368297Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/persistence.py", line 912, in _emit_update_statements
2026-03-17T22:58:57.626417359Z     c = connection.execute(
2026-03-17T22:58:57.626456698Z         ^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.626493648Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
2026-03-17T22:58:57.626548478Z     return meth(
2026-03-17T22:58:57.626606163Z            ^^^^^
2026-03-17T22:58:57.626650977Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 527, in _execute_on_connection
2026-03-17T22:58:57.626699354Z     return connection._execute_clauseelement(
2026-03-17T22:58:57.626739048Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.626782127Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1641, in _execute_clauseelement
2026-03-17T22:58:57.626828584Z     ret = self._execute_context(
2026-03-17T22:58:57.626868555Z           ^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.626905468Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
2026-03-17T22:58:57.626951216Z     return self._exec_single_context(
2026-03-17T22:58:57.626990027Z            ^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.627026937Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
2026-03-17T22:58:57.627073133Z     self._handle_dbapi_exception(
2026-03-17T22:58:57.627111599Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2363, in _handle_dbapi_exception
2026-03-17T22:58:57.627157642Z     raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
2026-03-17T22:58:57.627199942Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1936, in _exec_single_context
2026-03-17T22:58:57.627249300Z     self.dialect.do_executemany(
2026-03-17T22:58:57.627292668Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 949, in do_executemany
2026-03-17T22:58:57.627339065Z     cursor.executemany(statement, parameters)
2026-03-17T22:58:57.627378962Z sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) database is locked
2026-03-17T22:58:57.627421572Z [SQL: UPDATE channels SET name=?, slug=?, updated_at=? WHERE channels.id = ?]
2026-03-17T22:58:57.627469047Z [parameters: [('News On 6 Tulsa OK', 'news-on-6-tulsa-ok', '2026-03-17 22:58:42.400222', 4141), ('News 9 Oklahoma City OK', 'news-9-oklahoma-city-ok', '2026-03-17 22:58:42.400241', 4142), ('ABC 5 Minneapolis-St. Paul MN', 'abc-5-minneapolis-st.-paul-mn', '2026-03-17 22:58:42.400245', 4143), ('KOB 4 Albuquerque NM', 'kob-4-albuquerque-nm', '2026-03-17 22:58:42.400247', 4144), ('NewsChannel 13 Albany NY', 'newschannel-13-albany-ny', '2026-03-17 22:58:42.400250', 4145), ('News10NBC Rochester NY', 'news10nbc-rochester-ny', '2026-03-17 22:58:42.400253', 4146), ('WDIO ABC News Duluth MN', 'wdio-abc-news-duluth-mn', '2026-03-17 22:58:42.400255', 4147), ('ABC 6 NEWS Minnesota & Iowa', 'abc-6-news-minnesota-&-iowa', '2026-03-17 22:58:42.400258', 4148)  ... displaying 10 of 34 total bound parameter sets ...  ('KSL-TV -5 Salt Lake City UT', 'ksl-tv--5-salt-lake-city-ut', '2026-03-17 22:58:42.400315', 4174), ('FOX 11 Green Bay WI 2', 'fox-11-green-bay-wi-2', '2026-03-17 22:58:42.400317', 4175)]]
2026-03-17T22:58:57.627591360Z (Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-03-17T22:58:57.627636613Z 
2026-03-17T22:58:57.627680775Z During handling of the above exception, another exception occurred:
2026-03-17T22:58:57.627726284Z 
2026-03-17T22:58:57.627762831Z Traceback (most recent call last):
2026-03-17T22:58:57.627822664Z   File "/usr/local/lib/python3.12/site-packages/rq/worker.py", line 1430, in perform_job
2026-03-17T22:58:57.627872186Z     rv = job.perform()
2026-03-17T22:58:57.627917286Z          ^^^^^^^^^^^^^
2026-03-17T22:58:57.627954628Z   File "/usr/local/lib/python3.12/site-packages/rq/job.py", line 1280, in perform
2026-03-17T22:58:57.628001401Z     self._result = self._execute()
2026-03-17T22:58:57.628041754Z                    ^^^^^^^^^^^^^^^
2026-03-17T22:58:57.628080786Z   File "/usr/local/lib/python3.12/site-packages/rq/job.py", line 1317, in _execute
2026-03-17T22:58:57.628129445Z     result = self.func(*self.args, **self.kwargs)
2026-03-17T22:58:57.628173091Z              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.628210570Z   File "/app/app/worker.py", line 163, in run_scraper
2026-03-17T22:58:57.628253359Z     db.session.commit()
2026-03-17T22:58:57.628293403Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/scoping.py", line 597, in commit
2026-03-17T22:58:57.628341565Z     return self._proxied.commit()
2026-03-17T22:58:57.628381120Z            ^^^^^^^^^^^^^^^^^^^^^^
2026-03-17T22:58:57.628418704Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2030, in commit
2026-03-17T22:58:57.628465931Z     trans.commit(_to_root=True)
2026-03-17T22:58:57.628507949Z   File "<string>", line 2, in commit
2026-03-17T22:58:57.628551460Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/state_changes.py", line 101, in _go
2026-03-17T22:58:57.628600089Z     self._raise_for_prerequisite_state(fn.__name__, current_state)
2026-03-17T22:58:57.628652525Z   File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 971, in _raise_for_prerequisite_state
2026-03-17T22:58:57.628703062Z     raise sa_exc.PendingRollbackError(
2026-03-17T22:58:57.628749845Z sqlalchemy.exc.PendingRollbackError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (sqlite3.OperationalError) database is locked
2026-03-17T22:58:57.628838665Z [SQL: UPDATE channels SET name=?, slug=?, updated_at=? WHERE channels.id = ?]
2026-03-17T22:58:57.628885001Z [parameters: [('News On 6 Tulsa OK', 'news-on-6-tulsa-ok', '2026-03-17 22:58:42.400222', 4141), ('News 9 Oklahoma City OK', 'news-9-oklahoma-city-ok', '2026-03-17 22:58:42.400241', 4142), ('ABC 5 Minneapolis-St. Paul MN', 'abc-5-minneapolis-st.-paul-mn', '2026-03-17 22:58:42.400245', 4143), ('KOB 4 Albuquerque NM', 'kob-4-albuquerque-nm', '2026-03-17 22:58:42.400247', 4144), ('NewsChannel 13 Albany NY', 'newschannel-13-albany-ny', '2026-03-17 22:58:42.400250', 4145), ('News10NBC Rochester NY', 'news10nbc-rochester-ny', '2026-03-17 22:58:42.400253', 4146), ('WDIO ABC News Duluth MN', 'wdio-abc-news-duluth-mn', '2026-03-17 22:58:42.400255', 4147), ('ABC 6 NEWS Minnesota & Iowa', 'abc-6-news-minnesota-&-iowa', '2026-03-17 22:58:42.400258', 4148)  ... displaying 10 of 34 total bound parameter sets ...  ('KSL-TV -5 Salt Lake City UT', 'ksl-tv--5-salt-lake-city-ut', '2026-03-17 22:58:42.400315', 4174), ('FOX 11 Green Bay WI 2', 'fox-11-green-bay-wi-2', '2026-03-17 22:58:42.400317', 4175)]]
2026-03-17T22:58:57.628988728Z (Background on this error at: https://sqlalche.me/e/20/e3q8) (Background on this error at: https://sqlalche.me/e/20/7s2a)

And I'm getting more of these as I try to disable sources.
I figured I'd wait until the exceptions stop, then enable one source at a time to select channels from.

OK, I'm not seeing any more exceptions in the log, and I have all sources disabled except for one.
I think I'll wait for the new version before continuing.

How do you manually refresh guide data for a source?
Is that Scrape? That seems to also re-scrape all channels in the source.
I see guide data that only goes out 1 hour in some cases, and Channels then runs out.
Like in LocalNow.
Maybe it's a source thing, to only have guide data from the current hour to the next hour?
Looks like LocalNow needs a refresh at least every hour.

Yep, I've got some lingering bug with some workers holding up the DB and crashing. I wasn't ready, but I'm going to push the new version ASAP.


v1.6.0 — What's new

Bulk Enable/Disable — You can now enable or disable all channels matching your current filter in one click from the Channels admin page. Filter by source, category, language, search term, etc., then hit "Disable All" or "Enable All". The buttons only appear when a filter is active so you don't nuke everything by accident. A toast notification confirms how many channels were affected.

Channel list improvement — Channel name sorting is now properly case-insensitive (no more "ZEE5" sorting before "abc Family"). The article-stripping sort (ignoring "The", "A", "An") also applies correctly across mixed-case names now.
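The described sort behaves roughly like the following minimal sketch (the exact rules inside FastChannels may differ):

```python
import re

# Case-insensitive sort key that also ignores a leading article
# ("The", "A", "An"), per the changelog entry above.
_ARTICLE = re.compile(r"^(the|a|an)\s+", re.IGNORECASE)

def sort_key(name: str) -> str:
    return _ARTICLE.sub("", name.strip()).casefold()
```

With this key, "abc Family" sorts before "ZEE5", and "The Bob Ross Channel" files under B.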

Category cleanup — Because apparently "En Español", "español", "Spanish", "Spanish Language", and "latin" are all the same thing, I've normalized ~80 raw category variants down to 34 canonical labels. Channels now get categories like "Latino", "True Crime", "Sci-Fi", and "Home & DIY" instead of whatever mood the scraper was in that day. Existing users can run migration 009 to clean up old data, or just wait for re-scrape.

Amazon Prime Free — Fixed a pagination bug that was only returning 53 channels instead of the full 878.
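The usual shape of such a pagination fix is to keep requesting pages until a short page signals the end, instead of stopping after the first response. A hypothetical sketch; the endpoint, parameters, and page size are invented for illustration:

```python
import json
import urllib.request

def fetch_all(base_url: str, page_size: int = 100, fetch=None) -> list:
    """Collect items across all pages; a short page marks the last one."""
    # Default fetcher does a plain GET; tests can inject a fake instead.
    fetch = fetch or (lambda url: json.load(urllib.request.urlopen(url)))
    items, page = [], 1
    while True:
        batch = fetch(f"{base_url}?page={page}&size={page_size}")
        items.extend(batch)
        if len(batch) < page_size:  # fewer than a full page: we're done
            break
        page += 1
    return items
```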

Gracenote editor in Channels list — Fixed a bug where numeric-only Gracenote IDs were incorrectly rejected by the inline editor.

SQLite DB locking fix — A few users on v1.4.0 hit "database is locked" errors during Stirr scrapes. The worker was doing redundant mid-scrape commits that collided with the background pruning thread. Fixed by removing the inline prune, increasing the busy timeout to 30s, and adding a 3-attempt retry loop on write failures. This may need further work if anyone sees funny business in their logs (or crashes), let me know ASAP please!
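The mitigation described (longer busy timeout plus a bounded retry on `database is locked`) can be sketched like this. The 30s timeout and 3 attempts mirror the changelog; the function itself is illustrative, not FastChannels source:

```python
import sqlite3
import time

def write_with_retry(conn: sqlite3.Connection, sql: str, rows, attempts: int = 3):
    """Retry a batched write a few times if SQLite reports the DB is locked."""
    for attempt in range(1, attempts + 1):
        try:
            with conn:  # implicit transaction, committed on success
                conn.executemany(sql, rows)
            return
        except sqlite3.OperationalError as exc:
            if "locked" not in str(exc) or attempt == attempts:
                raise
            time.sleep(0.5 * attempt)  # back off before retrying

# timeout= sets SQLite's busy timeout (seconds) for this connection
conn = sqlite3.connect(":memory:", timeout=30)
```

WAL mode (`PRAGMA journal_mode=WAL`) is another common complement here, since it lets readers proceed during writes.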


Yeah, I'm not sure what's wrong... I added a Xumo feed with just all channels, EN selected, and most channels have no guide data in Channels.
Also, no channel icons.

I may have a bug there. I'll work on it for the next version. I think I know what it might be.

I stopped and did the update/re-pull of the stack; I think that's the way to "update" this... it worked before.

But it seems to be crashing and will not show me the admin page: Internal Server Error.

      
             ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
    return meth(
           ^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 527, in _execute_on_connection
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1641, in _execute_clauseelement
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
    return self._exec_single_context(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
    self._handle_dbapi_exception(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2363, in _handle_dbapi_exception
    raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such column: channels.is_duplicate
[SQL: SELECT count(*) AS count_1 
FROM (SELECT channels.id AS channels_id, channels.source_id AS channels_source_id, channels.source_channel_id AS channels_source_channel_id, channels.name AS channels_name, channels.slug AS channels_slug, channels.logo_url AS channels_logo_url, channels.stream_url AS channels_stream_url, channels.stream_type AS channels_stream_type, channels.category AS channels_category, channels.language AS channels_language, channels.country AS channels_country, channels.number AS channels_number, channels.gracenote_id AS channels_gracenote_id, channels.disable_reason AS channels_disable_reason, channels.is_duplicate AS channels_is_duplicate, channels.is_active AS channels_is_active, channels.is_enabled AS channels_is_enabled, channels.created_at AS channels_created_at, channels.updated_at AS channels_updated_at 
FROM channels 
WHERE channels.is_active = 1 AND channels.is_enabled = 1) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/e3q8)

192.168.0.4 "GET / HTTP/1.1" 302 201 0s

2026-03-18 00:55:46,531 ERROR    app: Exception on /admin/ [GET]
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: no such column: channels.is_duplicate

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/flask/app.py", line 1511, in wsgi_app
    response = self.full_dispatch_request()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/flask/app.py", line 919, in full_dispatch_request
    rv = self.handle_user_exception(e)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/flask/app.py", line 917, in full_dispatch_request
    rv = self.dispatch_request()
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/flask/app.py", line 902, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)  # type: ignore[no-any-return]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/app/routes/admin.py", line 21, in dashboard
    total_channels = Channel.query.filter_by(is_active=True, is_enabled=True).count()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/query.py", line 3146, in count
    self._legacy_from_self(col).enable_eagerloads(False).scalar()
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/query.py", line 2835, in scalar
    ret = self.one()
          ^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/query.py", line 2808, in one
    return self._iter().one()  # type: ignore
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/query.py", line 2857, in _iter
    result: Union[ScalarResult[_T], Result[_T]] = self.session.execute(
                                                  ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2351, in execute
    return self._execute_internal(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/session.py", line 2249, in _execute_internal
    result: Result[Any] = compile_state_cls.orm_execute_statement(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/orm/context.py", line 306, in orm_execute_statement
    result = conn.execute(
             ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1419, in execute
    return meth(
           ^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 527, in _execute_on_connection
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1641, in _execute_clauseelement
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
    return self._exec_single_context(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
    self._handle_dbapi_exception(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2363, in _handle_dbapi_exception
    raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 952, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such column: channels.is_duplicate
[SQL: SELECT count(*) AS count_1 
FROM (SELECT channels.id AS channels_id, channels.source_id AS channels_source_id, channels.source_channel_id AS channels_source_channel_id, channels.name AS channels_name, channels.slug AS channels_slug, channels.logo_url AS channels_logo_url, channels.stream_url AS channels_stream_url, channels.stream_type AS channels_stream_type, channels.category AS channels_category, channels.language AS channels_language, channels.country AS channels_country, channels.number AS channels_number, channels.gracenote_id AS channels_gracenote_id, channels.disable_reason AS channels_disable_reason, channels.is_duplicate AS channels_is_duplicate, channels.is_active AS channels_is_active, channels.is_enabled AS channels_is_enabled, channels.created_at AS channels_created_at, channels.updated_at AS channels_updated_at 
FROM channels 
WHERE channels.is_active = 1 AND channels.is_enabled = 1) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/e3q8)

Oops. Migration bug. Pushing a hotfix.

Oh, so there was an issue? lol
I just posted that I removed everything in Portainer and created it new again,
and it's working.
But if you see an issue to be fixed... cool.

Confirming the hotfix for 1.6.0 is working now.

Well, since you deleted it and restarted, it was essentially a fresh install.
You didn't do anything wrong. I changed a column in the database and didn't add a migration step for upgrading users. Sorry, I was in a rush to push that out because of the worker bug that was going around crashing things.
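For upgrading users, the missing step amounts to an idempotent "add column if absent" migration, so fresh installs and upgrades converge on the same schema. A sketch using the `channels.is_duplicate` column named in the traceback; the type and default are assumptions:

```python
import sqlite3

def ensure_column(conn: sqlite3.Connection, table: str, column: str, ddl: str):
    """Add `column` to `table` only if it doesn't exist yet (safe to re-run)."""
    cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
    if column not in cols:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {ddl}")
        conn.commit()

# Hypothetical usage matching the error above:
# ensure_column(conn, "channels", "is_duplicate", "BOOLEAN NOT NULL DEFAULT 0")
```

Note SQLite requires a non-null default when adding a NOT NULL column to a populated table.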

You can re-scrape from the main page.

I just added LocalNow, so it's pretty raw. Let me see if I can improve EPG coverage at all. Sometimes it's just a limitation of what the scraper has available to it, and in that case I'll adjust the timing of the auto-scrapes so it's seamless to the end user.

I like that idea, but have you tried just creating a new feed and then selecting channels at the bottom? If I'm following along with you, doesn't that accomplish the same thing?

Full disclosure: some of the Gracenote IDs come straight from the upstream scrape. I have not audited many, if any. You can override the auto Gracenote ID on the Channels page.

If you see a pattern, or even one-offs, please share them with me. I intend to create a helper script that will keep them consistent, but it'll take feedback from you all.

???? Was that always there????
I swear I didn't see that down there... and I had to scroll past it to "Save Feed".

OK, now that I see it, yes, it will work...
I just added a Test feed, clicked one of the Categories, and I see that only the channels from that Category are listed down there...
So basically, yes, I'm good to go with my request...