OliveTin for Channels: An Interface for Misc Channels DVR Scripts & Tricks

Nope, me again. I gotta say I'm glad to have somebody besides me testing functionality for manual recordings -- so thanks! This one you can fix yourself in the interest of speed, but I'll update the container too of course.

In the config.yaml file, which you'll find in the directory on your host that you've mapped to /config in the container, add double quotes around {{ time }} on this line:

    shell: /config/manualrecordings.sh "{{ name }}" {{ channel }} "{{ time }}" {{ duration }} "{{ summary }}" {{ image }}

Fixed here, so it should work for you too.

EDIT: Container updated now, so you can re-pull if you prefer.

That ended up working! Thank you!

I have another question. Is it possible to add a genre to a recording? For instance, add Pro wrestling so it ends up in my wrestling library collection.

I'm not sure. I'll check it out and let you know though.

Just set Airing.Genres to ["Pro wrestling"]

A look at your /dvr/jobs shows the structure.
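
If you want to see that structure for yourself, a quick way to dump it (the host and port here are just the ones used later in this thread -- substitute your own DVR's address):

    # Dump the current recording jobs and look at the "Airing" object inside each one
    curl http://192.168.0.11:8089/dvr/jobs

    # Inside the Airing your script submits you'd add, for example:
    #   "Genres": ["Pro wrestling"]
    # so the recording lands in the matching library collection.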

Thanks for this. I updated and it's working fine. All of those "Tiny Desk Concert" videos are now tagged properly with YouTube's thumbnail.

This ran successfully too. I assigned the resulting XML to the custom source where my TWiT.tv livestream comes from, and all seems to be working. Super cool!

Now that @Fofer has tested TWiT.tv guide data generation, I've proceeded with the plan to make that a process that can be executed on a recurring schedule (i.e. x minutes or hours between runs).

In addition, there's an integration with healthchecks.io to execute a "ping" on every guide data update. You can view this online to confirm the background process is active, and set it up to generate an e-mail if the scheduled ping is missed.

Click on the TWiT.tv Guide Data button in OliveTin and you'll see fields for the update interval and the healthchecks.io ping URL.

Leave the defaults as they are, and it'll run once and terminate. Enter an interval of 24h, and it'll update every 24 hours, and so on. Add the ping URL from your own healthchecks.io check in place of the default, and you'll be able to view the pings at the specified interval.

If you want to kill the background process for any reason, enter 0 for the interval and it will terminate. In all cases, leave the default healthchecks.io URL in place unless you want to add your own. OliveTin does not allow fields to be empty.
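
Under the hood, the background process boils down to a loop along these lines. This is a rough sketch rather than the actual script in the container, with the script name and variables standing in for the real ones:

    #!/bin/sh
    # Sketch of the recurring-update pattern -- not the shipped script.
    # INTERVAL and PING_URL stand in for the values entered in the OliveTin form.
    INTERVAL="24h"
    PING_URL="https://hc-ping.com/your-uuid-here"        # your healthchecks.io ping URL

    while true; do
      /config/ical_2_xmltv.sh                             # hypothetical guide-generation step
      curl -fsS -m 10 --retry 5 "$PING_URL" > /dev/null   # tell healthchecks.io the run happened
      sleep "$INTERVAL"                                   # wait out the interval before the next run
    done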

The generated XML file can be found under /data in whatever directory on your host you've mapped to /config in the container.

If you'd like to have that file statically hosted, so that you can access it via URL in Custom Channels settings, that can be easily done. I use another container for that called static-file-server. If you're interested in that let me know and I can do a quick post about it.
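
For reference, a minimal way to do that with Docker. The image name, port, and host path below are my assumptions, so adjust for your environment:

    # Serve the directory containing the generated XML over HTTP (sketch only)
    docker run -d --name static-file-server \
      -p 8080:8080 \
      -v /path/to/olivetin/config/data:/web \
      halverneus/static-file-server:latest

    # The guide file is then reachable at http://<docker-host>:8080/<filename>.xml
    # for use in Custom Channels settings.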

I love all the new functions captured in OliveTin for Channels. It's a great place to gather all these utility scripts I use to augment Channels functionality.
When I view the container logs, I don't see any date/time stamps, so it gets a little hard to tell what's going on with repeating scripts. Is there any way to add date and time stamps to log entries?

We'd probably have to do some custom logging to get what I think you're after. Having some visibility into background script operation is the reason I added the healthchecks.io option. It would be pretty easy to add custom logging, but trickier to do it with log rotation, size limits, and the like.
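
For the curious, the custom logging part would be roughly this (a sketch only, with a hypothetical script name) -- it's the rotation and size limits that add the real work:

    # Prefix each line a background script emits with a timestamp and append it to a file
    /config/ical_2_xmltv.sh 2>&1 | while IFS= read -r line; do
      printf '%s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$line"
    done >> /config/data/ical_2_xmltv.log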

Not that big of a deal. I wasn't planning on using healthchecks.io and was going to use the container log to verify it was still running. I guess I'll go the healthchecks.io route.

Would this work: set lines=10000, turn off auto-refresh, filter=ical_2_xmltv (or whatever), turn on display timestamps?
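
For reference, the same view is available straight from the Docker CLI (container name olivetin assumed):

    # Follow the container log with timestamps, filtered to one script's output
    docker logs -f --timestamps olivetin 2>&1 | grep ical_2_xmltv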

Can't believe I missed the display timestamps toggle. Actually I can believe it. Thanks!

Thanks, I've now got this TWiT guide data script running every 3 days, with healthchecks.io pinging too. Your customized OliveTin will make setup a lot easier for folks just getting started. Much appreciated!

What I'd like to do next is make the YouTube Thumbnails Fix something that can be run on a recurring schedule. In addition, I'd like it to be able to handle multiple video groups.

A few possibilities for this: the first would be a text file containing a list of video groups to process. The second would be to use the API as described by @chDVRuser and process all of the groups returned. A possible third would be a hybrid option, where we'd create a text file from the API that could then be modified in case not all groups should be processed.

Could you give the API a try and post the output here, so I know exactly what it looks like?

You weren't asking me, @bnhf, but if you want to see the output of the command

    curl http://192.168.0.11:8089/api/v1/video_groups

It looks like this (after applying the bbedit-pretty-json text filter in BBEdit).

    [
      {
        "id": "videos-7939d590cedcc59327a180c1fd2dc6da51849140448ee1d34f340244c33aff18",
        "name": "tinydesk",
        "image_url": "http://192.168.0.11:8089/dvr/uploads/21/content",
        "video_count": 19,
        "number_unwatched": 19,
        "favorited": false,
        "last_watched_at": 1695179669416,
        "created_at": 1690071730000,
        "updated_at": 1695657499323
      },
      {
        "id": "videos-564fdc121a69b17f988136ec5d815b579da39522c054c903fa72ce8b2f3a7b18",
        "name": "veritasium",
        "image_url": "http://192.168.0.11:8089/dvr/uploads/28/content",
        "video_count": 36,
        "number_unwatched": 14,
        "favorited": true,
        "last_watched_at": 1694745018292,
        "created_at": 1669251072000,
        "updated_at": 1694745018292
      },
      {
        "id": "videos-e6e9b8e7c6470c95a347d2b4646357ce1fc911e2a910df1154c803026b6ddbec",
        "name": "wiredgourmet",
        "image_url": "http://192.168.0.11:8089/dvr/uploads/73/content",
        "video_count": 67,
        "number_unwatched": 49,
        "favorited": true,
        "last_watched_at": 1695434774605,
        "created_at": 1675992720000,
        "updated_at": 1695434774605
      }
    ]

Thanks -- that's just what I need. Do you think most people would want to update all of their groups on a schedule, with an on-demand option as well? It'd be easiest to just do them all either way.

Could you link me to those scripts -- assuming you've posted them, of course. 🙂 Now that I have a reasonably generic foreground script for launching background scripts, we should be able to add them to olivetin-for-channels.
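
Processing all of the groups straight from that API output should be simple enough. Here's a sketch of the idea, assuming jq is available and using a hypothetical per-group script name:

    # Loop over every video group returned by the API (sketch only)
    curl -s http://192.168.0.11:8089/api/v1/video_groups \
      | jq -r '.[].name' \
      | while read -r group; do
          /config/youtube_thumbnails_fix.sh "$group"
        done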

Yes. Only videos with specifically-formatted filenames will work with this script anyway, so we might as well apply it to all groups by default. I think that's what most people would want and expect, as it would be easier than manually selecting each group the script will interact with, or removing groups that it won't.

@Fofer @cyoungers

I built a version of OliveTin with the test tag (bnhf/olivetin:test) that should work for updating YouTube thumbnails for all of your video_groups on a recurring schedule. It's very similar to the TWiT.tv guide update routine -- in fact they both use the same front-end shell script.

If one or both of you could give it a try, that would be great, as I'm not using this feature myself and don't have any live data to test with. It works with the sample JSON, but there's nothing like an actual test. Be sure to use a different healthchecks.io URL, as the idea is to be able to identify each background job.

I actually got an e-mail from them today, as I don't have the TWiT.tv script running anymore, and here's what it looks like:

[screenshot: healthchecks.io alert e-mail in Gmail, 2023-09-28]

Cool, thanks -- I'll give it a whirl tonight. What's the easiest way to install that test-tagged version instead of the version I have currently, while retaining my setup? I have Portainer running, so I'm assuming it's an easy switchover?

It's definitely easy: stop your olivetin stack, change latest to test in your docker-compose, then click Update with re-pull and redeploy selected for good measure. All your persistent data will be preserved.
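
For anyone running compose from the CLI instead of Portainer, the equivalent (service name olivetin assumed) is:

    # After changing bnhf/olivetin:latest to bnhf/olivetin:test in docker-compose.yml
    docker compose pull olivetin
    docker compose up -d olivetin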

EDIT: You will need to restart the TWiT.tv guide data background script. This will be true any time the container or Docker host is restarted.