DM Sent
For the moment, let's not worry about the health check not showing the env id.
Instead, let's see if we can get Project One-Click spin-ups working. First, stop the OliveTin stack and change your PORTAINER_ENV value to 1, since we now know that's the correct env id. Start the stack again, and then modify these lines in portainerstack.sh:
Lines 15 & 16 should currently look like this:
portainerEnv=$(curl -s -k -H "X-API-Key: $portainerToken" "http://$portainerHost:9000/api/endpoints" | jq '.[] | select(.Name=="local") | .Id') \
&& [[ -z $portainerEnv ]] && portainerEnv=$(curl -s -k -H "X-API-Key: $portainerToken" "https://$portainerHost:$portainerPort/api/endpoints" | jq '.[] | select(.Name=="local") | .Id')
I'd like you to add a space and a backslash to the end of line 16, and then add the third line below between the current lines 16 and 17:
portainerEnv=$(curl -s -k -H "X-API-Key: $portainerToken" "http://$portainerHost:9000/api/endpoints" | jq '.[] | select(.Name=="local") | .Id') \
&& [[ -z $portainerEnv ]] && portainerEnv=$(curl -s -k -H "X-API-Key: $portainerToken" "https://$portainerHost:$portainerPort/api/endpoints" | jq '.[] | select(.Name=="local") | .Id') \
&& [[ -z $portainerEnv ]] && portainerEnv="$PORTAINER_ENV"
This gives us a fallback to the value in PORTAINER_ENV if both curl commands fail to retrieve the env id.
After that, try to add any project via Project One-Click...
EDIT: The above shouldn't be necessary. I'm fairly certain the non-standard name you're using for the local Portainer environment is at the root of this. Check my response to your last PM, but if you rename primary back to local everything should start working.
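The underlying reason: the env id lookup in portainerstack.sh filters on the environment's Name (select(.Name=="local")), so an environment named primary returns nothing. If you want to see what your environments are actually called, something like this (reusing the script's own /api/endpoints call) will list them:
# List every Portainer environment's Id and Name. If yours shows Name "primary"
# instead of "local", that's why the lookup comes back empty.
curl -s -k -H "X-API-Key: $portainerToken" "http://$portainerHost:9000/api/endpoints" | jq '.[] | {Id, Name}'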
I have Channels DVR and Portainer running on my Synology NAS and am following the process to get OliveTin set up. I'm able to do the first part in Portainer and access the OliveTin web interface on port 1337, and I can run the Environment Variables Generator/Tester successfully. I then stopped the olivetin stack, removed the two initial variables, and added in the text from the generator. When I update the stack (without selecting the re-pull and redeploy option), I get the following error message:
Failed to deploy a stack: compose up operation failed: Error response from daemon: Bind mount failed: '/data/olivetin' does not exist
Not sure how to proceed from that error; any help is appreciated.
I believe you need to create that folder on your Synology.
As @jagrim said, you need to create folders in advance on Synology, as they don't support Docker creating them.
I believe most Synology users find that creating two folders in advance, /volume1/docker/olivetin and /volume1/docker/olivetin/data, works well. Then use a value of HOST_DIR=/volume1/docker in the stack.
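If you're comfortable with SSH, one command should take care of both folders (this assumes the /volume1/docker parent from the example above):
# mkdir -p creates /volume1/docker/olivetin and its data subfolder in one shot,
# since Synology won't let Docker create missing bind-mount folders itself
sudo mkdir -p /volume1/docker/olivetin/data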
EDIT: Note the OliveTin Environment Variables Generator/Tester Action has some specifics on this requirement as well, when describing likely values for HOST_DIR:
Thank you both. That indeed was my issue. Once I created the two folders and updated the stack, the container launched as expected. I appreciate the help and the great tool! I look forward to learning more about it and using it.
I'm unable to launch the stack in Portainer. I copied and pasted exactly as posted at the beginning of this thread, only changing it by adding the two environment variables you gave (-ezstart and host IP). I get the following error upon deployment:
Invalid interpolation format for services.olivetin.volumes.[]: "${HOST_DIR:-/unused}${HOST_DIR:+/olivetin:/config}"; you may need to escape any $ with another $
I'm a bit baffled because if that were an actual problem I imagine everyone else would be running into it, too, and I can't find any mention of it. And also, why only on that line and not the others that are similarly formatted?
For reference, I'm running Docker Desktop on Windows 11 Pro and using Portainer in the GUI. Docker Desktop version 4.45.0 (updated this morning, got the same error on v4.44.3) and Portainer version 2.33.1. Docker version 28.3.3
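As an aside for anyone puzzling over that volume line: Docker Compose uses the same :- and :+ parameter expansion operators as the shell, so you can preview in bash what the line should expand to. A quick sketch, with an example HOST_DIR:
# With HOST_DIR set, the :- default is skipped and the :+ suffix is appended:
HOST_DIR=/volume1/docker
echo "${HOST_DIR:-/unused}${HOST_DIR:+/olivetin:/config}"
# -> /volume1/docker/olivetin:/config

# With HOST_DIR unset, the :- default kicks in and the :+ part expands to nothing,
# leaving a harmless dummy path instead of a real mount:
unset HOST_DIR
echo "${HOST_DIR:-/unused}${HOST_DIR:+/olivetin:/config}"
# -> /unused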
I had to update the OliveTin Docker Compose recently, due to a change in the syntax Portainer accepts. I just updated post #1 in this thread to reflect that, so update your compose accordingly.
The error you're seeing is new to me though. The compose should not be edited; instead, add the two env vars in the Environment variables section of Portainer (Advanced mode is easiest). Like this:
Ah, thanks for the clarification! I copied the updated compose, put the two environment variables in the right section this time, and I still get that same error. So then I replaced the variable string with the actual volume mount path (/local/path/to/olivetin/config:/config).
Now I'm getting this error instead:
Error response from daemon: rpc error: code = InvalidArgument desc = ContainerSpec: duplicate mount point: +
Correct me if I'm wrong, but I believe the problem is that it's not interpreting the variables correctly -- it seems to think that the "+" is the path?
Looks like EZ-start is not in the cards for me! Lol!
You're using the full version of Portainer, correct (i.e. not the extension version installed from within Docker Desktop)?
After your first message, I confirmed that nothing has changed when installing Portainer via a WSL2 distro (I used Debian), and that OliveTin installs fine from there. Tested on Windows 11 Pro 24H2, Docker Desktop 28.3.3, Portainer 2.33.1 LTS (full version), Debian 11.11 (for WSL2) and WSL2 2.5.9.0.
Ahhhhha! I'm using the extension in the GUI for Docker Desktop. Let me try the full version!
Haha, just kidding. I uninstalled the extension and installed Portainer Community Edition 2.33.1 LTS. Added the stack, clicked deploy, and got the same errors. Swapped in the actual path for the config volume and cleared that error, but I'm still getting the "duplicate mount point: +" error. I am on WSL2 version 2.5.10.0, but there is a chance I updated it during troubleshooting all of this.
It appears that whatever the cause, my system isn't parsing the variables correctly, so I'll just write up my docker-compose file without them!
Not sure what could be going on there, but you might consider deleting Portainer, along with its image and volume:
docker stop portainer
docker rm portainer
docker volume rm portainer_data
docker rmi portainer/portainer-ce:latest
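If you want to confirm everything's really gone before reinstalling, checks along these lines should all come back empty:
docker ps -a --filter name=portainer    # no portainer container, stopped or running
docker volume ls --filter name=portainer_data    # no leftover data volume
docker images | grep -i portainer    # no cached portainer image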
And then follow the instructions here for installing Portainer and the full version of OliveTin-for-Channels using a variation on the EZ-Start process:
There'd be no need for you to install anything via the Portainer-Stacks Editor. You'd be installing EZ-Start via the command line to port 1338, and then use it to install Portainer and the full version of OliveTin-for-Channels on port 1337 using Project One-Click.
BRILLIANT!!! Worked like a charm. Thank you for helping me through that!
I am stuck 
I am running on a Mac with Docker Desktop and Portainer installed.
When running the OliveTin Environment Variables Generator/Tester, I get this:
Error: No such object: olivetin
cat: /config/olivetin.token: No such file or directory
I played around with HOST_DIR, even setting the variable manually. It does create and update files in /Users/admin/olivetin.
I created the /olivetin/config directory manually, and gave full read/write to everyone on the entire olivetin folder.
I have gone as far as resetting my Docker Desktop back to defaults.
I can get SLM running with no issues, just not olivetin. I even tried installing olivetin -> Portainer from the command line, and hit a different error. Here are my environment variables:
TAG=latest
DOMAIN=
HOST_PORT=1337
CHANNELS_DVR_HOST=192.168.144.146
CHANNELS_DVR_PORT=8089
CHANNELS_CLIENTS=
ALERT_SMTP_SERVER=
ALERT_EMAIL_FROM=
ALERT_EMAIL_PASS=
ALERT_EMAIL_TO=
UPDATE_YAMLS=true
UPDATE_SCRIPTS=true
TZ=US/Eastern
HOST_DIR=/Users/admin/
DVR_SHARE=/Users/admin/channels-data
LOGS_SHARE=/Users/admin/Library/Application Support/ChannelsDVR
TUBEARCHIVIST_SHARE=/Users/admin/channels-data
DVR2_SHARE=
LOGS2_SHARE=
TUBEARCHIVIST2_SHARE=
DVR3_SHARE=
LOGS3_SHARE=
TUBEARCHIVIST3_SHARE=
HOST_SFS_PORT=8080
FOLDER=/web
PORTAINER_TOKEN=
PORTAINER_HOST=192.168.144.146
PORTAINER_PORT=9443
PORTAINER_ENV=2
PERSISTENT_LOGS=false
Any help would be greatly appreciated!
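A couple of generic checks that can help narrow down the "No such object: olivetin" error (these assume the default container name of olivetin):
docker ps -a --filter name=olivetin    # was the container created at all, and is it running?
docker logs --tail 20 olivetin    # any startup errors?
docker exec olivetin ls -la /config    # does the bind mount resolve inside the container?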
Do you have the full version of Portainer installed? Or, did you install the Portainer extension from within Docker Desktop?
I tried first with the extension, then I also tried installing the full version using the advanced method.
Currently I am back to the extension.
The extension won't work, you need the full version.
I'd suggest clearing the decks, by deleting the extension and the full version, and then use the "next generation" version of EZ-Start to install both Portainer and the full version of OliveTin.
Be sure to delete the Docker Volume portainer_data too, or the Portainer token creation process will fail. You should probably delete any portainer or olivetin images as well, in the interest of really starting fresh.
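In command form, that fresh start might look something like this (container, volume and image names assume the defaults used in this thread):
docker stop portainer && docker rm portainer
docker volume rm portainer_data    # required, or the Portainer token creation will fail
docker rmi portainer/portainer-ce:latest bnhf/olivetin:latest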
@bnhf
Finally got around to installing OliveTin on my test server NAS (Synology). My ultimate goal is to install Project One-Click on the Test Server, then add it to the Production Server.
I believe everything installed correctly, but thought I'd put up my Post Health Check for inspection, as well as the compose and environment variables, before I move on to the next step.
Post Health Check:
Checking your OliveTin-for-Channels installation...
(extended_check=false)
OliveTin Container Version 2025.09.26
OliveTin Docker Compose Version 2025.08.25
----------------------------------------
Checking that your selected Channels DVR server (10.0.1.194:8089) is reachable by URL:
HTTP Status: 200 indicates success...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 1276 100 1276 0 0 1246k 0 --:--:-- --:--:-- --:--:-- 1246k
HTTP Status: 200
Effective URL: http://10.0.1.194:8089/
----------------------------------------
Checking that your selected Channels DVR server's data files (/mnt/10.0.1.194-8089) are accessible:
Folders with the names Database, Images, Imports, Logs, Movies, Streaming and TV should be visible...
total 0
d--------- 1 root root 22 Sep 27 2023 #recycle
d--------- 1 root root 116 Nov 13 06:00 .
drwxr-xr-x 1 root root 106 Nov 13 11:12 ..
drwxrwxrwx 1 root root 8 Sep 23 16:32 @eaDir
drwx------ 1 242120 242120 1320 Nov 13 05:32 Database
drwx------ 1 242120 242120 104 Nov 8 10:14 Images
drwx------ 1 242120 242120 28 Sep 26 2023 Imports
drwx------ 1 242120 242120 32 Sep 26 2023 Logs
drwx------ 1 242120 242120 4 Nov 7 20:42 Metadata
drwx------ 1 242120 242120 40 Nov 13 06:00 Streaming
drwx------ 1 242120 242120 96 Nov 13 06:00 TV
Docker reports your current DVR_SHARE setting as...
/volume1/ChannelsDVR
If the listed folders are NOT visible, AND you have your Channels DVR and Docker on the same system:
Channels reports this path as...
/volume1/ChannelsDVR
----------------------------------------
Checking that your selected Channels DVR server's log files (/mnt/10.0.1.194-8089_logs) are accessible:
Folders with the names data and latest should be visible...
total 4
drwxr-xr-x 1 242120 242120 200 Nov 13 05:32 .
drwxr-xr-x 1 root root 106 Nov 13 11:12 ..
drwxr-xr-x 1 242120 242120 154 Sep 25 17:27 2025.09.25.2220
drwxr-xr-x 1 242120 242120 154 Sep 25 20:28 2025.09.26.0118
drwxr-xr-x 1 242120 242120 154 Oct 9 15:18 2025.10.04.0225
drwxr-xr-x 1 242120 242120 134 Oct 29 13:19 2025.10.28.0018
drwxr-xr-x 1 242120 242120 134 Oct 29 21:19 2025.10.30.0047
drwxr-xr-x 1 242120 242120 154 Nov 13 05:32 2025.11.07.0832
drwxr-xr-x 1 242120 242120 1048 Nov 13 11:05 data
lrwxrwxrwx 1 242120 242120 15 Nov 13 05:32 latest -> 2025.11.07.0832
Docker reports your current LOGS_SHARE setting as...
/var/packages/ChannelsDVR/target/channels-dvr
If the listed folders are NOT visible, AND you have your Channels DVR and Docker on the same system:
Channels reports this path as...
/var/packages/ChannelsDVR/target/channels-dvr
----------------------------------------
Checking if your Portainer token is working on ports 9000 and/or 9443:
Portainer http response on port 9000 reports version 2.33.3
Portainer Environment ID for local is 2
Portainer https response on port 9443 reports version
Portainer Environment ID for local is
----------------------------------------
Here's a list of your current OliveTin-related settings:
HOSTNAME=olivetin
CHANNELS_DVR=10.0.1.194:8089
CHANNELS_DVR_ALTERNATES=
CHANNELS_CLIENTS=
ALERT_SMTP_SERVER=
ALERT_EMAIL_FROM=[Redacted]@
ALERT_EMAIL_PASS=[Redacted]
ALERT_EMAIL_TO=[Redacted]@
UPDATE_YAMLS=true
UPDATE_SCRIPTS=true
PORTAINER_TOKEN=[Redacted]
PORTAINER_HOST=10.0.1.194
PORTAINER_PORT=9443
PORTAINER_ENV=2
----------------------------------------
Here's the contents of /etc/resolv.conf from inside the container:
search local
nameserver 127.0.0.11
options ndots:0
----------------------------------------
Here's the contents of /etc/hosts from inside the container:
127.0.0.1 localhost
::1 localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
172.23.0.2 olivetin
Compose:
services:
  olivetin: # This docker-compose typically requires no editing. Use the Environment variables section of Portainer to set your values.
    # 2025.08.25
    # GitHub home for this project: https://github.com/bnhf/OliveTin.
    # Docker container home for this project with setup instructions: https://hub.docker.com/r/bnhf/olivetin.
    image: bnhf/olivetin:${TAG:-latest} # Add the tag like latest or test to the environment variables below.
    container_name: ${OLIVETIN_NAME:-olivetin}${EZ_START}
    hostname: ${OLIVETIN_NAME:-olivetin}${EZ_START}
    dns_search: ${DOMAIN:+${DOMAIN}} # For Tailscale users using Magic DNS, add your Tailnet (tailxxxxx.ts.net) to use hostnames for remote nodes, otherwise use your local domain name.
    ports:
      - ${HOST_PORT:-1337}:1337
    environment:
      - OLIVETIN_COMPOSE=2025.08.25${EZ_START} # Do not change this value.
      - CHANNELS_DVR=${CHANNELS_DVR_HOST}:${CHANNELS_DVR_PORT:-8089} # Add your Channels DVR server in the form CHANNELS_DVR_HOST=<hostname or ip> and CHANNELS_DVR_PORT=<port>.
      - CHANNELS_DVR_ALTERNATES=${CHANNELS_DVR2_HOST:+${CHANNELS_DVR2_HOST}:${CHANNELS_DVR2_PORT}}${CHANNELS_DVR3_HOST:+ ${CHANNELS_DVR3_HOST}:${CHANNELS_DVR3_PORT}} # Space separated list of alternate Channels DVR servers to choose from in the form hostname:port or ip:port.
      - CHANNELS_CLIENTS=${CHANNELS_CLIENTS} # Space separated list of Channels DVR clients you'd like notifications sent to in the form hostname or IP.
      - ALERT_SMTP_SERVER=${ALERT_SMTP_SERVER} # SMTP server to use for sending alert e-mails. smtp.gmail.com:587 for example.
      - ALERT_EMAIL_FROM=${ALERT_EMAIL_FROM} # Sender address for alert e-mails.
      - ALERT_EMAIL_PASS=${ALERT_EMAIL_PASS} # SMTP "app" password established through GMail or Yahoo Mail. Do not use your everyday e-mail address.
      - ALERT_EMAIL_TO=${ALERT_EMAIL_TO} # Recipient address for alert e-mails.
      - UPDATE_YAMLS=${UPDATE_YAMLS:-true} # Set this to true to update config.yaml.
      - UPDATE_SCRIPTS=${UPDATE_SCRIPTS:-true} # Set this to true to update all included scripts.
      - TZ=${TZ} # Add your local timezone in standard linux format. E.G. US/Eastern, US/Central, US/Mountain, US/Pacific, etc.
      - PORTAINER_TOKEN=${PORTAINER_TOKEN} # Generate via <username> dropdown (upper right of WebUI), "My account", API tokens.
      - PORTAINER_HOST=${PORTAINER_HOST:-$CHANNELS_DVR_HOST} # Hostname or IP of the Docker host you're running Portainer on.
      - PORTAINER_PORT=${PORTAINER_PORT:-9443} # https port you're running Portainer on. 9443 is the default.
      - PORTAINER_ENV=${PORTAINER_ENV:-2} # Set this if you're using an alternate Portainer Environment for some reason. 2 is the default.
      - PERSISTENT_LOGS=${PERSISTENT_LOGS:-false} # For supported Actions, log files are retained on an ongoing basis. false is the default.
    volumes:
      - ${HOST_DIR:-/unused}${HOST_DIR:+/olivetin:/config} # Add the parent directory on your Docker host you'd like to use.
      - ${DVR_SHARE:-/unused}${DVR_SHARE:+:/mnt/${CHANNELS_DVR_HOST}-${CHANNELS_DVR_PORT}} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your Channels DVR network share.
      - ${LOGS_SHARE:-/unused}${LOGS_SHARE:+:/mnt/${CHANNELS_DVR_HOST}-${CHANNELS_DVR_PORT}_logs} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your Channels DVR logs network share.
      - ${TUBEARCHIVIST_SHARE:-/unused}${TUBEARCHIVIST_SHARE:+:/mnt/${CHANNELS_DVR_HOST}-${CHANNELS_DVR_PORT}_ta} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your TubeArchivist videos network share.
      - ${DVR2_SHARE:-/unused}${DVR2_SHARE:+:/mnt/${CHANNELS_DVR2_HOST}-${CHANNELS_DVR2_PORT}} # Note that these volume mounts should always be to /mnt/hostname-port or /mnt/ip-port (dash rather than a colon between).
      - ${LOGS2_SHARE:-/unused}${LOGS2_SHARE:+:/mnt/${CHANNELS_DVR2_HOST}-${CHANNELS_DVR2_PORT}_logs} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your Channels DVR logs network share.
      - ${TUBEARCHIVIST2_SHARE:-/unused}${TUBEARCHIVIST2_SHARE:+:/mnt/${CHANNELS_DVR2_HOST}-${CHANNELS_DVR2_PORT}_ta} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your TubeArchivist videos network share.
      - ${DVR3_SHARE:-/unused}${DVR3_SHARE:+:/mnt/${CHANNELS_DVR3_HOST}-${CHANNELS_DVR3_PORT}} # Note that these volume mounts should always be to /mnt/hostname-port or /mnt/ip-port (dash rather than a colon between).
      - ${LOGS3_SHARE:-/unused}${LOGS3_SHARE:+:/mnt/${CHANNELS_DVR3_HOST}-${CHANNELS_DVR3_PORT}_logs} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your Channels DVR logs network share.
      - ${TUBEARCHIVIST3_SHARE:-/unused}${TUBEARCHIVIST3_SHARE:+:/mnt/${CHANNELS_DVR3_HOST}-${CHANNELS_DVR3_PORT}_ta} # This can either be a Docker volume or a host directory that's connected via Samba or NFS to your TubeArchivist videos network share.
      - /var/run/docker.sock:/var/run/docker.sock
    restart: unless-stopped

  static-file-server:
    image: halverneus/static-file-server:latest
    container_name: ${SFS_NAME:-static-file-server}${EZ_START}
    dns_search: ${DOMAIN}
    ports:
      - ${HOST_SFS_PORT:-0}:8080
    environment:
      - FOLDER=${FOLDER:-/web}
    volumes:
      - ${HOST_DIR:-/unused}${HOST_DIR:+/olivetin/data:${FOLDER:-/web}}
    restart: unless-stopped

#0#volumes: # Remove the #x# to enable. Use this section if you've set up a docker volume named channels-dvr, with CIFS or NFS, to bind to /mnt/dvr inside the container. Set ${DVR_SHARE} to channels-dvr (DVR_SHARE=channels_dvr) in that example.
#1#  channels-dvr:
#1#    external: true
#2#  channels-dvr-logs:
#2#    external: true
#3#  tubearchivist:
#3#    external: true
#4#  channels-dvr2:
#4#    external: true
#5#  channels-dvr2-logs:
#5#    external: true
#6#  tubearchivist2:
#6#    external: true
#7#  channels-dvr3:
#7#    external: true
#8#  channels-dvr3-logs:
#8#    external: true
#9#  tubearchivist3:
#9#    external: true
Environment Variables:
TAG=latest
DOMAIN=local
HOST_PORT=1337
CHANNELS_DVR_HOST=10.0.1.194
CHANNELS_DVR_PORT=8089
CHANNELS_CLIENTS=
ALERT_SMTP_SERVER=
ALERT_EMAIL_FROM=
ALERT_EMAIL_PASS=
ALERT_EMAIL_TO=
UPDATE_YAMLS=true
UPDATE_SCRIPTS=true
TZ=US/Central
HOST_DIR=/volume1/data
DVR_SHARE=/volume1/ChannelsDVR
LOGS_SHARE=/var/packages/ChannelsDVR/target/channels-dvr
TUBEARCHIVIST_SHARE=/volume1/ChannelsDVR
DVR2_SHARE=
LOGS2_SHARE=
TUBEARCHIVIST2_SHARE=
DVR3_SHARE=
LOGS3_SHARE=
TUBEARCHIVIST3_SHARE=
HOST_SFS_PORT=8080
FOLDER=/web
PORTAINER_TOKEN=redacted
PORTAINER_HOST=10.0.1.194
PORTAINER_PORT=9443
PORTAINER_ENV=2
PERSISTENT_LOGS=false
Looks good. A couple of 200 responses and the desired folders are visible. No response from Portainer on 9443 is not a show-stopper, given it's responding on 9000 via http. I'd say you're good-to-go. 
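If you ever want to chase down the silent https side, you can hit the same /api/endpoints call directly on 9443; the -k is needed because of Portainer's self-signed certificate:
# A JSON list of your environments means https on 9443 is alive; an empty
# or refused response means only http on 9000 is in play.
curl -s -k -H "X-API-Key: $PORTAINER_TOKEN" "https://10.0.1.194:9443/api/endpoints" | jq '.[] | {Id, Name}'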

