Web Player: No HTTPS support?

Looks like Plex pays DigiCert for an SSL certificate for every one of their users: https://support.plex.tv/hc/en-us/articles/206225077-How-to-Use-Secure-Server-Connections

Unfortunately we don’t have the size or budget to do this. We could use self-signed certs, but they are quite annoying to use and can be MITMed just as easily.

Ideally we could use Let’s Encrypt, but currently that only works if you run a service on a port below 1024, which requires root access.

1 Like

The tvOS and iOS clients could set up secure communication without setting up certificates, since the Channels site can verify the identity of both parties. It would just exclude the web browser clients.

Thanks for your candid answer. At least this lets us know the challenges you guys are dealing with. It’s good to know it’s not because of a technical barrier.

It's quite interesting how Plex is doing this. Here's an article with some more technical detail:

https://blog.filippo.io/how-plex-is-doing-https-for-all-its-users/

2 Likes

This works on Synology if a Let’s Encrypt certificate is already set up:

That’s sweet!

1 Like

On second thought, this is insecure because it breaks the remote access authentication. Maybe a separate authentication scheme could be created, but I haven’t looked into this yet.

I have my Channels DVR set up on a real host name, protected with a Let’s Encrypt cert, via an nginx reverse proxy.

Yes, when you do this, your server will use HTTPS, but it will not have any authentication. You will need to protect it with your own means. I do this with simple basic auth.

While setting up the cert and a basic auth htpasswd file is outside the scope of my comment, you can find that information pretty easily online. But I thought I’d at least provide my nginx vhost config as an example:

# --- + PROXY + ---

# Template variables:
#
# * domain = channels.mydomain.com
# * name = channels
# * type = proxy
# * host = 192.168.1.198
# * port = 8089

# WebSocket support: set the Connection header based on whether the client requested an upgrade
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

upstream channels-lb {
    server 192.168.1.198:8089;
}

# Redirect all plain HTTP traffic to HTTPS
server {
    listen 80;
    server_name channels.mydomain.com;
    return 301 https://channels.mydomain.com$request_uri;
}

server {
    server_name channels.mydomain.com;
    listen 443 ssl;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        proxy_pass http://channels-lb;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;

        auth_basic "Username and Password Required";
        auth_basic_user_file /etc/nginx/.htpasswd;
    }

    ssl_certificate /etc/letsencrypt/live/channels.mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/channels.mydomain.com/privkey.pem;

    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains";
    ssl on;
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
    ssl_ciphers "EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH:!aNULL:!eNULL:!EXPORT:!DES:!MD5:!PSK:!RC4";
    ssl_prefer_server_ciphers on;
    ssl_session_cache shared:SSL:10m;

    proxy_buffering off;

    access_log /var/log/nginx/channels-access.log;
    error_log  /var/log/nginx/channels-error.log;
}
2 Likes

@maddox Thanks for this, it works! I do notice, though, that the guide view doesn’t show the recording clock icons on the programs when viewing in Safari. It also doesn’t seem to show what is recording from the guide view. Chrome works, though. I think something is breaking part of the JavaScript.

Also, when trying to play a live stream remotely on an iPhone, the htpasswd auth asks for the username/password multiple times and the stream doesn’t load.

This has been a fun experiment, but the non-SSL connection without any proxy results in a more functional website.

1 Like

FYI, Let’s Encrypt will begin supporting wildcard certs in January:

You could probably use this to accomplish what Plex is doing, if you really wanted to.

1 Like

Wildcard certs don’t really help us, because using one would mean every customer shares the same cert. And since anyone could get a copy of the shared private key just by downloading Channels DVR, they could easily use it to MITM anyone else’s DVR.

Plex has an agreement with a CA which issues a separate cert for each of their users. We could do something similar with Let’s Encrypt, but they cap each account to a maximum of 20 new certificates a week.

1 Like

Seems there’s no easy way.

Plex’s specific technique uses a wildcard cert for each user (described in the article earlier in this thread). That said, I didn’t know about the 20-per-week limit; that would be a showstopper.

But each user could get their own wildcard cert from Let’s Encrypt. They would only need one, and 1 < 20.
So really, all we need is for the Channels devs to make an interface that allows a direct connection over HTTPS.

1 Like

I too would love to see full HTTPS support, but I understand the limitations of Let’s Encrypt.

I refuse to expose any unencrypted services to the WAN. At the very least, a stolen DVR auth token would allow an attacker to write very large files to arbitrary locations on your system.

Right now, I’m using an NGINX reverse proxy with simple cookie-based auth (since I can’t stand basic auth prompts). I’d love to be able to define which IP addresses are considered local in the DVR backend so I wouldn’t have to rely on my own auth scheme. Alternatively, Channels DVR could require all clients to authenticate using a temporary 6-8 character code like many other TV apps.
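In the meantime, something close to that local-address bypass can be approximated at the proxy layer with nginx’s satisfy directive, combined with the basic auth setup from maddox’s example above (it doesn’t help my cookie scheme, since satisfy only consults the access and auth modules). A rough sketch, with 192.168.1.0/24 standing in for your own LAN range:

location / {
    # Allow the request if EITHER of these checks passes:
    satisfy any;

    # 1) the client address is on the LAN (placeholder range)...
    allow 192.168.1.0/24;
    deny  all;

    # 2) ...or the client presents valid basic auth credentials
    auth_basic "Username and Password Required";
    auth_basic_user_file /etc/nginx/.htpasswd;

    proxy_set_header Host $host;
    proxy_pass http://channels-lb;
}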

Since HTTPS clearly presents obstacles, have you considered building virtual private networking into the app itself using something like the ZeroTier SDK?

1 Like

Could you paste an example of your vhost conf for this? I’d love to get rid of basic auth; I used a strong password and it’s ruining my life, lol.

Sure! There are two files involved, the NGINX config and a simple page that sets a cookie named customauth. The config just ensures that any request without the appropriate cookie is redirected to the login page.

I don’t want my password stored in plaintext anywhere, so the cookie value is actually the SHA1 hash of my password. (I know, SHA1 is not secure for password hashing, but I use a very long, randomly generated password.)

You can get the SHA1 hash of your password by running this command:

echo -n "MYPASSWORD" | openssl sha1

The cookie is set for 365 days, so you won’t have to login very often!

Also, I noticed the web interface sometimes pulls insecure assets, so I added a Content-Security-Policy header to fix that.
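For reference, here’s a trimmed-down sketch of what the nginx side of this looks like. The login page that actually sets the customauth cookie isn’t shown; channels.mydomain.com, HASHOFMYPASSWORD, /login.html, and the /var/www/channels-auth root are placeholders; it reuses the channels-lb upstream and cert paths from the earlier example; and upgrade-insecure-requests is just one policy that addresses the insecure-asset issue:

server {
    server_name channels.mydomain.com;
    listen 443 ssl;

    ssl_certificate     /etc/letsencrypt/live/channels.mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/channels.mydomain.com/privkey.pem;

    # Tell the browser to upgrade any http:// asset requests to https://
    add_header Content-Security-Policy "upgrade-insecure-requests";

    # Static login page that sets the customauth cookie for 365 days
    location = /login.html {
        root /var/www/channels-auth;
    }

    location / {
        # If the customauth cookie doesn't match the SHA1 of my password,
        # send the client to the login page instead of the DVR
        if ($cookie_customauth != "HASHOFMYPASSWORD") {
            return 302 /login.html;
        }

        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_pass http://channels-lb;
    }
}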

Which assets? We should fix that…

It’s the JSON pipe that causes problems when it breaks over the reverse proxy.