Adding a model gives an SSL certificate verification error when hosting TaskingAI locally with Docker

How can I configure Docker so that certificate validation is skipped? I tried setting DOCKER_TLS_VERIFY=0 in the docker-compose YAML file, but when I try to add a model that communicates with Hugging Face, the SSL connection still fails. I want to disable certificate verification. Please help.

Of course, please provide some information to help us with the diagnosis:

  1. Your docker compose content
  2. The specific provider you chose: is it the Hugging Face Inference Endpoint or the Hugging Face Inference API?

I did not change anything in docker-compose initially. When I got the certificate exception, I edited it to explicitly disable TLS verification, as shown below. I tried the Hugging Face Inference API, but the provider does not matter: any outbound HTTPS connection raises this exception unless TLS verification is switched off.

```yaml
version: "3.3"

services:
  frontend:
    image: taskingai/taskingai-console:v0.2.1
    environment:
      DOCKER_TLS_VERIFY: 0
    depends_on:
      - backend-web
      - backend-api

  backend-inference:
    image: taskingai/taskingai-inference:v0.2.2
    environment:
      DOCKER_TLS_VERIFY: 0
      AES_ENCRYPTION_KEY: b90e4648ad699c3bdf62c0860e09eb9efc098ee75f215bf750847ae19d41e4b0 # replace with your own
      ICON_URL_PREFIX: http://localhost:8080 # replace with your own

  backend-plugin:
    image: taskingai/taskingai-plugin:v0.2.1
    environment:
      DOCKER_TLS_VERIFY: 0
      AES_ENCRYPTION_KEY: b90e4648ad699c3bdf62c0860e09eb9efc098ee75f215bf750847ae19d41e4b0 # replace with your own
      ICON_URL_PREFIX: http://localhost:8080 # replace with your own

  backend-api:
    image: taskingai/taskingai-server:v0.2.1
    environment:
      DOCKER_TLS_VERIFY: 0
      POSTGRES_URL: postgres://postgres:TaskingAI321@db:5432/taskingai
      REDIS_URL: redis://:TaskingAI321@cache:6379/0
      TASKINGAI_INFERENCE_URL: http://backend-inference:8000
      TASKINGAI_PLUGIN_URL: http://backend-plugin:8000
      AES_ENCRYPTION_KEY: b90e4648ad699c3bdf62c0860e09eb9efc098ee75f215bf750847ae19d41e4b0 # replace with your own
    depends_on:
      - db
      - cache
      - backend-inference
      - backend-plugin

  backend-web:
    image: taskingai/taskingai-server:v0.2.1
    environment:
      DOCKER_TLS_VERIFY: 0
      POSTGRES_URL: postgres://postgres:TaskingAI321@db:5432/taskingai
      REDIS_URL: redis://:TaskingAI321@cache:6379/0
      TASKINGAI_INFERENCE_URL: http://backend-inference:8000
      TASKINGAI_PLUGIN_URL: http://backend-plugin:8000
      AES_ENCRYPTION_KEY: b90e4648ad699c3bdf62c0860e09eb9efc098ee75f215bf750847ae19d41e4b0 # replace with your own
      JWT_SECRET_KEY: dbefe42f34473990a3fa903a6a3283acdc3a910beb1ae271a6463ffa5a926bfb # replace with your own
      PURPOSE: WEB
      DEFAULT_ADMIN_USERNAME: admin
      DEFAULT_ADMIN_PASSWORD: TaskingAI321 # replace with your own
    depends_on:
      - db
      - cache
      - backend-inference
      - backend-plugin

  db:
    image: ankane/pgvector:v0.5.1
    environment:
      DOCKER_TLS_VERIFY: 0
      POSTGRES_DB: taskingai
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: TaskingAI321
    volumes:
      - ./data/postgres:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 10
    restart: always

  cache:
    image: redis:7-alpine
    command: ["redis-server", "--requirepass", "TaskingAI321"]
    volumes:
      - ./data/redis:/data
    healthcheck:
      test: ["CMD", "redis-cli", "auth", "password", "ping"]
      interval: 5s
      timeout: 5s
      retries: 10
    restart: always

  nginx:
    image: nginx:1.24
    ports:
      - "8080:80" # edit this to change the port, for example to "8080:80" to use port 8080
    volumes:
      - ./nginx/conf:/etc/nginx/conf.d
      - ./data/nginx_cache:/var/cache/nginx
    depends_on:
      - frontend
      - backend-web
      - backend-api
```
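One note on the approach taken above: DOCKER_TLS_VERIFY only controls whether the Docker CLI verifies TLS when talking to the Docker daemon; application code running inside the containers never reads it, which would explain why setting it changes nothing. When every outbound HTTPS connection fails verification, a common cause is a corporate proxy that re-signs TLS traffic. In that situation, one direction to explore is trusting the proxy's CA inside the affected services instead of disabling verification. The sketch below assumes a hypothetical exported CA at `./certs/corporate-ca.pem`, and whether the TaskingAI images honor `SSL_CERT_FILE` / `REQUESTS_CA_BUNDLE` has not been verified against these images:

```yaml
# Sketch only: assumes the intercepting proxy's CA certificate has been
# exported to ./certs/corporate-ca.pem on the host (hypothetical path).
  backend-inference:
    image: taskingai/taskingai-inference:v0.2.2
    environment:
      # Standard variables read by OpenSSL / Python's ssl module and by
      # the requests library, respectively; confirm that the image's
      # HTTP client actually picks them up.
      SSL_CERT_FILE: /etc/ssl/certs/corporate-ca.pem
      REQUESTS_CA_BUNDLE: /etc/ssl/certs/corporate-ca.pem
      # ...existing environment entries unchanged...
    volumes:
      - ./certs/corporate-ca.pem:/etc/ssl/certs/corporate-ca.pem:ro
```

The same two variables and the volume mount would need to be repeated on backend-plugin, backend-api, and backend-web, since each of them can make outbound HTTPS calls.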

I'm sorry: we have looked into this issue, and it is caused by the TLS verification required by Hugging Face, which cannot be resolved from TaskingAI's side. Perhaps you could try an alternative solution.

After disabling TLS authentication, models from other providers are still usable.

Every other model gives a similar message. Basically, any outbound HTTPS connection produces this error, so I do not think it is specific to Hugging Face.
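If every outbound HTTPS connection fails, it is worth confirming that the problem is the containers' trust store rather than anything TaskingAI-specific. A small standard-library probe can be run from inside one of the containers (e.g. via `docker compose exec backend-inference python`); this is a sketch, assuming the image ships a Python interpreter:

```python
import socket
import ssl


def tls_probe(host: str, port: int = 443, timeout: float = 10.0) -> str:
    """Attempt a fully verified TLS handshake against host:port and return
    either the negotiated protocol or the exact verification failure."""
    ctx = ssl.create_default_context()  # uses the container's trust store
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return f"ok: {tls.version()}"
    except ssl.SSLCertVerificationError as exc:
        # A message like "self-signed certificate in certificate chain"
        # points at a TLS-intercepting proxy, not at the application.
        return f"verification failed: {exc.verify_message}"


def trusted_ca_count() -> int:
    """Number of CA certificates the default context trusts; a corporate CA
    injected via SSL_CERT_FILE should show up in this count."""
    return len(ssl.create_default_context().get_ca_certs())


# Usage from inside a container:
#   print(tls_probe("huggingface.co"))
#   print(trusted_ca_count())
```

If the probe reports something like `self-signed certificate in certificate chain` for every host, the traffic is being re-signed on the way out of the network, and adding the signing CA to the containers' trust store is the usual remedy.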