open-webui can't connect to ollama

ctyrrell-versovaPRO

3 days ago

Greetings all. Just getting settled into a new job; my first task is getting an inherited Open WebUI-based app (from October '24) running locally in Docker, then on Railway. Project ID 55c60558-3d94-46ba-8f9e-76d269118868.

When it's running in Railway, I see this in the logs:

INFO [open_webui.apps.ollama.main] get_all_models()
ERROR [open_webui.apps.ollama.main] Connection error: Cannot connect to host ollama:11434 ssl:default [Name or service not known]

I came across a suggestion to change OLLAMA_BASE_URL to use ollama.railway.internal instead, but the result is the same.

In railway CLI, the command

railway run docker exec -it open-webui curl http://ollama:11434/api/tags

returns

{"models":[{"name":"llama3.1:latest","model":"llama3.1:latest","modified_at":"2025-06-03T18:48:49.017518Z","size":4920753328,"digest":"46e0c10c039e019119339687c3c1757cc81b9da49709a3b3924863ba87ca666e","details":{"parent_model":"","format":"gguf","family":"llama","families":["llama"],"parameter_size":"8.0B","quantization_level":"Q4_K_M"}}]}%

Based on this, it looks to me like open-webui can connect to ollama, but the logs say otherwise.

HELP!

$10 Bounty


3 days ago

Hey, yes, you need to use ollama.railway.internal (assuming your Ollama service is actually named ollama).
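Something like this on the Open WebUI service's variables should do it (a sketch; the hostname assumes the Ollama service is named ollama and listens on the default port):

# Railway variable on the Open WebUI service (assumed service name and port)
OLLAMA_BASE_URL=http://ollama.railway.internal:11434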


3 days ago

🦙


3 days ago

that's my contribution


3 days ago

and railway run executes locally on your machine, so that curl was most likely hitting your local Ollama container, not the one on Railway.


3 days ago

if you want to ssh into your service, you can use railway ssh instead.
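for example (a sketch; how you select the service depends on your CLI version):

railway ssh
# once inside the deployed Open WebUI container, test the internal hostname:
curl http://ollama.railway.internal:11434/api/tags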


ctyrrell-versovaPRO

3 days ago

Ah, that makes a difference.


ctyrrell-versovaPRO

3 days ago

OK, via ssh
exec curl http://ollama:11434/api/tags
Could not resolve host: ollama

exec curl http://ollama.railway.internal:11434/api/tags
Could not resolve host: ollama.railway.internal

Is it even there?


3 days ago

are you sure there's an Ollama service in your project?


ctyrrell-versovaPRO

3 days ago

I'm sure that I think there should be. 😉

services:
  ollama:
    volumes:
      - ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:${OLLAMA_DOCKER_TAG-latest}
    entrypoint: ["/bin/sh", "-c", "ollama serve & sleep 10 && ollama pull llama3.1 && wait"]
    ports:
      - "11434:11434"


3 days ago

but that's a docker-compose file, and Railway doesn't support docker-compose
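for reference, that compose entry is roughly the following docker run when testing locally (a sketch assembled from the values above, with the image tag assumed to be latest); on Railway, Ollama would need to be deployed as its own service instead:

docker volume create ollama
docker run -d --name ollama \
  --restart unless-stopped \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --entrypoint /bin/sh \
  ollama/ollama:latest \
  -c "ollama serve & sleep 10 && ollama pull llama3.1 && wait"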


ctyrrell-versovaPRO

3 days ago

😩


3 days ago

maybe you can deploy the following Ollama template and keep using the specific Open WebUI version you want?
https://railway.com/deploy/T9CQ5w


ctyrrell-versovaPRO

3 days ago

Thank you, I'll give that a look.