How to Deploy Ollama with Docker and Determine the Correct Endpoint URL
prblsing
HOBBYOP

a year ago

I'm trying to deploy Ollama using Docker and need guidance on the following points:

  1. Deployment Process:

    • Are there any official Docker files or community-contributed ones for deploying Ollama?

    • If I need to create my own Dockerfile, what steps should I follow?

  2. Endpoint URL:

    • Once deployed, how do I determine the correct endpoint URL to use for API requests?

    • Should it default to http://localhost:11434 or something else when deployed in Docker?

  3. My Current Setup:

    • I have a Docker environment set up on [local machine/remote server].

    • I am using an Ollama model (e.g., llama2, llama3).

    • Here is my current Dockerfile configuration (if applicable):

      ```dockerfile
      FROM ubuntu:22.04

      # Install curl and clean up the apt cache
      RUN apt-get update && apt-get install -y \
          curl \
          && rm -rf /var/lib/apt/lists/*

      # Install Ollama via the official install script
      RUN curl -L https://ollama.ai/install.sh | sh

      EXPOSE 11434

      # Start Ollama in the background, preload the model, and keep the container alive
      CMD ["sh", "-c", "ollama serve & sleep 10 && ollama pull llama3.2 && tail -f /dev/null"]
      ```

  4. Error or Confusion:

    • When I run the above Docker container, I'm unsure which endpoint URL to use for testing the deployed Ollama API.

    • Are there additional configurations or flags required during deployment to ensure the API runs on the expected port?

  5. Goal:

    • I want to deploy Ollama, access the API using a custom URL, and integrate it into my applications.
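For reference, this is how I plan to resolve the endpoint in my app. A minimal sketch: `OLLAMA_HOST_URL` is my own app-side variable (not an Ollama convention), and the default assumes the container's port is published to the host with `-p 11434:11434`.

```python
import os

def ollama_base_url() -> str:
    # From the host, the published port applies (http://localhost:11434).
    # From another container on the same Docker network, the service name
    # would replace localhost (e.g. http://ollama:11434).
    # OLLAMA_HOST_URL is my own environment variable, not an Ollama one.
    return os.environ.get("OLLAMA_HOST_URL", "http://localhost:11434")

# API paths are relative to the base URL:
tags_url = ollama_base_url() + "/api/tags"          # lists pulled models
generate_url = ollama_base_url() + "/api/generate"  # text generation
print(tags_url)
```

`curl`-ing the `/api/tags` path is a quick way to confirm the server is reachable before wiring up an app.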

6 Replies

brody
EMPLOYEE

a year ago

Hello,

Have you checked out our Ollama template?

https://railway.com/template/T9CQ5w



prblsing
HOBBYOP

a year ago

Yes, but this doesn't download a model.


brody
EMPLOYEE

a year ago

I'm not familiar with Ollama, but can you initiate a model download via the WebUI?



prblsing
HOBBYOP

a year ago

I'm not using the WebUI; I have my own custom app. Is that integration possible?


brody
EMPLOYEE

a year ago

Right, but can you initiate a model download via the provided WebUI? Then you can use Ollama in your own app.


prblsing
HOBBYOP

a year ago

No, it will not work that way for the use case I am trying to build. I'll explore it though, thanks.
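For anyone landing here later, this is the kind of integration I'm after: the custom app calls Ollama's `/api/generate` endpoint directly. A sketch using only the standard library; the base URL assumes the default published port, and the model name matches the `ollama pull llama3.2` in my Dockerfile.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # adjust to wherever the container is reachable

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Actually sending the request needs a running Ollama instance, so it is left commented:
# with urllib.request.urlopen(build_generate_request("llama3.2", "Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```

With `"stream": False` the server returns one JSON object instead of a stream of chunks, which keeps the client code simple.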

