ollama docker

๊ธฐ์กด์—๋Š” systemd๋กœ ํŒจํ‚ค์ง€๋ฅผ ์„ค์น˜ํ›„ ์‚ฌ์šฉํ–‡๋‹ค. docker๋กœ ์‚ฌ์šฉํ•ด๋ณด์ž.

https://ollama.ai/blog/ollama-is-now-available-as-an-official-docker-image

docker run

Run the container:

docker run -d \
  -p 21434:11434 \
  --name ollama \
  ollama/ollama

Change the externally exposed host port (21434 here) to whatever suits your setup.
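A quick way to confirm the container is up and the API is reachable on the remapped port (a minimal check, assuming the 21434 mapping above; the root endpoint should answer with a short "Ollama is running" message):

# check the container and hit the remapped port on the host
docker ps --filter name=ollama
curl http://localhost:21434/

Then exec into the container and run a model: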

docker exec -it ollama bash
ollama run llama2

์ž˜ ๋œ๋‹ค.

Test via the API (on the host)

curl -X POST http://localhost:21434/api/generate -d '{
  "model": "llama2",
  "prompt":"Why is the sky blue?"
}'
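By default /api/generate streams the answer back as a series of JSON chunks. For a single combined response, the same request can be sent with "stream": false (same endpoint and port as above):

curl -X POST http://localhost:21434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'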

Using docker-compose

version: '3.8'
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - 21434:11434
    restart: unless-stopped
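
Bring the stack up first (assuming the file above is saved as docker-compose.yml in the current directory; use docker-compose up -d if you are on the v1 binary):

docker compose up -d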
The rest is the same as before:

docker exec -it ollama bash
ollama run llama2

curl -X POST http://localhost:21434/api/generate -d '{
  "model": "llama2",
  "prompt":"Why is the sky blue?"
}'

Improvement

Problem: if the container is removed and recreated, the models have to be downloaded all over again.

Let's fix that with a volume mount.

docker run -d \
  -v ~/.ollama:/root/.ollama \
  -p 21434:11434 \
  --name ollama \
  ollama/ollama

The docker-compose equivalent:

version: '3.8'
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - 21434:11434
    restart: unless-stopped
    volumes:
      - ~/.ollama:/root/.ollama

With the .ollama folder in the home directory mounted into the container, the models no longer need to be re-downloaded when the container is removed and recreated.
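To see this in action, the container can be thrown away and recreated (a sketch, assuming the compose setup above); listing the models inside it afterwards should show everything still in place:

# remove and recreate the container; model data survives on the host mount
docker compose down
docker compose up -d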

docker exec -it ollama bash

root@c4efa0c84132:/# ollama list
NAME                          ID              SIZE     MODIFIED
llama2-uncensored:70b-chat    bdd0ec2f5ec5    38 GB    25 hours ago
llama2-uncensored:7b-chat     44040b922233    3.8 GB   24 hours ago
mistral:latest                1ab49bc0b6a8    4.1 GB   26 hours ago
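
Because /root/.ollama in the container is bound to ~/.ollama on the host, the downloaded model data should also be visible on the host side (exact directory layout may differ between Ollama versions):

# model blobs and manifests live under the mounted host directory
ls ~/.ollama/models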
