tech-envision committed
Commit · afd68f9
1 Parent(s): 04ad610
Update Dockerfile with Ollama and models
- .dockerignore +16 -0
- Dockerfile +36 -0
- README.md +18 -0
- entrypoint.sh +22 -0
.dockerignore
ADDED
@@ -0,0 +1,16 @@
+# Python caches
+__pycache__/
+*.py[cod]
+
+# Git and GitHub
+.git
+.github
+
+# Development envs
+.env
+.venv
+venv/
+
+# Other ignored files
+*.db
+*.log
Dockerfile
ADDED
@@ -0,0 +1,36 @@
+# syntax=docker/dockerfile:1
+
+FROM python:3.10-slim as base
+
+# Install system dependencies and Ollama
+RUN apt-get update && \
+    apt-get install -y curl gnupg && \
+    rm -rf /var/lib/apt/lists/* && \
+    curl -fsSL https://ollama.com/install.sh | sh
+
+# Create non-root user
+RUN useradd --create-home --uid 1000 botuser
+
+WORKDIR /app
+
+# Pull models at build time
+ENV OLLAMA_MODEL="qwen3"
+ENV OLLAMA_EMBEDDING_MODEL="nomic-embed-text"
+RUN ollama pull "$OLLAMA_MODEL" && ollama pull "$OLLAMA_EMBEDDING_MODEL"
+
+# Install Python dependencies
+COPY requirements.txt ./
+RUN pip install --no-cache-dir --upgrade pip && \
+    pip install --no-cache-dir -r requirements.txt
+
+# Copy application source
+COPY . .
+
+# Entrypoint script manages Ollama and the bot
+COPY entrypoint.sh /entrypoint.sh
+RUN chmod +x /entrypoint.sh
+
+USER botuser
+ENV PYTHONUNBUFFERED=1
+
+ENTRYPOINT ["/entrypoint.sh"]
README.md
CHANGED
@@ -19,3 +19,21 @@ python run.py
 ```
 
 The script will instruct the model to run a simple shell command and print the result. Conversations are automatically persisted to `chat.db` and are now associated with a user and session.
+
+## Docker
+
+A Dockerfile is provided to run the Discord bot along with an Ollama server. The image installs Ollama, pulls the LLM and embedding models, and starts both the server and the bot.
+
+Build the image:
+
+```bash
+docker build -t llm-discord-bot .
+```
+
+Run the container:
+
+```bash
+docker run -e DISCORD_TOKEN=your-token llm-discord-bot
+```
+
+The environment variables `OLLAMA_MODEL` and `OLLAMA_EMBEDDING_MODEL` can be set at build or run time to specify which models to download.
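For illustration, a run-time override of the chat model could look like the sketch below. The model name `llama3.2` is only a placeholder; note that the image bakes in the models named at build time and the entrypoint itself does not pull anything, so whether a model set only at run time gets downloaded depends on the bot's own code.

```bash
# Hypothetical run-time override of the chat model (placeholder model name)
docker run \
  -e DISCORD_TOKEN=your-token \
  -e OLLAMA_MODEL=llama3.2 \
  llm-discord-bot
```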
entrypoint.sh
ADDED
@@ -0,0 +1,22 @@
+#!/bin/sh
+set -e
+
+# Start Ollama server in the background
+ollama serve >/tmp/ollama.log 2>&1 &
+OLLAMA_PID=$!
+
+cleanup() {
+    kill "$OLLAMA_PID"
+}
+trap cleanup EXIT
+
+# Wait until the server is ready
+for i in $(seq 1 30); do
+    if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
+        break
+    fi
+    sleep 1
+done
+
+# Run the Discord bot
+exec python -m bot.discord_bot
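Since the README notes that conversations persist to `chat.db` and the Dockerfile sets `WORKDIR /app`, keeping that database across container restarts could be done with a bind mount along these lines; the in-container path `/app/chat.db` is an assumption about where the bot writes the file.

```bash
# Create the file first so Docker bind-mounts a file, not a directory
touch chat.db
# The host file may need to be writable by uid 1000 (botuser in the image)
docker run \
  -e DISCORD_TOKEN=your-token \
  -v "$(pwd)/chat.db:/app/chat.db" \
  llm-discord-bot
```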