v4.1.0: DMS email documents, category-specific Nachweis linking, version system

- Save cover email body as DMS document with new 'email' context type
- Show email body separately from attachments in email detail view
- Add per-category DMS document assignment in quarterly confirmation
  (Studiennachweis, Einkommenssituation, Vermögenssituation)
- Add VERSION file and context processor for automatic version display
- Add MCP server, agent system, import/export, and new migrations
- Update compose files and production environment template
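As a rough illustration of the VERSION-file bullet above, a Django context processor along these lines would do the job. This is a sketch only: the names `version_context` and `APP_VERSION`, and the extra `version_file` parameter (added here for testability; real Django context processors take only `request`), are assumptions, not the repo's actual code.

```python
from pathlib import Path

def version_context(request, version_file=None):
    """Read the project's VERSION file and expose it to all templates.

    Hypothetical sketch: wire it up via the TEMPLATES setting's
    'context_processors' list. Falls back to 'unknown' if the file
    is missing or unreadable.
    """
    path = Path(version_file) if version_file else Path("VERSION")
    try:
        version = path.read_text().strip()
    except OSError:
        version = "unknown"
    return {"APP_VERSION": version}
```

Templates could then render the value anywhere, e.g. `{{ APP_VERSION }}` in the footer.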

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: SysAdmin Agent
Date: 2026-03-15 18:48:52 +00:00
parent faeb7c1073
commit e0b377014c
49 changed files with 5913 additions and 55 deletions


@@ -113,6 +113,69 @@ services:
      - db
    command: ["celery", "-A", "core", "beat", "-l", "info"]
  mcp:
    build: ./app
    depends_on:
      db:
        condition: service_healthy
    environment:
      - POSTGRES_DB=${POSTGRES_DB}
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - DB_HOST=${DB_HOST}
      - DB_PORT=${DB_PORT}
      - DJANGO_SECRET_KEY=${DJANGO_SECRET_KEY}
      - DJANGO_DEBUG=0
      - DJANGO_ALLOWED_HOSTS=localhost
      - LANGUAGE_CODE=${LANGUAGE_CODE}
      - TIME_ZONE=${TIME_ZONE}
      - MCP_TOKEN_READONLY=${MCP_TOKEN_READONLY}
      - MCP_TOKEN_EDITOR=${MCP_TOKEN_EDITOR}
      - MCP_TOKEN_ADMIN=${MCP_TOKEN_ADMIN}
    # No port mapping; internal network only
    # Start via: docker compose run --rm -e MCP_AUTH_TOKEN=<token> mcp
    stdin_open: true
    command: ["python", "-m", "mcp_server"]
  ollama:
    image: ollama/ollama:latest
    # No external port mapping; reachable only via the internal Docker network
    # Django app: http://ollama:11434
    environment:
      - OLLAMA_MAX_LOADED_MODELS=1
      - OLLAMA_NUM_PARALLEL=1
      - OLLAMA_DEFAULT_MODEL=${OLLAMA_DEFAULT_MODEL:-qwen2.5:3b}
    volumes:
      - ollama_data:/root/.ollama
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "curl -sf http://localhost:11434/api/tags || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 5
      start_period: 60s
    # On first start: launch Ollama, then pull the model (if not already present).
    # Literal block scalar (|) so the script keeps its newlines.
    entrypoint: |
      sh -c "
      ollama serve &
      OLLAMA_PID=$$!
      echo '[ollama] Waiting for API...'
      RETRIES=0
      until curl -sf http://localhost:11434/api/tags > /dev/null 2>&1; do
        RETRIES=$$((RETRIES + 1))
        [ $$RETRIES -ge 60 ] && echo '[ollama] ERROR: API not ready.' && exit 1
        sleep 1
      done
      MODEL=$${OLLAMA_DEFAULT_MODEL:-qwen2.5:3b}
      if ollama list | grep -q \"$$MODEL\"; then
        echo \"[ollama] Model '$$MODEL' already present.\"
      else
        echo \"[ollama] Pulling model '$$MODEL'...\"
        ollama pull \"$$MODEL\"
      fi
      wait $$OLLAMA_PID
      "
  grampsweb:
    image: ghcr.io/gramps-project/grampsweb:latest
    ports:
@@ -138,3 +201,4 @@ services:
volumes:
  dbdata:
  gramps_data:
  ollama_data:
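The retry loop in the Ollama entrypoint above is a bounded poll-until-ready pattern: try a readiness check once per second, give up after 60 attempts. A minimal Python sketch of the same pattern (the name `wait_until_ready` is an illustration, not code from this repo):

```python
import time

def wait_until_ready(check, attempts=60, delay=1.0):
    """Poll check() until it returns True, mirroring the entrypoint's
    curl retry loop. Returns False once `attempts` tries are exhausted."""
    for _ in range(attempts):
        if check():
            return True
        time.sleep(delay)
    return False
```

In the compose file the `check` is `curl -sf http://localhost:11434/api/tags`; the healthcheck block performs the same probe on an ongoing basis, while the entrypoint loop gates the one-time model pull.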