CloudStore is a full-stack demo e-commerce system composed of a Java backend, a Python Streamlit frontend, and MySQL + Redis infrastructure.
From the repository root:

```bash
cp .env.example .env
```

Then edit `.env` and set values, especially:

- `JWT_SECRET` must be Base64-encoded and at least 32 bytes after decoding.
- `LLM_SERVICE_URL` must point to the assistant endpoint used by the backend (default in compose: `http://llm-assistant:8000/chat`).

Generate a secure secret:

```bash
openssl rand -base64 32
```

Paste the output into `JWT_SECRET` in `.env`.
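A bad `JWT_SECRET` (not valid Base64, or too short after decoding) is easy to catch before starting the stack. A minimal sketch — the helper name is illustrative, not part of CloudStore:

```python
import base64

def check_jwt_secret(secret: str) -> bool:
    """True if `secret` is valid Base64 and decodes to at least 32 bytes."""
    try:
        raw = base64.b64decode(secret, validate=True)
    except ValueError:  # binascii.Error is a ValueError subclass
        return False
    return len(raw) >= 32
```

Run it against the value you pasted into `.env` before bringing the containers up.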
Build and start the stack:

```bash
docker compose up -d --build
```

- New container: `llm-assistant` (FastAPI), started by Docker Compose.
- New free LLM runtime container: `ollama`.
- The backend calls it through `LLM_SERVICE_URL` and exposes the customer-only facade method `getCustomerShoppingAdvice`.
- The frontend customer view includes a `Shopping Assistant` tab.
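For manual testing outside the frontend, a request to the assistant endpoint can be built with the standard library. A sketch assuming a JSON body with `customer_id` and `message` fields — the actual schema of the `llm-assistant` service may differ, so check its FastAPI docs (`/docs`) first:

```python
import json
import urllib.request

def build_chat_request(base_url: str, customer_id: int, message: str) -> urllib.request.Request:
    """Build a POST request for the assistant endpoint (payload shape assumed)."""
    payload = json.dumps({"customer_id": customer_id, "message": message}).encode()
    return urllib.request.Request(
        base_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Once the stack is up, sending it is just `urllib.request.urlopen(req)`.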
Required Ollama setup in `.env`:

```
OLLAMA_BASE_URL=http://ollama:11434
OLLAMA_MODEL=llama3.2:3b
```

Then pull the model once:

```bash
docker compose exec ollama ollama pull llama3.2:3b
```

If the model is not available yet, the assistant still works, falling back to rule-based suggestions.
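The fallback behavior can be pictured as: try the LLM call, and on any failure return canned advice instead. A minimal sketch — the rule table and function names are illustrative, not the service's actual code:

```python
def rule_based_suggestions(message: str) -> str:
    """Canned advice keyed on simple keywords (illustrative rules only)."""
    rules = {
        "gift": "Popular gift picks: gift cards, headphones, board games.",
        "cheap": "Check the clearance and daily-deals categories.",
    }
    for keyword, advice in rules.items():
        if keyword in message.lower():
            return advice
    return "Browse the best-sellers list for ideas."

def shopping_advice(message: str, ask_llm) -> str:
    """Prefer the LLM; fall back to rules if the model/service is unavailable."""
    try:
        return ask_llm(message)  # e.g. a call to Ollama's HTTP API
    except Exception:            # model not pulled yet, service down, timeout...
        return rule_based_suggestions(message)
```

This is why the pull step above is optional for a first smoke test: the tab stays functional either way.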
List keys with type and TTL:

```bash
docker compose exec redis sh -lc 'redis-cli --scan | while read k; do
  t=$(redis-cli TYPE "$k")
  ttl=$(redis-cli TTL "$k")
  printf "%-60s | %-8s | TTL=%s\n" "$k" "$t" "$ttl"
done | sort'
```
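The same report can be produced programmatically; a sketch assuming the third-party `redis` package (`pip install redis`) and a reachable instance, with `dump_keys` mirroring the shell loop above:

```python
def format_key_row(key: str, key_type: str, ttl: int) -> str:
    """Mirror the shell printf: key, type, and TTL in fixed-width columns."""
    return f"{key:<60} | {key_type:<8} | TTL={ttl}"

def dump_keys(client) -> list[str]:
    """One formatted row per key, sorted, via a redis-py client."""
    rows = []
    for key in client.scan_iter():
        k = key.decode() if isinstance(key, bytes) else key
        t = client.type(k)
        t = t.decode() if isinstance(t, bytes) else t
        rows.append(format_key_row(k, t, client.ttl(k)))
    return sorted(rows)
```

Usage, e.g.: `dump_keys(redis.Redis(host="localhost", port=6379))`.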