Complete setup instructions for running the game with AI-powered NPC dialogue.
```
git clone https://github.com/Unity-Lab-AI/Medieval-Trading-Game.git
cd Medieval-Trading-Game
```

Download Ollama from: https://ollama.ai
- Windows: Run `OllamaSetup.exe`
- Mac: Download and drag to Applications
- Linux: `curl -fsSL https://ollama.ai/install.sh | sh`
Ollama runs as a background service and auto-starts on boot.
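Because Ollama runs as a background service, you can confirm it is reachable before launching the game. A minimal Python sketch, assuming Ollama's default port 11434 and its `/api/tags` endpoint (which lists installed models):

```python
import json
import urllib.error
import urllib.request

def ollama_running(host="http://localhost:11434"):
    """Return (reachable, model_names) for a local Ollama service."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=2) as resp:
            models = json.load(resp).get("models", [])
        return True, [m.get("name", "") for m in models]
    except (urllib.error.URLError, OSError, ValueError):
        return False, []

running, models = ollama_running()
print("Ollama reachable:", running, "| installed models:", models)
```

The game's loading screen performs a similar check from the browser; if the request fails, it falls back to pre-written dialogue.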
Simply open `index.html` in your browser:
- Double-click `index.html`, OR
- Right-click → Open with → Chrome/Firefox/Edge
- Loading screen checks for Ollama
- Shows "🦙 Checking Ollama AI..."
- If model missing → Auto-downloads Mistral (4.4GB) with progress bar
- Shows "🦙 Ollama AI Model ✓"
- Game starts with AI-powered NPC dialogue
If Ollama is not detected, the loading screen shows an install prompt with options:
- "✓ I Have Ollama - Check Connection" (if already installed)
- "⬇️ Download Ollama" → Opens ollama.ai
- "Skip" → Game uses pre-written dialogue
- Game is fully playable either way!
When playing from a hosted site (not localhost):
- Ollama integration is automatically disabled
- NPCs use pre-written fallback dialogue
- No prompts are shown (a hosted page can't reach Ollama on your localhost)
Minimum (pre-written dialogue):
- Any modern browser (Chrome, Firefox, Edge, Safari)
- 100MB disk space for game files

For AI-powered dialogue:
- 8GB RAM minimum
- 6GB free disk space (for the Mistral model)
- Ollama installed and running
The game uses Mistral 7B Instruct by default (~4.4GB).
- Model downloads automatically during loading screen
- Progress shown: "Downloading... 2.07GB / 4.40GB"
- Only downloads once - saved to Ollama's model folder
- Windows: `C:\Users\<username>\.ollama\models`
- Mac/Linux: `~/.ollama/models`
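The automatic download shown on the loading screen can be sketched against Ollama's streaming `/api/pull` endpoint, whose status lines carry `total`/`completed` byte counts. This is an illustrative sketch, not the game's actual code:

```python
import json
import urllib.error
import urllib.request

def pull_model(name="mistral:7b-instruct", host="http://localhost:11434"):
    """Stream Ollama's /api/pull and print progress lines like the game's."""
    req = urllib.request.Request(
        f"{host}/api/pull",
        data=json.dumps({"name": name}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            for raw in resp:  # one JSON status object per line
                status = json.loads(raw)
                if "total" in status and "completed" in status:
                    gb = 1024 ** 3
                    print(f"Downloading... {status['completed'] / gb:.2f}GB"
                          f" / {status['total'] / gb:.2f}GB")
        return True
    except (urllib.error.URLError, OSError):
        return False  # Ollama unreachable; game falls back to canned dialogue
```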
You can install any Ollama model manually:

```
ollama pull llama3:8b     # Newer, slightly larger
ollama pull phi:latest    # Smaller, faster (1.7GB)
ollama pull tinyllama     # Ultra-light (637MB)
```

All installed models appear in Settings → AI Voice.
- Make sure Ollama is installed from https://ollama.ai
- Click "✓ I Have Ollama - Check Connection" button
- If still not working, Ollama may not be running as a service
- Check system tray (Windows) or menu bar (Mac) for Ollama icon
- If not running, open a terminal and run `ollama serve`
- Refresh the game page
- Note: If you see "bind: Only one usage of each socket address" error, Ollama IS already running - just refresh the game!
- Check your internet connection
- Try downloading manually: `ollama pull mistral:7b-instruct`
- Restart Ollama: close and reopen the terminal, then run `ollama serve`
- This is normal if Ollama isn't installed
- NPCs use pre-written fallback dialogue
- Install Ollama for dynamic AI conversations
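A hypothetical sketch of how pre-written fallback dialogue can work: when Ollama is unreachable, the game draws an NPC line from a canned pool. The roles and lines below are illustrative, not the game's actual script:

```python
import random

# Illustrative data only - not the game's real dialogue tables
FALLBACK_LINES = {
    "merchant": ["Fine wares today, traveler!", "Silver for silk?"],
    "guard": ["Keep the peace.", "Move along."],
}

def fallback_dialogue(npc_role, rng=random):
    """Pick a canned line for the role; unknown roles get a placeholder."""
    return rng.choice(FALLBACK_LINES.get(npc_role, ["..."]))

print(fallback_dialogue("merchant"))
```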
For the best experience, you can run a local server:
```
cd Medieval-Trading-Game
python -m http.server 8000
# Open http://localhost:8000 in your browser
```

Or with Node:

```
npx serve
# Open the URL shown in the terminal
```

Or in VS Code: install the "Live Server" extension, then right-click `index.html` → "Open with Live Server".
| Key | Action |
|---|---|
| SPACE | Pause/Unpause time |
| I | Open Inventory |
| M | Open Market |
| T | Open Travel Map |
| Q | Open Quests |
| ESC | Close current panel |
Access via gear icon (⚙️) in top-right:
- Model: Select from installed Ollama models
- Temperature: 0.1 (focused) to 1.0 (creative)
- Volume: Voice playback volume
Graphics and audio:
- Weather effects
- Lighting
- Particle effects
- Music volume
- Sound effects
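The Model and Temperature settings above map naturally onto Ollama's `/api/generate` endpoint, where temperature travels in the `options` object. A hedged Python sketch (the helper name and defaults are assumptions, not the game's code):

```python
import json
import urllib.error
import urllib.request

def npc_reply(prompt, model="mistral:7b-instruct", temperature=0.7,
              host="http://localhost:11434", timeout=60):
    """Ask the selected model for one NPC line; None means 'use fallback'."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,
        # Mirrors the in-game slider: 0.1 (focused) .. 1.0 (creative)
        "options": {"temperature": temperature},
    }
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.load(resp).get("response", "")
    except (urllib.error.URLError, OSError, ValueError):
        return None  # unreachable or failed; use pre-written dialogue
```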
```
Medieval-Trading-Game/
├── index.html        # Main game file - open this!
├── SETUP.md          # This file
├── models/           # AI model documentation
│   └── MODELS.md     # Model info (actual models in Ollama folder)
├── src/
│   ├── js/           # Game code
│   ├── css/          # Styles
│   └── data/         # Game data
└── assets/           # Images, audio, fonts
```
- Issues: https://github.com/Unity-Lab-AI/Medieval-Trading-Game/issues
- Ollama Docs: https://ollama.ai/docs
Unity AI Lab | Medieval Trading Game v0.92.00