Bradley Charles
Rae Maffei
Livan Hagi Osman
Mark Yosinao
We are designing a prototype for a secure, in-house chatbot system focused on compliance, security, data privacy, and the protection of intellectual property, all of which are typically at risk with AI chatbots. Our proposed solution includes research, a detailed project report, and a final deliverable: a simple, working prototype. If feasible, the prototype will also have multilingual capabilities.
- Compliance with HIPAA, FERPA, etc.
- Protection for Intellectual Property
- Run on internal, in-house infrastructure
- Provide articulate and helpful responses based on approved knowledge
- Use authentication tools to secure interactions
- Maintain audit logs as per compliance regulations
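To illustrate the audit-log requirement above, here is a minimal sketch of a JSON-lines audit logger. The `record_event` helper, the field names, and the `audit.log` path are assumptions for illustration, not part of the current codebase; a real deployment would write to a protected, append-only location.

```python
import json
import time
from pathlib import Path

# Hypothetical log location; compliance deployments need a protected path.
AUDIT_LOG = Path("audit.log")

def record_event(user: str, action: str, detail: str) -> dict:
    """Append one JSON line per user interaction and return the event."""
    event = {
        "ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "user": user,
        "action": action,
        "detail": detail,
    }
    with AUDIT_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
    return event
```

One JSON object per line keeps the log greppable and easy to ship to a log aggregator for retention per compliance policy.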
- Frontend
- TypeScript
- Next.js
- Backend
- Python
- FastAPI
- LLM
- Ollama (TinyLlama-1.1B-Chat)
- Authentication
- Basic token support (WIP)
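Since basic token support is still a work in progress, one common pattern is a constant-time comparison against a server-side token. The sketch below is an assumption about how that could look (the `verify_token` helper and `CHATBOT_API_TOKEN` variable are hypothetical, not the repo's actual API):

```python
import hmac
import os

# Hypothetical: the expected token should come from config/env, never source code.
API_TOKEN = os.environ.get("CHATBOT_API_TOKEN", "change-me")

def verify_token(presented: str) -> bool:
    """Compare tokens in constant time to avoid timing side channels."""
    return hmac.compare_digest(presented.encode(), API_TOKEN.encode())
```

In FastAPI this check would typically run inside a dependency so every protected route rejects unauthenticated requests uniformly.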
- Python 3.12+
- Node.js 24+ (with npm)
- Git
git clone https://github.com/OC-Chatbot/Secure-Internal-Chatbot-Design.git
cd Secure-Internal-Chatbot-Design
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r backend/requirements.txt
npm install

Recommended (single command):

python start_services.py

This starts `uvicorn backend.main:app` on port 8000 and `next dev` on port 3000. It also sets `NEXT_PUBLIC_API_URL` to `http://localhost:8000/api` if unset.
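The behavior described above could be implemented roughly as follows. This is a hedged sketch, not the actual contents of `start_services.py`: the `build_env` helper and exact subprocess arguments are assumptions.

```python
import os
import subprocess
import sys

def build_env() -> dict:
    """Copy the current environment, defaulting NEXT_PUBLIC_API_URL if unset."""
    env = dict(os.environ)
    env.setdefault("NEXT_PUBLIC_API_URL", "http://localhost:8000/api")
    return env

def main() -> None:
    env = build_env()
    # Launch both dev servers; Ctrl-C tears down the whole process group.
    backend = subprocess.Popen(
        [sys.executable, "-m", "uvicorn", "backend.main:app", "--port", "8000"],
        env=env,
    )
    frontend = subprocess.Popen(
        ["npm", "run", "dev", "--", "--port", "3000"],
        env=env,
    )
    backend.wait()
    frontend.wait()

if __name__ == "__main__":
    main()
```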
Manual alternative:
# Terminal 1
source .venv/bin/activate
uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000
# Terminal 2
npm run dev -- --port 3000

- Frontend: http://localhost:3000 (chat UI at /chat)
- Backend docs: http://localhost:8000/docs
- The backend loads the TinyLlama model from Hugging Face; first run requires network access or a pre-cached model in your HF cache.
- Configure `NEXT_PUBLIC_API_URL` if your backend runs on a different host/port. You can export it before `npm run dev` or set it in `.env.local`.
The application runs automatically on VPS reboot via systemd services.
- File: `/etc/systemd/system/oc-backend.service`
- Start: `sudo systemctl start oc-backend.service`
- Status: `sudo systemctl status oc-backend.service`
- File: `/etc/systemd/system/oc-frontend.service`
- Start: `sudo systemctl start oc-frontend.service`
- Status: `sudo systemctl status oc-frontend.service`
cd /root/Secure-Internal-Chatbot-Design
# Create backend service
sudo nano /etc/systemd/system/oc-backend.service
# [Paste backend service config]
# Create frontend service
sudo nano /etc/systemd/system/oc-frontend.service
# [Paste frontend service config]
# Enable and start
sudo systemctl daemon-reload
sudo systemctl enable oc-backend.service oc-frontend.service
sudo systemctl start oc-backend.service oc-frontend.service
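For reference, a minimal example of what `/etc/systemd/system/oc-backend.service` could contain. The paths and options below are assumptions inferred from the setup steps above, not the deployed config; adjust them to your install before pasting.

```ini
# Hypothetical oc-backend.service; verify paths before use.
[Unit]
Description=Secure Internal Chatbot backend (FastAPI)
After=network.target

[Service]
WorkingDirectory=/root/Secure-Internal-Chatbot-Design
ExecStart=/root/Secure-Internal-Chatbot-Design/.venv/bin/uvicorn backend.main:app --host 0.0.0.0 --port 8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

The frontend unit would follow the same shape, with `ExecStart` pointing at the Next.js server instead of uvicorn.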