Thesis Helper is an AI-powered thesis planning platform built to help students move from vague ideas to a structured, trackable, and ethically supported research workflow. It combines topic brainstorming, personalized timeline generation, task execution support, an integrated research assistant, Notion syncing, and daily email nudges in one system.
The project was developed as a thesis-management prototype by Team TK at the University of Bonn, with a Next.js frontend and a FastAPI backend. It supports both local AI through Ollama and cloud AI through Google Gemini.
Students often struggle with the same thesis problems:
- choosing a topic that is specific enough to execute
- breaking a large research project into realistic phases and tasks
- recovering when they fall behind schedule
- knowing which academic tools to use for a specific task
- getting AI support without drifting into academic-integrity risks
Thesis Helper is designed to address those gaps with a guided workflow instead of a single chat box.
The app starts with a guided brainstorming chat that helps turn rough interests into a focused research direction.
Students then complete a structured questionnaire covering topic details, deadline, schedule, work style, notifications, and AI provider preferences.
The backend turns the questionnaire into a phased plan with milestones, task estimates, daily assignments, and "today's tasks" guidance.
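As a rough sketch of that planning step (the phase names and weights below are illustrative assumptions, not the backend's actual logic), the time between the start date and the deadline can be split into weighted phases:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical phase weights; the real backend derives its plan from the
# questionnaire answers (topic, schedule, work style).
PHASE_WEIGHTS = [
    ("Literature review", 0.25),
    ("Methodology", 0.20),
    ("Implementation / experiments", 0.30),
    ("Writing", 0.20),
    ("Revision & submission", 0.05),
]

@dataclass
class Phase:
    name: str
    start: date
    end: date

def build_timeline(start: date, deadline: date) -> list[Phase]:
    """Split the available days into weighted phases.

    The final phase absorbs any rounding so the plan always ends
    exactly on the deadline.
    """
    total_days = (deadline - start).days
    phases, cursor = [], start
    last = len(PHASE_WEIGHTS) - 1
    for i, (name, weight) in enumerate(PHASE_WEIGHTS):
        end = deadline if i == last else cursor + timedelta(days=round(total_days * weight))
        phases.append(Phase(name, cursor, end))
        cursor = end
    return phases

timeline = build_timeline(date(2025, 1, 6), date(2025, 6, 30))
for p in timeline:
    print(f"{p.name}: {p.start} -> {p.end}")
```

Daily assignments and "today's tasks" can then be derived by distributing each phase's tasks across its date range.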
The app now includes a dedicated Research Assistant mode backed by the `research_agent` Git submodule. It can search arXiv, pull Wikipedia context, summarize papers, refine research queries, and generate concept graphs without leaving Thesis Helper.
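The arXiv side of this works over arXiv's public Atom API (`http://export.arxiv.org/api/query`). A minimal sketch of parsing such a feed, using a canned sample instead of a live request so it runs offline:

```python
import xml.etree.ElementTree as ET

# Sample of the Atom XML the arXiv API returns; the real research agent
# would fetch this from http://export.arxiv.org/api/query?search_query=...
SAMPLE_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>Attention Is All You Need</title>
    <summary>We propose the Transformer ...</summary>
  </entry>
</feed>"""

ATOM = "{http://www.w3.org/2005/Atom}"

def parse_entries(feed_xml: str) -> list[dict]:
    """Extract (title, summary) records from an arXiv Atom feed."""
    root = ET.fromstring(feed_xml)
    return [
        {
            "title": e.findtext(f"{ATOM}title", "").strip(),
            "summary": e.findtext(f"{ATOM}summary", "").strip(),
        }
        for e in root.findall(f"{ATOM}entry")
    ]

papers = parse_entries(SAMPLE_FEED)
print(papers[0]["title"])
```

The summaries parsed this way are what a research agent would then feed into its summarization and query-refinement steps.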
Generated timelines can be pushed into a Notion workspace with task databases, milestone tracking, and progress views.
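A hedged sketch of what one synced task could look like as a Notion `pages.create` payload — the property names (`Task`, `Due`, `Phase`) are assumptions and must match the columns of the actual target database:

```python
def notion_task_payload(database_id: str, task: str, due: str, phase: str) -> dict:
    """Build a Notion API `pages.create` request body for one timeline task.

    Property names here are illustrative; they must match the target
    Notion database's schema.
    """
    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Task": {"title": [{"text": {"content": task}}]},
            "Due": {"date": {"start": due}},
            "Phase": {"select": {"name": phase}},
        },
    }

payload = notion_task_payload(
    "db_123", "Draft literature review outline", "2025-02-10", "Literature review"
)
print(payload["parent"])
```

With the official `notion-client` package, such a payload would be passed to `client.pages.create(**payload)` using the `NOTION_TOKEN` from `.env`.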
The system can send a daily progress summary with today's tasks, motivation, and status updates.
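A minimal sketch of composing and sending such a summary over Gmail SMTP — addresses, subject wording, and task text are placeholders, not the service's actual formatting:

```python
import smtplib
from email.message import EmailMessage

def build_daily_summary(sender: str, recipient: str, tasks: list[str]) -> EmailMessage:
    """Compose the daily progress email from today's task list."""
    msg = EmailMessage()
    msg["Subject"] = f"Thesis Helper: {len(tasks)} task(s) for today"
    msg["From"] = sender
    msg["To"] = recipient
    body = "Today's tasks:\n" + "\n".join(f"- {t}" for t in tasks)
    msg.set_content(body + "\n\nKeep going - you're making progress!")
    return msg

def send_via_gmail(msg: EmailMessage, user: str, app_password: str) -> None:
    # Gmail requires an app password here, not the normal account password.
    with smtplib.SMTP("smtp.gmail.com", 587) as smtp:
        smtp.starttls()
        smtp.login(user, app_password)
        smtp.send_message(msg)

msg = build_daily_summary(
    "bot@example.com", "student@example.com", ["Read 2 papers", "Outline chapter 2"]
)
print(msg["Subject"])
```

The `EMAIL_USER` / `EMAIL_PASSWORD` values from `.env` would be passed to `send_via_gmail`; the sketch above only builds the message, so it runs without credentials.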
Each thesis task can be opened in a dedicated workspace with chat support, deliverable tracking, and specialized academic tools.
The system can surface academic-integrity reminders when AI usage starts drifting toward replacement work instead of support.
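As an illustration only (the project's actual ethics monitor is not shown here), detecting that drift can start as simply as pattern-matching prompts that ask the AI to produce the deliverable itself:

```python
# Hypothetical heuristic; a real monitor would use richer signals
# (frequency of requests, length of generated text, context of the task).
REPLACEMENT_PATTERNS = (
    "write my",
    "write the chapter",
    "do my thesis",
    "generate my thesis",
)

def integrity_warning(prompt: str) -> bool:
    """Return True if the prompt asks for replacement work rather than support."""
    p = prompt.lower()
    return any(pattern in p for pattern in REPLACEMENT_PATTERNS)

print(integrity_warning("Write my literature review chapter for me"))
print(integrity_warning("Which papers should I read on transformers?"))
```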
- AI brainstorming flow for refining thesis topics
- personalized timeline generation with phases, milestones, and task estimates
- integrated research mode powered by the `research_agent` repository
- local persistence so thesis projects can be resumed later
- Notion workspace creation and timeline syncing
- daily email summaries and motivational nudges
- task-specific work sessions with 18 academic helper tools
- ethics monitoring to encourage responsible AI usage
- support for local AI with Ollama and cloud AI with Gemini
- Frontend: Next.js 14, React 18, TypeScript, Tailwind CSS
- Backend: FastAPI, Pydantic, SQLAlchemy
- Research orchestration: LangChain, LangGraph
- Database: SQLite by default, PostgreSQL-compatible configuration
- AI providers: Ollama, Google Gemini
- Integrations: Notion, Gmail SMTP, arXiv, Wikipedia, Google Custom Search, helper services
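Given this stack, database and provider selection can be sketched as a small config helper — the function name, the SQLite default, and the provider strings below are assumptions that mirror the documented setup, not the project's actual code:

```python
def resolve_config(env: dict) -> dict:
    """Pick the database URL and AI provider from environment-style settings.

    Defaults mirror the local-first setup: a SQLite file and Ollama.
    Swapping to PostgreSQL or Gemini is just a matter of different values.
    """
    return {
        "database_url": env.get("DATABASE_URL", "sqlite:///./thesis_helper.db"),
        "ai_provider": env.get("AI_PROVIDER", "ollama"),
    }

print(resolve_config({}))  # local-first defaults
print(resolve_config({
    "DATABASE_URL": "postgresql://user:pass@localhost:5432/thesis_helper",
    "AI_PROVIDER": "gemini",
}))
```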
```
backend/          FastAPI app, models, services, integrations
frontend/         Next.js UI, components, styles, API client
research_agent/   Git submodule used for the integrated research assistant
screenshots/      README/demo assets
docs/             Project notes and status docs
Thesis Helper - TK Team.pdf
Thesis Helper Agent.pdf
```
```
git clone --recurse-submodules https://github.com/ananmouaz/thesis_agent.git
cd thesis_agent
```

If you already cloned without submodules:

```
git submodule update --init --recursive
```

Create and activate a virtual environment:

```
python -m venv venv
```

macOS/Linux:

```
source venv/bin/activate
```

Windows PowerShell:

```
.\venv\Scripts\Activate.ps1
```

Install backend dependencies:

```
pip install -r requirements.txt
```

Install frontend dependencies:

```
cd frontend
npm install
cd ..
```

Copy the environment template:

```
cp env.template .env
```

Windows PowerShell:

```
Copy-Item env.template .env
```

Then update `.env` with the values you need:
- `EMAIL_USER` and `EMAIL_PASSWORD` for Gmail delivery
- `NOTION_TOKEN` for Notion workspace creation and syncing
- `GEMINI_API_KEY` if you want to use Gemini instead of Ollama
- `GOOGLE_SEARCH_API_KEY` and `GOOGLE_SEARCH_ENGINE_ID` if you want the research assistant's Google Custom Search tool
- `DATABASE_URL` if you want a database other than local SQLite
- `DEBUG` must be `true` or `false`
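A minimal `.env` for a local-first run might look like this — every value below is a placeholder, and the `DATABASE_URL` default is an assumption; only set the keys you actually use:

```
DEBUG=true
AI_PROVIDER=ollama
EMAIL_USER=you@gmail.com
EMAIL_PASSWORD=your-gmail-app-password
NOTION_TOKEN=secret_your_notion_integration_token
GEMINI_API_KEY=
DATABASE_URL=sqlite:///./thesis_helper.db
```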
If you use the local Ollama provider (the default), pull the model and start the server:

```
ollama pull llama3.2
ollama serve
```

Start the backend from the `backend/` directory:

```
cd backend
python -m uvicorn app.simple_main:app --reload --host 0.0.0.0 --port 8000
```

In a second terminal, start the frontend:

```
cd frontend
npm run dev
```

Open the app at `http://localhost:3000`.
- `AI_PROVIDER=ollama` is the default local-first setup.
- If Ollama is not running, cloud generation via Gemini requires a valid `GEMINI_API_KEY`.
- The integrated Research Assistant can use arXiv and Wikipedia without Google search keys, but Google Custom Search features require `GOOGLE_SEARCH_API_KEY` and `GOOGLE_SEARCH_ENGINE_ID`.
- Notion and Gmail features are optional, but the related buttons and actions require valid credentials.
- The default SQLite path is resolved relative to the backend working directory, so start the API from `backend/` if you use the default config.
- If the backend complains about `DEBUG`, set it to `true` or `false` in `.env` instead of values like `release`.
- If timeline generation fails with Ollama, confirm `ollama serve` is running and `llama3.2` is installed.
- If the Research Assistant is shown as unavailable, confirm the `research_agent/` submodule was initialized and reinstall backend dependencies from `requirements.txt`.
- If Gemini is selected for the Research Assistant, set a valid `GEMINI_API_KEY`; otherwise use Ollama.
- If Notion actions fail, verify the integration token and that the integration has access to the target workspace.
- If emails fail, use a Gmail app password instead of a normal account password.
Additional project context and presentation material live in `docs/`, `Thesis Helper - TK Team.pdf`, and `Thesis Helper Agent.pdf`.
The repository already contains the core end-to-end experience:
- brainstorming
- integrated research assistant
- questionnaire-driven planning
- timeline generation
- Notion sync
- email progress summaries
- task execution workspace with academic tools
The README screenshots were refreshed from the latest project report and presentation assets in this repository.







