Note: This is an experimental app.
InterviewAlly is an AI-powered interview assistant. It uses Retrieval-Augmented Generation (RAG) to ingest an interview handbook and answer questions contextually using Groq LLMs and local embeddings.
- RAG Architecture: Retrieves context from a local markdown knowledge base.
- LLM Provider: Uses Groq for high-speed inference.
- Embeddings: Uses local HuggingFace Transformers (`all-MiniLM-L6-v2`) to avoid external embedding costs.
- Interface: Includes a simple web UI and a REST API.
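
The retrieval step above can be sketched without any dependencies: rank knowledge-base chunks by cosine similarity between their embeddings and the query embedding. In the real app the vectors come from the local `all-MiniLM-L6-v2` model and live in LangChain's in-memory store; the toy vectors and the `retrieve` helper below are illustrative assumptions, not the app's actual code.

```typescript
// Dependency-free sketch of RAG retrieval: score each chunk by cosine
// similarity against the query embedding and keep the top k.

interface Chunk {
  text: string;
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k chunks most similar to the query embedding.
function retrieve(queryEmbedding: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) =>
      cosineSimilarity(queryEmbedding, y.embedding) -
      cosineSimilarity(queryEmbedding, x.embedding))
    .slice(0, k);
}

// Toy example: hand-made vectors, one chunk clearly closer to the query.
const chunks: Chunk[] = [
  { text: "Behavioral questions", embedding: [1, 0, 0] },
  { text: "System design questions", embedding: [0, 1, 0] },
];
console.log(retrieve([0.9, 0.1, 0], chunks, 1)[0].text); // "Behavioral questions"
```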
- Runtime: Node.js, TypeScript
- Web Framework: Express
- AI Framework: LangChain
- LLM: Groq
- Vector Store: In-Memory
- Embeddings: HuggingFace Transformers
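
The "augmented generation" half of this stack stitches the retrieved handbook excerpts and the user's question into a single prompt for the Groq LLM. A minimal sketch of that step is below; the `buildPrompt` helper and its template are assumptions for illustration, not the app's actual prompt.

```typescript
// Hypothetical prompt-assembly step: number the retrieved chunks and
// place them ahead of the question so the LLM answers from context only.
function buildPrompt(contextChunks: string[], question: string): string {
  const context = contextChunks
    .map((chunk, i) => `[${i + 1}] ${chunk}`)
    .join("\n");
  return [
    "Answer the question using only the context below.",
    "Context:",
    context,
    `Question: ${question}`,
  ].join("\n\n");
}

const prompt = buildPrompt(
  ["Use the STAR method for behavioral answers."],
  "How should I answer behavioral questions?"
);
console.log(prompt);
```

The resulting string is what would be sent to the LLM; keeping the template in one pure function makes it easy to tweak and unit-test.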
- Node.js (v18 or higher recommended)
- A Groq API Key
- Install dependencies:

  ```bash
  npm install
  ```

- Prepare the knowledge base: ensure your source material is located at `data/interview-handbook.md`.

- Environment configuration: create a `.env` file in the root directory with the following variables:

  ```
  LLM_API_KEY=your_groq_api_key
  LLM_MODEL=llama3-8b-8192
  ```

- Start the server:

  ```bash
  npx ts-node src/server.ts
  ```

  The server will start at http://localhost:3000.
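
Since the server depends on the environment variables above, it can help to fail fast when they are missing. The sketch below is one way to do that; the `loadLlmConfig` helper is an assumption for illustration, and the app may handle configuration differently.

```typescript
// Hypothetical startup check: LLM_API_KEY is required, LLM_MODEL falls
// back to a default. Takes the env object as a parameter for testability.
interface LlmConfig {
  apiKey: string;
  model: string;
}

function loadLlmConfig(env: Record<string, string | undefined>): LlmConfig {
  const apiKey = env.LLM_API_KEY;
  if (!apiKey) {
    throw new Error("LLM_API_KEY is not set; add it to your .env file");
  }
  return { apiKey, model: env.LLM_MODEL ?? "llama3-8b-8192" };
}

const config = loadLlmConfig({ LLM_API_KEY: "gsk_example" });
console.log(config.model); // "llama3-8b-8192"
```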
