# InterviewAlly

> **Note:** This is an experimental app.

InterviewAlly is an AI-powered interview assistant. It uses Retrieval-Augmented Generation (RAG) to ingest an interview handbook and answer questions contextually using Groq LLMs and local embeddings.

*(App screenshot)*

## Features

- **RAG Architecture:** Retrieves context from a local Markdown knowledge base.
- **LLM Provider:** Uses Groq for high-speed inference.
- **Embeddings:** Uses local HuggingFace Transformers (`all-MiniLM-L6-v2`) to avoid external embedding costs.
- **Interface:** Includes a simple web UI and a REST API.
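The retrieval step behind this RAG architecture can be sketched in TypeScript. This is a minimal illustration only: it uses a toy bag-of-words vector in place of the real `all-MiniLM-L6-v2` embeddings, a plain array in place of the in-memory vector store, and it stops before the LLM call. All function names here are hypothetical, not taken from the project's source.

```typescript
// Toy retrieve-then-generate flow: chunk the handbook, "embed" each
// chunk, rank chunks by cosine similarity to the question, and build
// a context-stuffed prompt for the LLM.

type Chunk = { text: string; vector: Map<string, number> };

// Bag-of-words term counts as a stand-in for a real embedding model.
function embed(text: string): Map<string, number> {
  const v = new Map<string, number>();
  for (const w of text.toLowerCase().match(/[a-z]+/g) ?? []) {
    v.set(w, (v.get(w) ?? 0) + 1);
  }
  return v;
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0, na = 0, nb = 0;
  for (const [w, x] of a) { dot += x * (b.get(w) ?? 0); na += x * x; }
  for (const [, y] of b) nb += y * y;
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Split the Markdown handbook on blank lines, one chunk per paragraph.
function buildIndex(markdown: string): Chunk[] {
  return markdown.split(/\n\s*\n/)
    .filter((p) => p.trim().length > 0)
    .map((text) => ({ text, vector: embed(text) }));
}

// Return the k chunks most similar to the question.
function retrieve(index: Chunk[], question: string, k = 2): string[] {
  const q = embed(question);
  return [...index]
    .sort((a, b) => cosine(q, b.vector) - cosine(q, a.vector))
    .slice(0, k)
    .map((c) => c.text);
}

// The retrieved chunks are then inlined into the LLM prompt.
function buildPrompt(context: string[], question: string): string {
  return `Answer using this context:\n${context.join("\n---\n")}\n\nQuestion: ${question}`;
}
```

In the real app, LangChain's vector-store and embeddings abstractions replace `embed`, `cosine`, and `retrieve`, and Groq answers the prompt.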

## Technologies

- **Runtime:** Node.js, TypeScript
- **Web Framework:** Express
- **AI Framework:** LangChain
- **LLM:** Groq
- **Vector Store:** In-memory
- **Embeddings:** HuggingFace Transformers

## Prerequisites

- Node.js (v18 or higher recommended)
- A Groq API key

## Installation

1. **Install dependencies:**

   ```bash
   npm install
   ```

2. **Prepare the knowledge base:** make sure your source material is located at:

   ```
   data/interview-handbook.md
   ```

3. **Configure the environment:** create a `.env` file in the root directory with the following variables:

   ```
   LLM_API_KEY=your_groq_api_key
   LLM_MODEL=llama3-8b-8192
   ```
## Running the Application

Start the server:

```bash
npx ts-node src/server.ts
```

The server will start at `http://localhost:3000`.
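Once the server is up, the REST API can be queried over HTTP. The endpoint path (`/api/ask`) and payload shape below are assumptions, since the README does not document the routes; check `src/server.ts` for the actual ones.

```typescript
// Hypothetical client for the InterviewAlly REST API.
// NOTE: the /api/ask route and { question } payload are assumptions,
// not taken from the project's documentation.
const BASE_URL = "http://localhost:3000";

function buildAskRequest(question: string): { url: string; body: string } {
  return { url: `${BASE_URL}/api/ask`, body: JSON.stringify({ question }) };
}

// Uses the global fetch available in Node.js 18+.
async function ask(question: string): Promise<string> {
  const { url, body } = buildAskRequest(question);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = (await res.json()) as { answer?: string };
  return data.answer ?? "";
}
```

Adjust the route and response field names to whatever `src/server.ts` actually exposes.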
