
Brain-Buddy-AI

Brain-Buddy-AI is a full-stack AI chatbot application built with Next.js, leveraging the powerful Groq API with Llama 3.1 to deliver intelligent, lightning-fast conversations.

🚀 Features

  • Ultra-fast AI responses (1-2 seconds) powered by Groq
  • 🤖 AI-powered chatbot with natural language understanding
  • 💬 Multiple conversation modes: Chat, Email Assistant, Daily Planner, Fun Facts
  • 🎨 Seamless Next.js frontend with React 19
  • 🔌 Groq API integration using Axios
  • 🎯 Tailwind CSS for sleek UI design
  • ⚡ Optimized performance with Next.js 15

📦 Tech Stack

  • Framework: Next.js 15
  • Frontend: React 19, Tailwind CSS
  • API Handling: Axios
  • AI Model: Llama 3.1 8B Instant (via Groq API)
  • Deployment: Vercel
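
The Axios + Groq pairing in the stack above comes down to a single POST against Groq's OpenAI-compatible chat completions endpoint. Here is a minimal sketch (the helper name and shape are illustrative assumptions, not this repo's actual code):

```javascript
// Illustrative only: build the request a server-side route would send to Groq.
// buildGroqRequest is a hypothetical helper, not part of this repository.
function buildGroqRequest(userMessage, apiKey) {
  return {
    url: "https://api.groq.com/openai/v1/chat/completions",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    data: {
      model: "llama-3.1-8b-instant",
      messages: [{ role: "user", content: userMessage }],
    },
  };
}

// A Next.js API route could then send it with Axios, e.g.:
// const { url, headers, data } = buildGroqRequest(message, process.env.GROQ_API_KEY);
// const res = await axios.post(url, data, { headers });
// const reply = res.data.choices[0].message.content;
```

Because the endpoint is OpenAI-compatible, the same request shape works with any OpenAI-style client, not just Axios.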

🛠️ Setup & Installation

1️⃣ Clone the repository

git clone https://github.com/Sanjoy-droid/Brain-Buddy-AI
cd Brain-Buddy-AI

2️⃣ Install dependencies

npm install

3️⃣ Set up environment variables

Create a .env.local file in the root directory and add the following:

GROQ_API_KEY=your_groq_api_key_here

Get your free Groq API key:

  1. Visit console.groq.com
  2. Sign up for a free account
  3. Generate an API key
  4. Copy and paste it into your .env.local file
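
Next.js loads .env.local automatically, and variables without the NEXT_PUBLIC_ prefix are only visible to server-side code. A small guard (the helper name is an assumption, purely for illustration) makes a missing key fail loudly instead of producing confusing API errors:

```javascript
// Hypothetical helper, not this repo's actual code: read the Groq key
// server-side and fail fast if it was never configured.
function getGroqKey(env = process.env) {
  const key = env.GROQ_API_KEY;
  if (!key) {
    throw new Error("GROQ_API_KEY is not set. Add it to .env.local");
  }
  return key;
}
```

Keep the key out of client components; without the NEXT_PUBLIC_ prefix, Next.js never ships it to the browser.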

4️⃣ Run the development server

npm run dev

Open http://localhost:3000 in your browser to access the app.

🏗️ Build & Deploy

To create a production build:

npm run build

To start the production server:

npm run start

🚀 Deploy on Vercel

The easiest way to deploy Brain-Buddy-AI is via Vercel:

  1. Push your code to GitHub
  2. Connect your repository to Vercel
  3. Add the GROQ_API_KEY environment variable in Vercel's project settings
  4. Deploy!

For more details, check out Next.js Deployment Documentation.

🎯 Features Overview

💬 Conversation Mode

Engage in natural conversations with AI assistance.

📧 Email Assistant

Generate professional emails with AI help.

📅 Daily Planner

Plan your day with AI-powered suggestions.

🎲 Fun Facts

Discover interesting facts powered by AI.
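
One common way to implement modes like these (an assumption about the design, not necessarily how Brain-Buddy-AI does it) is to map each mode to a system prompt that is prepended to the user's message. The prompt texts below are illustrative:

```javascript
// Hypothetical mode-to-system-prompt mapping; not this repo's actual code.
const MODE_PROMPTS = {
  chat: "You are a friendly, helpful assistant.",
  email: "You draft clear, professional emails from the user's notes.",
  planner: "You turn the user's goals into a realistic daily schedule.",
  funfacts: "You share one surprising, verifiable fact per reply.",
};

// Build the messages array for the chat completions request,
// falling back to plain chat for unknown modes.
function buildMessages(mode, userMessage) {
  const system = MODE_PROMPTS[mode] ?? MODE_PROMPTS.chat;
  return [
    { role: "system", content: system },
    { role: "user", content: userMessage },
  ];
}
```

This keeps all four modes on a single API route; only the system prompt changes per mode.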

🤖 Why Groq?

  • Lightning Fast: 1-2 second response times, noticeably faster than typical hosted LLM APIs
  • Free Tier: Generous free usage limits
  • High Quality: Powered by Meta's Llama 3.1 model
  • Reliable: Perfect for production deployments on Vercel

🤝 Contributing

Feel free to contribute by opening an issue or submitting a pull request!

📜 License

MIT License


Made with ❤️ using Next.js and Groq
