JAN AI is a Visual Studio Code extension that uses DeepSeek AI to run AI models locally. It currently supports the DeepSeek 7B and DeepSeek 1.5B models. It is an open-source project under active development, and contributions are always welcome!
It's a GitHub Copilot alternative that runs 100% offline on your device.
- Maximum Privacy & Security – Your data stays on your device—no external API calls or data leaks.
- Faster Response Times – Get instant AI-powered assistance without network latency.
- Full Offline Functionality – Run AI models anytime, even without internet access.
Currently, JAN AI supports:
- DeepSeek 1.5B – A lightweight yet powerful AI model optimized for local execution.
- DeepSeek 7B – A more advanced model offering deeper reasoning and enhanced contextual understanding.
- Future Updates – More AI models, including Mistral and LLaMA, will be supported soon!
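The README doesn't document how the extension talks to these models, but local runners commonly expose an Ollama-style HTTP API. As a hedged sketch (the endpoint, port, and model tags below are assumptions, not confirmed here), you could check which DeepSeek models are installed locally like this:

```python
import json
import urllib.request

# Assumption: a local Ollama-style server lists installed models at
# /api/tags on port 11434. The URL and model names are illustrative.
TAGS_URL = "http://localhost:11434/api/tags"

def deepseek_models(tags: dict) -> list:
    """Filter a /api/tags-style response down to DeepSeek model names."""
    return [
        m["name"]
        for m in tags.get("models", [])
        if "deepseek" in m["name"].lower()
    ]

def list_local_deepseek() -> list:
    """Fetch locally installed models and keep only the DeepSeek ones."""
    with urllib.request.urlopen(TAGS_URL) as resp:
        return deepseek_models(json.loads(resp.read()))

# Example (requires a local model server to be running):
# print(list_local_deepseek())
```

The filtering logic is separated from the network call so it can be reused against any model listing.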
JAN AI works effortlessly within your VS Code environment, offering:
- Intelligent Code Suggestions – Auto-complete and generate code snippets based on your prompts.
- AI-Powered Debugging – Get AI-driven insights to debug and optimize your code efficiently.
- Context-Aware Chat – Ask programming questions, get documentation explanations, and receive AI-powered guidance inside VS Code.
JAN AI is optimized for performance, ensuring even large AI models run smoothly without hogging system resources.
- No Data Leaves Your Device – Zero risk of exposing sensitive code or data to third-party services.
- Full Control Over AI Models – Customize and fine-tune models as needed without vendor restrictions.
- Open VS Code
- Go to Extensions (Ctrl + Shift + X)
- Search for "JAN AI"
- Click Install
🔹 Upon installation, a setup guide will open, walking you through:
- Downloading and configuring the required models.
- Setting up the local environment for smooth execution.
- Exploring customization options for an enhanced experience.
Planned future updates include:
- Support for additional AI models (e.g., Mistral, LLaMA, and more).
- Custom model fine-tuning for personalized AI assistance.
- Performance optimizations to handle even larger models efficiently.
JAN AI is the perfect extension for developers who want powerful AI capabilities without compromising privacy, speed, or security. Whether you're coding, debugging, or seeking AI-powered insights, this extension brings the future of local AI directly to your fingertips.
Apache 2.0 - Because sharing is caring.
🔗 Download & Get Started Today!
📢 Need Help? Open an issue on GitHub!