📌 QuanAgent - A customized fork of Dify for quantitative analysis and AI-powered workflows

QuanAgent Repository · Self-hosting Guide · Documentation



QuanAgent is a customized fork of Dify, an open-source platform for developing LLM applications. It maintains all the powerful features of Dify including agentic AI workflows, RAG pipelines, agent capabilities, model management, and observability features—with custom branding and configurations for quantitative analysis use cases.

Quick start

Before installing QuanAgent, make sure your machine meets the following minimum system requirements:

  • CPU >= 2 cores
  • RAM >= 4 GiB

🚀 Production Deployment

For production deployment with Nginx reverse proxy and SSL support, please refer to our comprehensive guide:

📖 Production Deployment Guide

This guide covers:

  • Docker + Docker Compose setup
  • Nginx reverse proxy configuration
  • SSL certificate with Let's Encrypt
  • Domain configuration
  • Performance optimization
  • Common troubleshooting
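As a minimal sketch of the reverse-proxy step, the server block below forwards traffic to the Docker stack. The domain name and the upstream port are placeholders (it assumes the bundled Nginx was exposed on host port 8080, e.g. via EXPOSE_NGINX_PORT=8080 in .env) — consult the Production Deployment Guide for the authoritative configuration, including the SSL directives:

```nginx
# Hypothetical reverse-proxy config; quanagent.example.com and the
# upstream port 8080 are placeholders -- adjust for your deployment.
server {
    listen 80;
    server_name quanagent.example.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```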

🛠️ Local Development

The easiest way to start the QuanAgent server locally is through Docker Compose:

cd QuanAgent/docker
cp .env.example .env
docker compose up -d

After running, access the dashboard at http://localhost/install and start the initialization process.


Key features

1. Workflow: Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.

2. Comprehensive model support: Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found here.


3. Prompt IDE: Intuitive interface for crafting prompts, comparing model performance, and adding features such as text-to-speech to a chat-based app.

4. RAG Pipeline: Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-the-box support for text extraction from PDFs, PPTs, and other common document formats.

5. Agent capabilities: You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. The platform provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion and WolframAlpha.

6. LLMOps: Monitor and analyze application logs and performance over time. You can continuously improve prompts, datasets, and models based on production data and annotations.

7. Backend-as-a-Service: All offerings come with corresponding APIs, so you can effortlessly integrate QuanAgent into your own business logic.
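To illustrate the Backend-as-a-Service point, the sketch below assembles a call to a chat app's API. The endpoint path, header names, and payload fields follow Dify's published API shape, but the base URL and API key are placeholders — verify everything against the API panel of your own app:

```python
# Hypothetical sketch of calling a QuanAgent (Dify-style) app API.
# API_BASE and the "app-xxxx" key are placeholders; confirm the
# exact endpoint and fields in your app's API documentation.
import json

API_BASE = "http://localhost/v1"  # placeholder base URL

def build_chat_request(api_key: str, query: str, user: str,
                       conversation_id: str = "") -> dict:
    """Assemble the URL, headers, and JSON body for POST /chat-messages."""
    return {
        "url": f"{API_BASE}/chat-messages",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "inputs": {},
            "query": query,
            "response_mode": "blocking",
            "conversation_id": conversation_id,
            "user": user,
        }),
    }

req = build_chat_request("app-xxxx", "Summarize today's volatility.", "analyst-1")
# then e.g.: requests.post(req["url"], headers=req["headers"], data=req["body"])
```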

Using QuanAgent

  • Self-hosting QuanAgent
    Quickly get QuanAgent running in your environment with this starter guide. Use the Dify documentation for further references and more in-depth instructions, as QuanAgent is based on Dify.

  • About Dify
    QuanAgent is a fork of Dify, an excellent open-source LLM application platform. For the original Dify Cloud service and enterprise features, please visit dify.ai.

Staying ahead

Star QuanAgent on GitHub and be instantly notified of new releases. Also consider starring the original Dify project to support the upstream development.


Advanced Setup

Custom configurations

If you need to customize the configuration, please refer to the comments in our .env.example file and update the corresponding values in your .env file. Additionally, you might need to make adjustments to the docker-compose.yaml file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run docker compose up -d. You can find the full list of available environment variables here.
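As an illustrative example of such overrides (the variable names are taken from Dify's .env.example, and the values are placeholders — confirm both against the copy in your checkout):

```shell
# Illustrative .env overrides only; confirm variable names against
# the .env.example shipped in QuanAgent/docker.
SECRET_KEY=your-generated-secret        # session signing key
EXPOSE_NGINX_PORT=8080                  # host port for the bundled Nginx
```

After editing, re-run docker compose up -d so the containers pick up the new values.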

Metrics Monitoring with Grafana

Import the dashboard into Grafana, using QuanAgent's PostgreSQL database as the data source, to monitor metrics at the granularity of apps, tenants, messages, and more.

Deployment with Kubernetes

If you'd like to configure a highly-available setup, there are community-contributed Helm Charts and YAML files which allow QuanAgent (based on Dify) to be deployed on Kubernetes.

Using Terraform for Deployment

Deploy Dify to a cloud platform with a single click using Terraform

Azure Global
Google Cloud

Using AWS CDK for Deployment

Deploy Dify to AWS with CDK

AWS

Using Alibaba Cloud Computing Nest

Quickly deploy Dify to Alibaba Cloud with Alibaba Cloud Computing Nest

Using Alibaba Cloud Data Management

One-click deployment of Dify to Alibaba Cloud with Alibaba Cloud Data Management

Deploy to AKS with Azure DevOps Pipeline

One-click deployment of Dify to AKS with the Azure DevOps Pipeline Helm Chart by @LeoZhang

Contributing

For those who'd like to contribute code, see our Contribution Guide. At the same time, please consider supporting Dify by sharing it on social media and at events and conferences.

We are looking for contributors to help translate Dify into languages other than Mandarin or English. If you are interested in helping, please see the i18n README for more information, and leave us a comment in the global-users channel of our Discord Community Server.

Community & contact

  • GitHub Discussion. Best for: sharing feedback and asking questions.
  • GitHub Issues. Best for: bugs you encounter using Dify.AI, and feature proposals. See our Contribution Guide.
  • Discord. Best for: sharing your applications and hanging out with the community.
  • X(Twitter). Best for: sharing your applications and hanging out with the community.

Contributors

Star history


Security disclosure

To protect your privacy, please avoid posting security issues on GitHub. Instead, report issues to security@dify.ai, and our team will respond with a detailed answer.

License

This repository is licensed under the Dify Open Source License, based on Apache 2.0 with additional conditions.
