PsyLite: A lightweight mental-support AI agent based on InternLM2.5-7B-Chat with appropriately timed crosstalk
- [2025/05] Finished developing the pipelines for PsyLite! What's new?
- [2025/04] Finished training the models internlm2.5_7b_distill and internlm2.5_7b_distill_orpo.
A large-model application for mild psychological counseling, developed based on internlm2.5-7b-chat, with low hardware requirements and deep-thinking ability.
Condition RAG: determine whether crosstalk is suitable for the current user (to liven up the atmosphere, narrow the distance between user and agent, etc.):
- If it is suitable, RAG retrieves the crosstalk corpus and provides it to the model while it generates the answer.
- If it is not suitable and the conversation is not dangerous, skip RAG entirely.
- If it is not suitable and the conversation is dangerous, answer with preset phrases to stop the dangerous conversation and suggest the user seek professional help.

This way, the corpus is retrieved only in appropriate situations during psychological counseling, providing crosstalk segments that improve the user's experience.
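The three-way routing above can be sketched as follows. This is a minimal illustration only: the keyword heuristics and the preset safety reply are hypothetical stand-ins (PsyLite's actual suitability/danger judgment is made by the model itself):

```python
def route(user_message: str) -> str:
    """Decide how to handle a turn: 'rag' (retrieve crosstalk corpus),
    'skip' (answer directly), or 'preset' (safety reply)."""
    # Hypothetical keyword heuristics; PsyLite uses a model-based judgment.
    DANGER = {"self-harm", "suicide"}
    LIGHT_MOOD = {"joke", "bored", "cheer me up"}
    text = user_message.lower()
    if any(k in text for k in DANGER):
        return "preset"   # dangerous: answer with preset safety phrases
    if any(k in text for k in LIGHT_MOOD):
        return "rag"      # suitable: retrieve a crosstalk segment
    return "skip"         # not suitable, not dangerous: skip RAG

def answer(user_message: str) -> str:
    decision = route(user_message)
    if decision == "preset":
        # Hypothetical preset phrase; the real one is defined in PsyLite.py.
        return ("I'm concerned about your safety. "
                "Please consider reaching out to a professional counselor.")
    if decision == "rag":
        segment = "<retrieved crosstalk segment>"  # placeholder for the RAG lookup
        return f"[model answer, enriched with: {segment}]"
    return "[model answer without retrieval]"
```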
| Platform | Model |
|---|---|
| Hugging Face | internlm2.5_7b_distill |
| Hugging Face | internlm2.5_7b_distill_orpo |
| Ollama | internlm2.5_7b_distill_q4_k_m |
| Ollama | internlm2.5_7b_distill_orpo_q4_k_m |
Welcome to star ⭐, open PRs, and file issues!
- Install Ollama
- Install and configure Open WebUI
- Install Pipelines and connect it to Open WebUI
- Import PsyLite.py into Pipelines and configure its Valves parameters and the RAG file path
- Have fun!
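A pipeline's Valves are its user-configurable parameters. The sketch below shows only the general shape of such a configuration: the field names (`OLLAMA_BASE_URL`, `MODEL_ID`, `RAG_FILE_PATH`) are illustrative assumptions, not PsyLite's actual valve names (check PsyLite.py for those), and real Open WebUI pipelines declare `Valves` as a pydantic `BaseModel` rather than a dataclass:

```python
from dataclasses import dataclass

@dataclass
class Valves:
    # Illustrative valve parameters; the real names live in PsyLite.py.
    OLLAMA_BASE_URL: str = "http://localhost:11434"
    MODEL_ID: str = "Juneup/internlm2.5_7b_distill:q4_k_m"
    RAG_FILE_PATH: str = "./crosstalk_corpus.json"  # path to the crosstalk corpus

class Pipeline:
    def __init__(self):
        self.name = "PsyLite"
        self.valves = Valves()

    def pipe(self, user_message: str, model_id: str, messages: list, body: dict):
        # The real pipeline runs condition-RAG and calls the model here.
        return f"[{self.name} would answer via {self.valves.MODEL_ID}]"
```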
Tip

To get better performance, we recommend setting the system prompt; see here.
The base model of internlm2.5_7b_distill and internlm2.5_7b_distill_orpo is internlm2_5-7b-chat.
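Setting a system prompt amounts to prepending a system-role message to the chat history before calling the model. A minimal sketch in the OpenAI/Ollama chat-message format (the prompt text here is a hypothetical placeholder, not PsyLite's actual system prompt, which is linked above):

```python
# Hypothetical example prompt; use the recommended one linked above instead.
SYSTEM_PROMPT = (
    "You are PsyLite, a gentle psychological-support assistant. "
    "Respond with empathy, and weave in crosstalk only when appropriate."
)

def build_messages(history: list, user_message: str) -> list:
    """Prepend the system prompt and append the new user turn."""
    return ([{"role": "system", "content": SYSTEM_PROMPT}]
            + history
            + [{"role": "user", "content": user_message}])
```

The resulting list can then be sent to Ollama's chat endpoint or passed through the pipeline.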
Architecture Diagram
model download

```shell
git lfs install
git clone https://huggingface.co/juneup/internlm2.5_7b_distill
```

If you want to clone without large files - just their pointers:

```shell
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/juneup/internlm2.5_7b_distill
```

Ollama

```shell
ollama run Juneup/internlm2.5_7b_distill:q4_k_m
```

Dataset: juneup/psy-mix-gen-distill-13k
Architecture Diagram
model download

```shell
git lfs install
git clone https://huggingface.co/juneup/internlm2.5_7b_distill_orpo
```

If you want to clone without large files - just their pointers:

```shell
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/juneup/internlm2.5_7b_distill_orpo
```

Ollama

```shell
ollama run Juneup/internlm2.5_7b_distill:orpo_q4_k_m
```

Tip

INSTALL PIPELINES BEFORE USING PsyLite!
| Team | Description |
|---|---|
| Shanghai Artificial Intelligence Laboratory | Thanks for the technical and platform support |
| Xtuner | Thanks for the training toolkits |
| OpenCompass | Thanks for the evaluation toolkits |
| llama.cpp | Thanks for the model weight file converter |
| Open-webui | Thanks for the excellent deployment platform |
| Pipelines | Thanks for the custom-workflow framework |
```bibtex
@misc{ding2025psylitetechnicalreport,
    title={PsyLite Technical Report},
    author={Fangjun Ding and Renyu Zhang and Xinyu Feng and Chengye Xie and Zheng Zhang and Yanting Zhang},
    year={2025},
    eprint={2506.21536},
    archivePrefix={arXiv},
    primaryClass={cs.AI},
    url={https://arxiv.org/abs/2506.21536},
}
```