[Improvement] max_token setting improvement #2919

@SimengBian

Description

Improvement Description

Currently, when a model is added, max_token is set to 4096 by default, but this value does not actually enforce any limit. Since modern models generally have long context windows, consider removing the max_token configuration option entirely and instead fetching the model's real limit and storing it in the database. (Right now a single conversation can show token usage at 100% even though the conversation can still continue, which is a poor experience.)
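A minimal sketch of the suggested change (the function names, the lookup table, and its values are illustrative assumptions, not this project's actual code): resolve the model's real context window when the model is registered instead of hardcoding 4096, and compute the displayed usage percentage against that resolved value.

```python
# Hypothetical sketch -- names and values are assumptions, not the project's code.
# Instead of hardcoding max_token = 4096 at model registration time, look up the
# model's real context window (in practice this could come from the provider's
# model-list API) and persist that value to the database.

# Illustrative table of known context windows.
KNOWN_CONTEXT_WINDOWS = {
    "gpt-3.5-turbo": 16385,
    "gpt-4o": 128000,
}

DEFAULT_MAX_TOKENS = 4096  # the hardcoded default this issue complains about


def resolve_max_tokens(model_name: str) -> int:
    """Return the model's real context window, falling back to the old default."""
    return KNOWN_CONTEXT_WINDOWS.get(model_name, DEFAULT_MAX_TOKENS)


def usage_percent(tokens_used: int, model_name: str) -> float:
    """Token-usage percentage against the resolved limit (what the UI would show)."""
    return 100.0 * tokens_used / resolve_max_tokens(model_name)
```

With the hardcoded default, a conversation that has consumed 4096 tokens displays 100% usage even on a model whose context window is far larger; resolving the real window would show roughly 3% for a 128k-context model, matching the fact that the conversation can still continue.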

Metadata

Assignees

Labels

High (Highest priority)

Projects

Status

No status

Milestone

Relationships

None yet

Development

No branches or pull requests
