@sadra-barikbin commented Jan 20, 2025

Hi there!

This PR adds support for the LGAI-EXAONE EXAONE-3.5 model to Lorax.

I tested the output against the vLLM counterpart with a LoRA adapter, and the results appeared to be identical.
Update on the above sentence: sometimes the outputs matched, and sometimes vLLM's output changed between runs.
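
For context, here is a minimal sketch of this kind of side-by-side check, assuming a Lorax server already running on localhost:8080 with the EXAONE model loaded, plus the `lorax-client` and `vllm` packages installed. The model ID, adapter path, and prompt below are placeholders, not necessarily the exact ones I tested:

```python
# Sketch: query a running Lorax server and an in-process vLLM engine
# with the same prompt + LoRA adapter, then compare the completions.
# Model/adapter IDs are placeholders.
from lorax import Client
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

MODEL_ID = "LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct"  # placeholder size
ADAPTER_PATH = "path/to/lora-adapter"              # placeholder adapter
PROMPT = "Explain rotary position embeddings in one sentence."

# Lorax side: assumes the server was launched with --model-id MODEL_ID.
lorax_client = Client("http://127.0.0.1:8080")
lorax_out = lorax_client.generate(
    PROMPT,
    adapter_id=ADAPTER_PATH,
    max_new_tokens=64,  # defaults to greedy decoding
).generated_text

# vLLM side: offline engine with LoRA enabled.
llm = LLM(model=MODEL_ID, enable_lora=True, trust_remote_code=True)
params = SamplingParams(temperature=0.0, max_tokens=64)  # greedy
vllm_out = llm.generate(
    [PROMPT], params,
    lora_request=LoRARequest("adapter", 1, ADAPTER_PATH),
)[0].outputs[0].text

print("match:", lorax_out.strip() == vllm_out.strip())
```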

`flash_exaone_modeling` is inspired by `flash_cohere_modeling`, `flash_llama_modeling`, and the custom modeling code in the model's HF repo.
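
For background, EXAONE-3.5 is essentially a Llama-style decoder (RMSNorm, RoPE, GQA, SwiGLU MLP), but the checkpoint uses its own module names, which is largely why a dedicated modeling file is needed. Below is a rough, illustrative sketch of the kind of tensor-name translation involved; the EXAONE-side names reflect my reading of the custom modeling code in the HF repo and should be verified against an actual checkpoint:

```python
# Illustrative only: map EXAONE checkpoint tensor names to their
# Llama-style equivalents. Verify the left-hand names against the repo.
import re

EXAONE_TO_LLAMA = [
    (r"^transformer\.wte\.", "model.embed_tokens."),
    (r"^transformer\.ln_f\.", "model.norm."),
    (r"^transformer\.h\.(\d+)\.ln_1\.", r"model.layers.\1.input_layernorm."),
    (r"^transformer\.h\.(\d+)\.ln_2\.", r"model.layers.\1.post_attention_layernorm."),
    (r"^transformer\.h\.(\d+)\.attn\.attention\.(q|k|v)_proj\.", r"model.layers.\1.self_attn.\2_proj."),
    (r"^transformer\.h\.(\d+)\.attn\.attention\.out_proj\.", r"model.layers.\1.self_attn.o_proj."),
    (r"^transformer\.h\.(\d+)\.mlp\.c_fc_0\.", r"model.layers.\1.mlp.gate_proj."),
    (r"^transformer\.h\.(\d+)\.mlp\.c_fc_1\.", r"model.layers.\1.mlp.up_proj."),
    (r"^transformer\.h\.(\d+)\.mlp\.c_proj\.", r"model.layers.\1.mlp.down_proj."),
]

def translate(name: str) -> str:
    """Rewrite one EXAONE tensor name into its Llama-style equivalent."""
    for pattern, repl in EXAONE_TO_LLAMA:
        new, n = re.subn(pattern, repl, name)
        if n:
            return new
    return name  # e.g. lm_head.weight passes through unchanged

print(translate("transformer.h.0.attn.attention.q_proj.weight"))
# -> model.layers.0.self_attn.q_proj.weight
```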

Fixes #743
