Commit 93dbfc8: Fix Mixtral MoE layer configuration to use moe_affine_linear
Draft

[Prototype] Add GPT-OSS converter with heterogeneous block pattern support #374
