
error in evaluation #5

@fyyyym

Description


When running `python tools/test.py projects/configs/SparseOccVLA/sparseoccvla_stage3_4d_600q_forecasting.py ckpts/stage3_4d_600q_forecasting.pth --eval_occ`, the following warning appears. Does it matter, and how can I fix it?
```
The model and loaded state dict do not match exactly

unexpected key in source state_dict: lm_head.language_model.model.layers.{0..23}.attention.rotary_emb.{inv_freq, cos_cached, sin_cached}
```

(The same three `rotary_emb` cache keys are reported for each of layers 0-23.)
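These `inv_freq` / `cos_cached` / `sin_cached` entries are cached rotary-embedding buffers that the model recomputes from its config at load time, so a mismatch limited to them is usually harmless. One way to silence the warning is to strip those keys from the state dict before loading. A minimal sketch, assuming a plain key-to-tensor state dict (the helper name and the sample keys below are illustrative, not from this repo):

```python
import re

# Keys ending in ".rotary_emb.inv_freq/cos_cached/sin_cached" are cached
# buffers recomputed at load time; dropping them does not change the model.
ROTARY_CACHE = re.compile(r"\.rotary_emb\.(inv_freq|cos_cached|sin_cached)$")

def strip_rotary_caches(state_dict):
    """Return a copy of state_dict without cached rotary-embedding buffers."""
    return {k: v for k, v in state_dict.items() if not ROTARY_CACHE.search(k)}

# Tiny stand-in state dict; a real one would come from torch.load(ckpt_path),
# possibly under a "state_dict" key depending on how the checkpoint was saved.
sd = {
    "lm_head.language_model.model.layers.0.attention.rotary_emb.inv_freq": 1,
    "lm_head.language_model.model.layers.0.attention.rotary_emb.cos_cached": 2,
    "lm_head.language_model.model.layers.0.attention.wqkv.weight": 3,
}
clean = strip_rotary_caches(sd)
print(sorted(clean))
```

After filtering, only the real weights remain, and loading the cleaned dict should no longer report those unexpected keys.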
