error converting StreamPETR to onnx #63

@rod409

Description

Following the steps to convert StreamPETR to ONNX, I get two errors. The first one I can get past.

The errors occur when converting the head of the model:

CUBLAS_WORKSPACE_CONFIG=:4096:8 python tools/pth2onnx.py projects/configs/test_speed/stream_petr_r50_704_bs2_seq_428q_nui_speed_test.py --section pts_head_memory

Traceback (most recent call last):
  File "tools/pth2onnx.py", line 302, in <module>
    main()
  File "tools/pth2onnx.py", line 284, in main
    torch.onnx.export(
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/__init__.py", line 275, in export
    return utils.export(model, args, f, export_params, verbose, training,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 88, in export
    _export(model, args, f, export_params, verbose, training, input_names, output_names,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 689, in _export
    _model_to_graph(model, args, verbose, input_names,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 463, in _model_to_graph
    graph = _optimize_graph(graph, operator_export_type,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 200, in _optimize_graph
    graph = torch._C._jit_pass_onnx(graph, operator_export_type)
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/__init__.py", line 313, in _run_symbolic_function
    return utils._run_symbolic_function(*args, **kwargs)
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 994, in _run_symbolic_function
    return symbolic_fn(g, *inputs, **attrs)
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/symbolic_opset9.py", line 1859, in slice
    raise RuntimeError("step!=1 is currently not supported")
RuntimeError: step!=1 is currently not supported
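For context, this is the kind of op that hits the limitation: any strided slice (step != 1) in the model's forward pass cannot be expressed in ONNX opset 9. The toy module below is only an illustration, not the actual op inside StreamPETR's head.

```python
import torch

class StridedSlice(torch.nn.Module):
    """Toy module (not from StreamPETR) whose forward uses a step-2 slice."""
    def forward(self, x):
        # x[:, ::2] lowers to ONNX Slice with a `steps` input, which only
        # exists from opset 10 onward; the opset 9 symbolic rejects it.
        return x[:, ::2]

m = StridedSlice()
print(m(torch.zeros(2, 8)).shape)  # torch.Size([2, 4])
```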

I got past the first error by setting the opset to 11 in pth2onnx.py. After doing so, I run into this error:

Traceback (most recent call last):
  File "tools/pth2onnx.py", line 303, in <module>
    main()
  File "tools/pth2onnx.py", line 284, in main
    torch.onnx.export(
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/__init__.py", line 275, in export
    return utils.export(model, args, f, export_params, verbose, training,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 88, in export
    _export(model, args, f, export_params, verbose, training, input_names, output_names,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 689, in _export
    _model_to_graph(model, args, verbose, input_names,
  File "/home/rod/anaconda3/envs/streampetr/lib/python3.8/site-packages/torch/onnx/utils.py", line 501, in _model_to_graph
    params_dict = torch._C._jit_pass_onnx_constant_fold(graph, params_dict,
RuntimeError: expected scalar type Long but found Float

Are there additional steps or requirements in the setup? I'm not sure what is causing the type issue here.
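For what it's worth, a common eager-mode analogue of this dtype error is a tensor used as an index (or for gather/embedding) that is float rather than int64. The snippet below is a generic illustration of that pattern and the explicit cast that fixes it, not the exact op failing inside StreamPETR:

```python
import torch

# "expected scalar type Long but found Float" often means an index
# tensor is float32 where int64 is required. This is a hypothetical
# example, not the actual StreamPETR code path.
table = torch.arange(10.0)
idx = torch.tensor([0.0, 2.0])   # float "indices"
vals = table[idx.long()]         # explicit cast to int64 fixes the dtype
print(vals)                      # tensor([0., 2.])
```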
