Is your feature request related to a problem? Please describe.
When running inference with the exported ONNX model, the meta_data.json file cannot be found. Does ONNX inference require the OpenVINO libraries? It would help to provide an inference script for the ONNX model, for example one based on ONNX Runtime.
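For reference, a minimal ONNX Runtime inference sketch might look like the following. The export path, model filename, and 256x256 input size are assumptions for illustration; adapt them to your own export directory and model input shape.

import json
from pathlib import Path

import numpy as np
import onnxruntime as ort

# Hypothetical export directory; adjust to where your model was exported.
export_path = Path("results/model_export")
session = ort.InferenceSession(str(export_path / "model.onnx"))

# Load the normalization metadata written next to the model.
with open(export_path / "meta_data.json", "r", encoding="utf-8") as metadata_file:
    meta_data = json.load(metadata_file)

# Dummy input; replace with a preprocessed image of shape (1, 3, H, W).
dummy = np.zeros((1, 3, 256, 256), dtype=np.float32)
outputs = session.run(None, {session.get_inputs()[0].name: dummy})
anomaly_map = outputs[0]  # post-process using the thresholds in meta_data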
It's solved, but there is still a small problem: why is meta_data.json only created when export_mode == "openvino"? When I export with export_mode == "onnx" alone, the file is not created. Perhaps the order should be adjusted so the metadata is written right after the ONNX export, before the OpenVINO branch, such as:
import json
from pathlib import Path

import torch
from torch import Tensor

# Export to ONNX first.
torch.onnx.export(
    model.model,
    torch.zeros((1, 3, height, width)).to(model.device),
    onnx_path,
    opset_version=12,
    input_names=["input"],
    output_names=["output"],
)

# Write meta_data.json unconditionally, before the OpenVINO branch.
with open(Path(export_path) / "meta_data.json", "w", encoding="utf-8") as metadata_file:
    meta_data = get_model_metadata(model)
    # Convert metadata from torch tensors to JSON-serializable lists.
    for key, value in meta_data.items():
        if isinstance(value, Tensor):
            meta_data[key] = value.numpy().tolist()
    json.dump(meta_data, metadata_file, ensure_ascii=False, indent=4)

if export_mode == "openvino":
    ...  # OpenVINO conversion runs only for this mode.