Add onnxruntime as wasi-nn backend #4485
Conversation
1. Adapt to the latest wasi-nn architecture and support WAMR_BUILD_WASI_EPHEMERAL_NN
force-pushed from aa88085 to 1fb25ad
force-pushed from 1e60909 to 29e4dd5
1. The type converter between wasi-nn and ONNX Runtime returns a bool instead of the converted type.
2. out_buffer_size does not hold the expected size.
3. ONNX Runtime does not need to calculate the input_tensor size.
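The first fix above can be sketched as a converter that returns the mapped type directly (raising on unsupported input) rather than returning a bool and writing through an out-parameter. This is a minimal Python sketch with hypothetical enum values, not the actual backend code:

```python
from enum import IntEnum

class WasiNnType(IntEnum):
    # Hypothetical subset of wasi-nn tensor types (values illustrative only).
    F16 = 0
    F32 = 1
    U8 = 2
    I32 = 3

class OnnxElementType(IntEnum):
    # Hypothetical subset of ONNX tensor element types (values illustrative only).
    FLOAT16 = 10
    FLOAT = 1
    UINT8 = 2
    INT32 = 6

_TYPE_MAP = {
    WasiNnType.F16: OnnxElementType.FLOAT16,
    WasiNnType.F32: OnnxElementType.FLOAT,
    WasiNnType.U8: OnnxElementType.UINT8,
    WasiNnType.I32: OnnxElementType.INT32,
}

def convert_type(t: WasiNnType) -> OnnxElementType:
    """Return the ONNX element type for a wasi-nn tensor type.

    Raising on unknown input keeps the signature honest: the caller
    gets the type or an error, never a bool plus an out-parameter.
    """
    try:
        return _TYPE_MAP[t]
    except KeyError:
        raise ValueError(f"unsupported wasi-nn tensor type: {t}")
```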
force-pushed from 0dcdcab to cd3cb6c
Using this model, I had to use a non-zero index for get_output; thus I had to fix an nn-cli bug.
I'm using this one: https://github.com/onnx/models/blob/main/validated/vision/object_detection_segmentation/ssd-mobilenetv1/model/ssd_mobilenet_v1_10.onnx
Thank you. However, this model looks the same in that regard (it has 4 outputs).
Maybe you somehow interpreted only the first (idx=0) output, which contains the bounding boxes?
The output shape is [1, 100, 4]; one bounding box, for example, looks good for my test picture (http://images.cocodataset.org/val2017/000000088462.jpg).
OK, I understood. Actually, the model has 4 outputs, as documented. Since wasi-nn doesn't have get-output-by-name, you need to use integer indexes. You were only looking at detection_boxes.
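The integer-indexed output access described above can be sketched as follows. This is a hypothetical in-memory stand-in for a wasi-nn execution context, not the actual nn-cli code; the output names and ordering follow the SSD-MobileNet model's documented outputs:

```python
import struct

# wasi-nn has no get-output-by-name: outputs sit in a fixed,
# documented order and are retrieved by integer index only.
OUTPUT_ORDER = ["detection_boxes", "detection_classes",
                "detection_scores", "num_detections"]

class FakeContext:
    """Hypothetical stand-in for a wasi-nn execution context."""

    def __init__(self, outputs: dict):
        self._outputs = [outputs[name] for name in OUTPUT_ORDER]

    def get_output(self, index: int) -> bytes:
        # Non-zero indexes must work too; the nn-cli bug mentioned
        # above effectively only handled index 0.
        return self._outputs[index]

ctx = FakeContext({
    "detection_boxes": struct.pack("<4f", 0.1, 0.2, 0.5, 0.6),
    "detection_classes": struct.pack("<f", 1.0),
    "detection_scores": struct.pack("<f", 0.93),
    "num_detections": struct.pack("<f", 1.0),
})

# detection_scores lives at index 2, not 0.
scores = struct.unpack("<f", ctx.get_output(2))
```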
OK, thank you!
@lum1n0us could you help review the code? |
remove dead code; release memory
The code looks good to me. But could you provide some documentation to guide people on how to use these backends (including a few words about installation)? Most importantly, a smoke test to verify its functionality and serve as a demo would be helpful.
I wrote some text about how to set up the backend and test it; it may help people understand how to use it.

Sure.
LGTM
* Add onnxruntime as wasi-nn backend
* remove global context
* put checks under the lock
* tensor type will not support legacy wasi-nn abi
* Manually set the imported target with name space
Steps to verify:
Generate output.bin with shape [1, 100, 4] and f32 type, whose contents match the sample's output.