[Build] WASM build of v1.20.1 with --use_xnnpack fails #23460
Comments
Apparently there was an update of the XNNPACK dependency in b94ba09, from google/XNNPACK@0da379f to google/XNNPACK@309b75c. The old version of the XNNPACK dependency still contained the `src/amalgam` sources; the new one does not.
I could make the Release build configuration compile by commenting out the amalgam source files in `xnnpack.cmake`:

```diff
 # kernels
-list(APPEND wasm_srcs ${XNNPACK_DIR}/src/amalgam/gen/scalar.c)
-list(APPEND wasm_srcs ${XNNPACK_DIR}/src/amalgam/gen/wasm.c)
+# list(APPEND wasm_srcs ${XNNPACK_DIR}/src/amalgam/gen/scalar.c)
+# list(APPEND wasm_srcs ${XNNPACK_DIR}/src/amalgam/gen/wasm.c)

 if(onnxruntime_ENABLE_WEBASSEMBLY_SIMD)
-  list(APPEND wasm_srcs ${XNNPACK_DIR}/src/amalgam/gen/wasmsimd.c)
+  # list(APPEND wasm_srcs ${XNNPACK_DIR}/src/amalgam/gen/wasmsimd.c)
   target_compile_options(XNNPACK PRIVATE "-msimd128")
 endif()
```

However, the following errors then occur when linking: […]
I could reproduce the issue in a workflow on GitHub Actions.
We did not see much perf gain from, or use of, XNNPACK. We think spending some time to optimize WASM for MLAS is the better choice, but we have not gotten to that yet.
Thank you @guschmue for the feedback! In terms of execution providers for the Web, the choices are a bit awkward: WebGL, the most stable one, is now deprecated; the WebGPU one only works on Chromium with developer flags and on Firefox Nightly so far; and WebNN is not available in most Web browsers. I was counting on XNNPACK to get better performance, especially on Firefox, which with the CPU execution provider is 2-3x slower than Chromium on the same task. Do you have any recommendation on what to head for?
WebGPU has been enabled in Chromium for some time now and is stable on all platforms (on Linux it takes a few extra steps to use WebGPU, but it works well there too).
Cool! However, to my understanding it is not yet possible to compile […]
Describe the issue

Hello 👋

I want to use ONNX Runtime in a Web application. I successfully built the static library of ONNX Runtime with SIMD and threading and linked it into my Emscripten project. The CPU execution provider works great. Now I tried to also include the XNNPACK execution provider with the `--use_xnnpack` flag to improve inference performance. However, the `src/amalgam` directory is missing from the XNNPACK distribution. `xnnpack.cmake` says: […]

I cannot find any documentation in the XNNPACK repository on how to manually generate these amalgam (?) microkernels, and I also cannot find an example in this repository. How do I build the XNNPACK execution provider for the Web?
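For reference, once a build with `--use_xnnpack` succeeds, the provider is registered through session options at runtime. The snippet below is only a minimal sketch of that registration using the ONNX Runtime C++ API; the model path and the `intra_op_num_threads` value are placeholders, not taken from this issue.

```cpp
// Minimal sketch: registering the XNNPACK execution provider (placeholders:
// "model.onnx" and the thread count; assumes a build with --use_xnnpack).
#include <onnxruntime_cxx_api.h>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "wasm-app");

  Ort::SessionOptions options;
  // Register XNNPACK ahead of the default CPU provider; operators it does not
  // support fall back to the CPU execution provider.
  options.AppendExecutionProvider("XNNPACK", {{"intra_op_num_threads", "2"}});

  Ort::Session session(env, "model.onnx", options);
  // ... create input tensors and call session.Run(...) as usual.
  return 0;
}
```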
Urgency
No response
Target platform
WASM
Build script
Error / output
Visual Studio Version
No response
GCC / Compiler Version
No response