Error: Model not initialized #1135
I tried your repository and found that the model initialization problem might be due to a missing wasm file. Transformers.js loads the wasm file from this link:

```js
ONNX_ENV.wasm.wasmPaths = `https://cdn.jsdelivr.net/npm/@huggingface/transformers@${env.version}/dist/`;
```

The wasm file your project was trying to find was missing from that location. You can temporarily fix the model initialization problem by setting `wasmPaths` to a different location. Since you already have the dependency `"onnxruntime-web": "1.21.0-dev.20241205-d27fecd3d3"`, you can point your wasm path at onnxruntime-web's CDN:

```js
env.backends.onnx.wasm!.wasmPaths =
  "https://cdn.jsdelivr.net/npm/onnxruntime-web@1.21.0-dev.20241205-d27fecd3d3/dist/";
```

Note that you must set `proxy` to `false` if you want to use a custom `wasmPaths`. Your final code would be:

```ts
export async function initializeModel(
  onProgress?: (progress: number) => void
): Promise<void> {
  try {
    env.allowLocalModels = false;
    env.allowRemoteModels = true;
    env.localModelPath = `${process.env.NEXT_PUBLIC_ASSETS_ENDPOINT}/models`;

    if (env.backends?.onnx?.wasm) {
      env.backends.onnx.wasm!.wasmPaths =
        "https://cdn.jsdelivr.net/npm/onnxruntime-web@1.21.0-dev.20241205-d27fecd3d3/dist/";
    }

    console.log("Model loading");
    state.model = await AutoModel.from_pretrained(MODEL_ID, {
      dtype: "fp32",
      // device: "webgpu",
      progress_callback: (progress: ProgressInfo) => {
        if (onProgress && progress.status === "progress") {
          onProgress(progress.progress);
        }
      },
    });
    console.log("Model loaded");

    state.processor = await AutoProcessor.from_pretrained(MODEL_ID, {});
    state.currentModelId = MODEL_ID;
  } catch (error) {
    console.error(error);
    throw new Error(
      error instanceof Error
        ? error.message
        : "Failed to initialize background removal model"
    );
  }
}
```

With this code, you can see that "Model loaded" appears in the console. However, executing the model still does not seem to work; I believe that is a separate problem.
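To make the path logic above concrete, here is a small sketch of how the jsDelivr CDN path is composed from a package name and version. `jsDelivrWasmPath` is an illustrative helper name of mine, not a real Transformers.js export:

```typescript
// Hypothetical helper mirroring how Transformers.js composes its default wasm CDN path
// (https://cdn.jsdelivr.net/npm/<package>@<version>/dist/).
function jsDelivrWasmPath(pkg: string, version: string): string {
  return `https://cdn.jsdelivr.net/npm/${pkg}@${version}/dist/`;
}

// Default path used by Transformers.js:
//   jsDelivrWasmPath("@huggingface/transformers", env.version)
// Override pointing at onnxruntime-web's CDN instead:
//   jsDelivrWasmPath("onnxruntime-web", "1.21.0-dev.20241205-d27fecd3d3")
```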
Thanks, @hunkim98, that did fix the issue. However, I am not sure why it is trying to use WASM when I have a really good GPU. The model I am trying to use (onnx-community/BiRefNet_lite) has some issues with ONNX Runtime Web; their team is working on a fix, but I'm uncertain when it will be resolved.
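On the WASM-vs-GPU question: the snippet above has `device: "webgpu"` commented out, so the runtime falls back to WASM unless WebGPU is explicitly requested. A minimal sketch of a feature check, where `pickDevice` is a hypothetical helper name of mine:

```typescript
// Hypothetical helper: request "webgpu" only when the caller confirms the browser
// exposes the WebGPU API; otherwise fall back to "wasm".
function pickDevice(hasWebGpu: boolean): "webgpu" | "wasm" {
  return hasWebGpu ? "webgpu" : "wasm";
}

// In the browser you would call it roughly as:
//   const device = pickDevice(typeof navigator !== "undefined" && "gpu" in navigator);
//   const model = await AutoModel.from_pretrained(MODEL_ID, { device, dtype: "fp32" });
```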
System Info

```json
"@huggingface/transformers": "^3.2.4",
"next": "15.1.3",
"onnxruntime-web": "1.21.0-dev.20241205-d27fecd3d3",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"sharp": "^0.33.5"
```
OS: Windows 11
Node: v20.17.0
Browser: Version 131.0.6778.205
Environment/Platform
Description
Hi,
I am trying to load a model in a Next.js app. It mostly works: the model is downloaded correctly, but it is never initialized.
In this part of the code, "Model loaded" is never printed to the console. I don't know why, but the AutoModel promise never resolves or rejects. Check this repo for the full code.
And when I still try to process the image, I get the error Error: Failed to process the image, or the custom error Model not initialized. Why? Because the promise was never resolved or rejected, so the subsequent code was never executed.
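One way to surface a promise that hangs like this is to race it against a timeout, so the load either succeeds or fails loudly instead of silently never settling. `withTimeout` is a hypothetical diagnostic helper, not part of Transformers.js:

```typescript
// Hypothetical diagnostic helper: reject with a timeout error if the wrapped
// promise (e.g. AutoModel.from_pretrained) neither resolves nor rejects in time.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  return Promise.race([
    promise,
    new Promise<T>((_, reject) =>
      setTimeout(() => reject(new Error(`Timed out after ${ms} ms`)), ms)
    ),
  ]);
}

// Usage sketch:
//   state.model = await withTimeout(AutoModel.from_pretrained(MODEL_ID, { dtype: "fp32" }), 60_000);
```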
Reproduction