
Error: Model not initialized #1135

Open

vlohar08 opened this issue Jan 3, 2025 · 2 comments

Labels
bug Something isn't working

Comments
vlohar08 commented Jan 3, 2025

System Info

"@huggingface/transformers": "^3.2.4",
"next": "15.1.3",
"onnxruntime-web": "1.21.0-dev.20241205-d27fecd3d3",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"sharp": "^0.33.5"

OS: Windows 11
Node: v20.17.0
Browser: Version 131.0.6778.205

Environment/Platform

  [x] Website/web-app
  [ ] Browser extension
  [ ] Server-side (e.g., Node.js, Deno, Bun)
  [ ] Desktop app (e.g., Electron)
  [ ] Other (e.g., VSCode extension)

Description

Hi,
I am trying to load a model in a Next.js app. It mostly works: the model downloads correctly, but it is never initialized.
In the code below, "Model loaded" is never printed to the console. I don't know why, but the AutoModel promise never resolves or rejects. Check this repo for the full code.

    console.log("Model loading");

    state.model = await AutoModel.from_pretrained(MODEL_ID, {
      dtype: "fp16",
      device: "webgpu",
      progress_callback: (progress: ProgressInfo) => {
        if (onProgress && progress.status === "progress") {
          onProgress(progress.progress);
        }
      },
    });

    console.log("Model loaded");

And when I try to process the image anyway, I get the error Error: Failed to process the image, or the custom error Model not initialized. Why? Because the promise never resolved or rejected, the code after it never ran.
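A quick way to confirm the promise truly never settles (rather than just loading slowly) is to race it against a timeout. This is only a diagnostic sketch; it assumes the same MODEL_ID and state as above, and the 60-second limit is arbitrary:

    // Diagnostic only: distinguish "still loading" from "never settles".
    const timeout = new Promise<never>((_, reject) =>
      setTimeout(() => reject(new Error("from_pretrained timed out")), 60_000)
    );

    state.model = await Promise.race([
      AutoModel.from_pretrained(MODEL_ID, { dtype: "fp16", device: "webgpu" }),
      timeout,
    ]);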

Reproduction

  1. Clone this repo
  2. Install all dependencies. (I use pnpm)
  3. Start the dev server (pnpm dev)
  4. Wait for the model to load
  5. Upload an image and click on the Remove Background button
  6. Check the console for the error
vlohar08 added the bug label Jan 3, 2025
hunkim98 commented Jan 15, 2025

I tried your repository and found that the model initialization problem might be due to a missing wasm file. By default, Transformers.js loads the wasm file from this URL:

ONNX_ENV.wasm.wasmPaths = `https://cdn.jsdelivr.net/npm/@huggingface/transformers@${env.version}/dist/`;

The file your project was trying to fetch was ort-wasm-simd-threaded.jsep.mjs, which does not exist on that CDN path. In fact, I am not sure why your package is trying to locate the jsep.mjs file rather than the jsep.wasm file.
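You can verify this with a quick HEAD request against the CDN (a sketch; the version in the URL is whatever your installed @huggingface/transformers resolves to, 3.2.4 here):

const url =
  "https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.2.4/dist/ort-wasm-simd-threaded.jsep.mjs";
const res = await fetch(url, { method: "HEAD" });
console.log(res.status); // expect 404 if the file is indeed absent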

You can temporarily fix the model initialization problem by pointing wasmPaths somewhere else. Since you already have the dependency "onnxruntime-web": "1.21.0-dev.20241205-d27fecd3d3", you can point the wasm path at onnxruntime-web's own CDN:

env.backends.onnx.wasm!.wasmPaths =
   "https://cdn.jsdelivr.net/npm/[email protected]/dist/";

You must also set proxy to false if you want to use a custom wasmPaths.
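For example (using the same env import as in the snippet below):

// Disable the wasm proxy worker when overriding wasmPaths, per the note above.
env.backends.onnx.wasm!.proxy = false;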

Your final code would be:

// Imports added for completeness; `state` and `MODEL_ID` are module-level
// values defined elsewhere in the project.
import {
  env,
  AutoModel,
  AutoProcessor,
  type ProgressInfo,
} from "@huggingface/transformers";

export async function initializeModel(
  onProgress?: (progress: number) => void
): Promise<void> {
  try {
    env.allowLocalModels = false;
    env.allowRemoteModels = true;
    env.localModelPath = `${process.env.NEXT_PUBLIC_ASSETS_ENDPOINT}/models`;

    if (env.backends?.onnx?.wasm) {
      // Work around the missing .jsep.mjs file on the transformers.js CDN
      // by loading the wasm artifacts from onnxruntime-web's own CDN.
      env.backends.onnx.wasm.wasmPaths =
        "https://cdn.jsdelivr.net/npm/[email protected]/dist/";
      // Required when overriding wasmPaths, per the note above.
      env.backends.onnx.wasm.proxy = false;
    }

    console.log("Model loading");

    state.model = await AutoModel.from_pretrained(MODEL_ID, {
      // fp16 + webgpu triggered the hang described above, so fall back
      // to fp32 on the default wasm backend for now.
      dtype: "fp32",
      // device: "webgpu",
      progress_callback: (progress: ProgressInfo) => {
        if (onProgress && progress.status === "progress") {
          onProgress(progress.progress);
        }
      },
    });

    console.log("Model loaded");

    state.processor = await AutoProcessor.from_pretrained(MODEL_ID, {});

    state.currentModelId = MODEL_ID;
  } catch (error) {
    console.error(error);
    throw new Error(
      error instanceof Error
        ? error.message
        : "Failed to initialize background removal model"
    );
  }
}

With this code, "Model loaded" does show up in the console. However, actually executing the model still does not seem to work; I believe that is a separate problem.

vlohar08 (Author) commented

Thanks, @hunkim98. It did fix the issue. However, I am not sure why it is trying to use WASM when I have a really good GPU.
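For reference, a minimal check of whether the browser even exposes WebGPU (a sketch; transformers.js may still fall back to wasm for other reasons, e.g. the model's ops not being supported on the webgpu backend):

// Cast because WebGPU types may not be in the default TS DOM lib.
const gpu = (navigator as any).gpu;
const adapter = gpu ? await gpu.requestAdapter() : null;
console.log(adapter ? "WebGPU adapter available" : "no WebGPU, wasm fallback likely");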

The model I am trying to use (onnx-community/BiRefNet_lite) has some issues with ONNX Runtime Web. Their team is working on a fix, but I'm uncertain when it will be resolved.
