
Gradio version 3.44.3 and ASGI exception #2159

Open
encoded-evolution opened this issue Jun 19, 2024 · 0 comments
encoded-evolution commented Jun 19, 2024

I installed SHARK as per the instructions and tried many different settings, none of them producing an image, even with "stock" settings. Seeing the "IMPORTANT: You are using gradio version 3.44.3, however version 4.29.0 is available, please upgrade." message, I used pip to upgrade to the latest versions specified in requirements.txt. All of the dependencies were upgraded successfully.
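
For reference, a minimal way to confirm which gradio version is actually installed in the environment SHARK runs from (just a sketch, not SHARK's own tooling), since the warning below still reports 3.44.3 after the upgrade:

```python
# Minimal sketch: print the installed gradio version to confirm the pip upgrade took effect.
# importlib.metadata is in the Python standard library (3.8+).
import importlib.metadata

try:
    print("gradio", importlib.metadata.version("gradio"))
except importlib.metadata.PackageNotFoundError:
    print("gradio is not installed in this environment")
```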

For the record:
Windows 10
(screenshot: system information)

Then I ran SHARK once again, using the default settings the UI presented to me:
(screenshot: default settings)

The advanced settings:
(screenshot: advanced settings)

I still get errors and no images are generated. Here is the trace from a run with the default specs and an automatic prompt.

What are the troubleshooting steps for this?

```
shark_tank local cache is located at C:\Users[userfolder].local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
gradio temporary image cache located at C:\Users[userfolder]\OneDrive\Desktop\Nodaishark\shark_tmp/gradio. You may change this by setting the GRADIO_TEMP_DIR environment variable.
No temporary images files to clear.
vulkan devices are available.
metal devices are not available.
cuda devices are not available.
rocm devices are available.
shark_tank local cache is located at C:\Users[userfolder].local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
local-sync devices are available.
shark_tank local cache is located at C:\Users[userfolder].local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
local-task devices are available.
shark_tank local cache is located at C:\Users[userfolder].local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
Running on local URL: http://0.0.0.0:8080
IMPORTANT: You are using gradio version 3.44.3, however version 4.29.0 is available, please upgrade.
[the warning above is repeated several more times]
shark_tank local cache is located at C:\Users[userfolder].local/shark_tank/ . You may change this by setting the --local_tank_cache= flag

To create a public link, set share=True in launch().
Found device AMD Radeon RX 6800. Using target triple rdna2-unknown-windows.
Using tuned models for stabilityai/stable-diffusion-2-1(fp16) on device vulkan://00000000-1100-0000-0000-000000000000.
saving euler_scale_model_input_1_512_512_vulkan_fp16_torch_linalg.mlir to C:\Users[userfolder]\AppData\Local\Temp
loading existing vmfb from: C:\Users[userfolder]\OneDrive\Desktop\Nodaishark\euler_scale_model_input_1_512_512_vulkan_fp16.vmfb
WARNING: [Loader Message] Code 0 : windows_read_data_files_in_registry: Registry lookup failed to get layer manifest files.
Loading module C:\Users[userfolder]\OneDrive\Desktop\Nodaishark\euler_scale_model_input_1_512_512_vulkan_fp16.vmfb...
Compiling Vulkan shaders. This may take a few minutes.
saving euler_step_1_512_512_vulkan_fp16_torch_linalg.mlir to C:\Users[userfolder]\AppData\Local\Temp
loading existing vmfb from: C:\Users[userfolder]\OneDrive\Desktop\Nodaishark\euler_step_1_512_512_vulkan_fp16.vmfb
Loading module C:\Users[userfolder]\OneDrive\Desktop\Nodaishark\euler_step_1_512_512_vulkan_fp16.vmfb...
Compiling Vulkan shaders. This may take a few minutes.
use_tuned? sharkify: True
_1_64_512_512_fp16_tuned_stable-diffusion-2-1-base
Loading module C:\Users[userfolder]\OneDrive\Desktop\Nodaishark\clip_1_64_512_512_fp16_tuned_stable-diffusion-2-1-base_vulkan.vmfb...
Compiling Vulkan shaders. This may take a few minutes.
torch\fx\node.py:263: UserWarning: Trying to prepend a node to itself. This behavior has no effect on the graph.
warnings.warn("Trying to prepend a node to itself. This behavior has no effect on the graph.")
Loading Winograd config file from C:\Users[userfolder].local/shark_tank/configs\unet_winograd_vulkan.json
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "asyncio\runners.py", line 190, in run
File "asyncio\runners.py", line 118, in run
File "asyncio\base_events.py", line 640, in run_until_complete
File "asyncio\windows_events.py", line 321, in run_forever
File "asyncio\base_events.py", line 607, in run_forever
File "asyncio\base_events.py", line 1922, in _run_once
File "asyncio\events.py", line 80, in _run
File "gradio\queueing.py", line 431, in process_events
File "gradio\queueing.py", line 388, in call_prediction
File "gradio\route_utils.py", line 219, in call_process_api
File "gradio\blocks.py", line 1437, in process_api
File "gradio\blocks.py", line 1123, in call_function
File "gradio\utils.py", line 503, in async_iteration
File "gradio\utils.py", line 496, in anext
File "anyio\to_thread.py", line 33, in run_sync
File "anyio_backends_asyncio.py", line 877, in run_sync_in_worker_thread
File "anyio_backends_asyncio.py", line 807, in run
File "gradio\utils.py", line 479, in run_sync_iterator_async
File "gradio\utils.py", line 629, in gen_wrapper
File "ui\txt2img_ui.py", line 195, in txt2img_inf
File "apps\stable_diffusion\src\pipelines\pipeline_shark_stable_diffusion_txt2img.py", line 134, in generate_images
File "apps\stable_diffusion\src\pipelines\pipeline_shark_stable_diffusion_utils.py", line 235, in produce_img_latents
File "apps\stable_diffusion\src\pipelines\pipeline_shark_stable_diffusion_utils.py", line 114, in load_unet
File "apps\stable_diffusion\src\models\model_wrappers.py", line 858, in unet
File "apps\stable_diffusion\src\models\model_wrappers.py", line 821, in unet
File "apps\stable_diffusion\src\models\model_wrappers.py", line 781, in compile_unet_variants
File "apps\stable_diffusion\src\models\model_wrappers.py", line 619, in get_unet
File "apps\stable_diffusion\src\utils\utils.py", line 167, in compile_through_fx
File "apps\stable_diffusion\src\utils\sd_annotation.py", line 272, in sd_model_annotation
File "apps\stable_diffusion\src\utils\sd_annotation.py", line 77, in load_winograd_configs
File "shark\shark_downloader.py", line 48, in download_public_file
for blob in blobs:
File "google\api_core\page_iterator.py", line 208, in _items_iter
File "google\api_core\page_iterator.py", line 244, in _page_iter
File "google\api_core\page_iterator.py", line 373, in _next_page
File "google\api_core\page_iterator.py", line 432, in get_next_page_response
File "google\cloud\storage_http.py", line 72, in api_request
File "google\api_core\retry.py", line 366, in retry_wrapped_func
File "google\api_core\retry.py", line 204, in retry_target
File "google\cloud_http_init
.py", line 494, in api_request
SystemExit: 404 GET https://storage.googleapis.com/storage/v1/b/shark_tank/o?projection=noAcl&prefix=sd_tuned%2Fconfigs&prettyPrint=false: The specified bucket does not exist.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "uvicorn\protocols\websockets\websockets_impl.py", line 247, in run_asgi
File "uvicorn\middleware\proxy_headers.py", line 84, in call
File "fastapi\applications.py", line 292, in call
File "starlette\applications.py", line 122, in call
File "starlette\middleware\errors.py", line 149, in call
File "starlette\middleware\cors.py", line 75, in call
File "starlette\middleware\exceptions.py", line 68, in call
File "fastapi\middleware\asyncexitstack.py", line 17, in call
File "starlette\routing.py", line 718, in call
File "starlette\routing.py", line 341, in handle
File "starlette\routing.py", line 82, in app
File "fastapi\routing.py", line 324, in app
File "gradio\routes.py", line 578, in join_queue
File "asyncio\tasks.py", line 639, in sleep
asyncio.exceptions.CancelledError
ERROR: Traceback (most recent call last):
File "asyncio\runners.py", line 190, in run
File "asyncio\runners.py", line 118, in run
File "asyncio\base_events.py", line 640, in run_until_complete
File "asyncio\windows_events.py", line 321, in run_forever
File "asyncio\base_events.py", line 607, in run_forever
File "asyncio\base_events.py", line 1922, in _run_once
File "asyncio\events.py", line 80, in _run
File "gradio\queueing.py", line 431, in process_events
File "gradio\queueing.py", line 388, in call_prediction
File "gradio\route_utils.py", line 219, in call_process_api
File "gradio\blocks.py", line 1437, in process_api
File "gradio\blocks.py", line 1123, in call_function
File "gradio\utils.py", line 503, in async_iteration
File "gradio\utils.py", line 496, in anext
File "anyio\to_thread.py", line 33, in run_sync
File "anyio_backends_asyncio.py", line 877, in run_sync_in_worker_thread
File "anyio_backends_asyncio.py", line 807, in run
File "gradio\utils.py", line 479, in run_sync_iterator_async
File "gradio\utils.py", line 629, in gen_wrapper
File "ui\txt2img_ui.py", line 195, in txt2img_inf
File "apps\stable_diffusion\src\pipelines\pipeline_shark_stable_diffusion_txt2img.py", line 134, in generate_images
File "apps\stable_diffusion\src\pipelines\pipeline_shark_stable_diffusion_utils.py", line 235, in produce_img_latents
File "apps\stable_diffusion\src\pipelines\pipeline_shark_stable_diffusion_utils.py", line 114, in load_unet
File "apps\stable_diffusion\src\models\model_wrappers.py", line 858, in unet
File "apps\stable_diffusion\src\models\model_wrappers.py", line 821, in unet
File "apps\stable_diffusion\src\models\model_wrappers.py", line 781, in compile_unet_variants
File "apps\stable_diffusion\src\models\model_wrappers.py", line 619, in get_unet
File "apps\stable_diffusion\src\utils\utils.py", line 167, in compile_through_fx
File "apps\stable_diffusion\src\utils\sd_annotation.py", line 272, in sd_model_annotation
File "apps\stable_diffusion\src\utils\sd_annotation.py", line 77, in load_winograd_configs
File "shark\shark_downloader.py", line 48, in download_public_file
for blob in blobs:
File "google\api_core\page_iterator.py", line 208, in _items_iter
File "google\api_core\page_iterator.py", line 244, in _page_iter
File "google\api_core\page_iterator.py", line 373, in _next_page
File "google\api_core\page_iterator.py", line 432, in get_next_page_response
File "google\cloud\storage_http.py", line 72, in api_request
File "google\api_core\retry.py", line 366, in retry_wrapped_func
File "google\api_core\retry.py", line 204, in retry_target
File "google\cloud_http_init
.py", line 494, in api_request
SystemExit: 404 GET https://storage.googleapis.com/storage/v1/b/shark_tank/o?projection=noAcl&prefix=sd_tuned%2Fconfigs&prettyPrint=false: The specified bucket does not exist.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "starlette\routing.py", line 686, in lifespan
File "uvicorn\lifespan\on.py", line 137, in receive
File "asyncio\queues.py", line 158, in get
asyncio.exceptions.CancelledError

```
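
From the trace, the final failure is a 404 when SHARK's downloader (`shark\shark_downloader.py`, `download_public_file`) lists the `shark_tank` Google Cloud Storage bucket with the prefix `sd_tuned/configs` ("The specified bucket does not exist"). A minimal sketch to try that same listing outside of SHARK, assuming the google-cloud-storage package pulled in by the requirements is available:

```python
# Minimal sketch: attempt the same bucket listing that download_public_file performs.
# The bucket name and prefix are taken from the failing URL in the traceback above.
from google.cloud import storage
from google.api_core import exceptions

client = storage.Client.create_anonymous_client()
try:
    blobs = client.list_blobs("shark_tank", prefix="sd_tuned/configs")
    for blob in blobs:
        print(blob.name)
except exceptions.NotFound as err:
    # A NotFound here mirrors the "The specified bucket does not exist." 404 in the trace.
    print(f"Bucket listing failed: {err}")
```

If this also returns a 404, the problem would seem to be on the hosting side (the tuned-config bucket being gone or renamed) rather than anything in my local setup.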
