
llama-cpp multi server support #198

Triggered via pull request (synchronize on #316) by @cdoern, October 21, 2024 19:08
Status: Cancelled
Total duration: 16m 5s
Artifacts: –

Workflow file: e2e-nvidia-t4-x1.yml
on: pull_request_target

Jobs:
- Start external EC2 runner (2m 45s)
- Stop external EC2 runner (3s)
- e2e-workflow-complete (0s)
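
The run graph follows the common pattern of starting a self-hosted EC2 runner, running the test job on it, and stopping the runner afterwards. Below is a minimal sketch of that job layout with hypothetical job names and placeholder steps; the actual e2e-nvidia-t4-x1.yml almost certainly differs in its details.

jobs:
  start-runner:
    # Provision the external EC2 runner ("Start external EC2 runner", 2m 45s above).
    runs-on: ubuntu-latest
    steps:
      - name: Launch EC2 runner
        run: echo "placeholder: launch and register a self-hosted EC2 runner"

  e2e-test:
    # The "E2E Test" job referenced in the annotations; runs on the provisioned runner.
    needs: start-runner
    runs-on: self-hosted
    steps:
      - name: Run E2E tests
        run: echo "placeholder: run the end-to-end test suite"

  stop-runner:
    # Tear the runner down even if the test job fails or the run is canceled.
    needs: [start-runner, e2e-test]
    if: always()
    runs-on: ubuntu-latest
    steps:
      - name: Terminate EC2 runner
        run: echo "placeholder: deregister and terminate the EC2 runner"

  e2e-workflow-complete:
    # Marker job usable as a single required status check for the whole workflow.
    needs: e2e-test
    runs-on: ubuntu-latest
    steps:
      - run: echo "e2e workflow complete"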

Annotations: 2 errors
- E2E Test: Canceling since a higher priority waiting request for 'E2E (NVIDIA Tesla T4 x1)-316' exists
- E2E Test: The operation was canceled.
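
Both annotations come from GitHub Actions concurrency control: a newer run in the same concurrency group (here keyed to PR #316) superseded this one, and the in-progress E2E Test job was canceled. The following is a minimal sketch of the kind of concurrency block that produces this message, assuming the workflow is named "E2E (NVIDIA Tesla T4 x1)" and groups runs per pull request; the real configuration may differ.

name: E2E (NVIDIA Tesla T4 x1)

on:
  pull_request_target:
    types: [opened, synchronize, reopened]

concurrency:
  # Groups runs per PR; "-316" in the cancellation message is the PR number.
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  # A new push to the PR cancels the run already in progress for that group.
  cancel-in-progress: true

With cancel-in-progress enabled, pushing a new commit to the PR (the synchronize event above) is enough to cancel this run in favor of the newer one.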