
vllm comps support openai API ChatCompletionRequest (#1032) #245

Triggered via push: December 13, 2024 09:56
Status: Success
Total duration: 40s
Jobs
get-build-matrix (4s)
Matrix: image-build

Annotations

3 errors and 5 warnings
Errors

image-build (llms, docker-build-gaudi): Process completed with exit code 17.
image-build (llms, docker-build-gaudi): no submodule mapping found in .gitmodules for path 'vllm-fork'
image-build (llms, docker-build-xeon): Process completed with exit code 17.

Warnings

get-build-matrix: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
get-build-matrix: The process '/usr/bin/git' failed with exit code 128
image-build (llms, docker-build-gaudi): The process '/usr/bin/git' failed with exit code 128
image-build (llms, docker-build-gaudi): Unable to clean or reset the repository. The repository will be recreated instead.
image-build (llms, docker-build-xeon): The process '/usr/bin/git' failed with exit code 128
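
Note on the submodule error: the message "no submodule mapping found in .gitmodules for path 'vllm-fork'" above is git's standard complaint when the checkout contains a gitlink entry for a path that has no matching section in the repository's .gitmodules file. A minimal sketch of the kind of entry git expects for that path is shown below; the URL is a hypothetical placeholder, not taken from this repository:

    [submodule "vllm-fork"]
        path = vllm-fork
        url = https://github.com/example-org/vllm-fork.git

If vllm-fork is not meant to be tracked as a submodule, removing the stale gitlink from the index (for example with git rm --cached vllm-fork) and committing the result is the usual alternative.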