
fix(llama.cpp): set -1 as default for max tokens #2330


Triggered via pull request April 20, 2024 14:59
@mudler
synchronize #2087
default_max
Status Success
Total duration 6m 48s

test-extra.yml

on: pull_request
tests-transformers: 5m 13s
tests-sentencetransformers: 5m 54s
tests-diffusers: 4m 5s
tests-parler-tts: 4m 34s
tests-transformers-musicgen: 6m 21s
tests-vallex: 6m 13s
tests-coqui: 5m 21s