
Releases: BerriAI/litellm

v1.60.0.dev4

03 Feb 17:10

What's Changed

Full Changelog: v1.60.0.dev2...v1.60.0.dev4

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.60.0.dev4
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
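Once the container is up, the proxy exposes an OpenAI-compatible `/chat/completions` endpoint. A minimal sketch of building a request against it, assuming the proxy is listening on localhost:4000 and a model named `gpt-3.5-turbo` has been configured (both the model name and the `sk-1234` key are placeholder assumptions, not values from this release):

```python
import json
from urllib import request

# Assumption: the proxy started by the docker command above is on localhost:4000.
PROXY_URL = "http://localhost:4000/chat/completions"

def build_chat_request(model: str, user_message: str) -> request.Request:
    """Build an OpenAI-compatible chat completion request for the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Proxy keys are sent as a Bearer token; "sk-1234" is a placeholder.
            "Authorization": "Bearer sk-1234",
        },
        method="POST",
    )

req = build_chat_request("gpt-3.5-turbo", "Hello!")
# Sending is left to the caller, e.g.:
#   with request.urlopen(req) as resp:
#       print(json.load(resp))
```

The same request shape works for any release tag below; only the image tag in the docker command changes.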

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 150.0 | 179.79463683736907 | 6.359486247668494 | 0.0 | 1900 | 0 | 123.9115270000184 | 3798.7273850000065 |
| Aggregated | Passed ✅ | 150.0 | 179.79463683736907 | 6.359486247668494 | 0.0 | 1900 | 0 | 123.9115270000184 | 3798.7273850000065 |

v1.60.0.dev2

02 Feb 02:13
8ba60bf

What's Changed

New Contributors

Full Changelog: v1.60.0...v1.60.0.dev2

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.60.0.dev2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 160.0 | 179.3387644765704 | 6.274867330705683 | 0.0 | 1878 | 0 | 134.8906900000202 | 3148.732781000035 |
| Aggregated | Passed ✅ | 160.0 | 179.3387644765704 | 6.274867330705683 | 0.0 | 1878 | 0 | 134.8906900000202 | 3148.732781000035 |

v1.60.0.dev1

02 Feb 01:13

What's Changed

New Contributors

Full Changelog: v1.60.0...v1.60.0.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.60.0.dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 170.0 | 193.07171769197802 | 6.24812141882662 | 0.0 | 1870 | 0 | 149.06627900001013 | 846.9972659999883 |
| Aggregated | Passed ✅ | 170.0 | 193.07171769197802 | 6.24812141882662 | 0.0 | 1870 | 0 | 149.06627900001013 | 846.9972659999883 |

v1.60.0

01 Feb 19:28

What's Changed

Important Changes between v1.50.xx and v1.60.0

Known Issues

🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB

New Contributors

Full Changelog: v1.59.10...v1.60.0

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.60.0
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 240.0 | 281.07272626532927 | 6.158354312051399 | 0.0 | 1843 | 0 | 215.79772499995897 | 3928.489000000013 |
| Aggregated | Passed ✅ | 240.0 | 281.07272626532927 | 6.158354312051399 | 0.0 | 1843 | 0 | 215.79772499995897 | 3928.489000000013 |

v1.59.10

30 Jan 16:47

What's Changed

Full Changelog: v1.59.9...v1.59.10

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.59.10
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 210.0 | 239.24647793068146 | 6.21745665443628 | 0.00334092243655899 | 1861 | 1 | 73.25327600000264 | 3903.3159660000083 |
| Aggregated | Passed ✅ | 210.0 | 239.24647793068146 | 6.21745665443628 | 0.00334092243655899 | 1861 | 1 | 73.25327600000264 | 3903.3159660000083 |
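This v1.59.10 run is the only one in this batch with a nonzero failure count (1 of 1861 requests). As a sanity check, the reported failures/s follows directly from the counts, since Locust derives both rates from the same run duration:

```python
# Values taken verbatim from the v1.59.10 load-test results above.
request_count = 1861
failure_count = 1
requests_per_s = 6.21745665443628
failures_per_s = 0.00334092243655899

# Fraction of requests that failed.
failure_rate = failure_count / request_count

# failures/s = (failures / duration) = (requests / duration) * (failures / requests)
implied_failures_per_s = requests_per_s * failure_rate
```

The implied value matches the reported failures/s, confirming the two columns are internally consistent.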

v1.59.8-stable

31 Jan 04:56

Full Changelog: v1.57.8-stable...v1.59.8-stable

Known Issues

🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.59.8-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 260.0 | 291.2207591958183 | 6.075260080470321 | 0.0 | 1818 | 0 | 223.10552599998346 | 3813.1267819999266 |
| Aggregated | Passed ✅ | 260.0 | 291.2207591958183 | 6.075260080470321 | 0.0 | 1818 | 0 | 223.10552599998346 | 3813.1267819999266 |

v1.59.9

29 Jan 15:44

What's Changed

New Contributors

Full Changelog: v1.59.8...v1.59.9

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.59.9
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Failed ❌ | 270.0 | 301.01550717582927 | 6.14169679840119 | 0.0 | 1837 | 0 | 234.85362500002793 | 3027.238808999982 |
| Aggregated | Failed ❌ | 270.0 | 301.01550717582927 | 6.14169679840119 | 0.0 | 1837 | 0 | 234.85362500002793 | 3027.238808999982 |

v1.59.8-dev1

29 Jan 08:11

What's Changed

New Contributors

Full Changelog: v1.59.8...v1.59.8-dev1

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.59.8-dev1
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 230.0 | 253.74562668371757 | 6.073890684010945 | 0.0 | 1818 | 0 | 198.74819999995452 | 1957.5085989999934 |
| Aggregated | Passed ✅ | 230.0 | 253.74562668371757 | 6.073890684010945 | 0.0 | 1818 | 0 | 198.74819999995452 | 1957.5085989999934 |

v1.59.8

27 Jan 16:15

What's Changed

  • refactor: cleanup dead codeblock by @krrishdholakia in #7936
  • add type annotation for litellm.api_base (#7980) by @krrishdholakia in #7994
  • (QA / testing) - Add unit testing for key model access checks by @ishaan-jaff in #7999
  • (Prometheus) - emit key budget metrics on startup by @ishaan-jaff in #8002
  • (Feat) set guardrails per team by @ishaan-jaff in #7993
  • Supported nested json schema on anthropic calls via proxy + fix langfuse sync sdk issues by @krrishdholakia in #8003
  • Bug fix - [Bug]: If you create a key tied to a user that does not belong to a team, and then edit the key to add it to a team (the user is still not a part of a team), using that key results in an unexpected error by @ishaan-jaff in #8008
  • (QA / testing) - Add e2e tests for key model access auth checks by @ishaan-jaff in #8000
  • (Fix) langfuse - setting LANGFUSE_FLUSH_INTERVAL by @ishaan-jaff in #8007
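One of the fixes above concerns setting `LANGFUSE_FLUSH_INTERVAL`. A minimal sketch of configuring it, under the assumption (based only on the changelog entry and the env var's name) that it controls how often buffered Langfuse events are flushed, in seconds, and is read from the environment when Langfuse logging initializes:

```python
import os

# Assumption: LANGFUSE_FLUSH_INTERVAL is read at Langfuse-client startup, so it
# must be set before the proxy (or litellm) initializes Langfuse logging. The
# value "10" here is an illustrative placeholder, not a recommended default.
os.environ.setdefault("LANGFUSE_FLUSH_INTERVAL", "10")

flush_interval = float(os.environ["LANGFUSE_FLUSH_INTERVAL"])
```

When running the proxy via Docker, the equivalent would be passing `-e LANGFUSE_FLUSH_INTERVAL=10` alongside the other `-e` flags.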

Full Changelog: v1.59.7...v1.59.8

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.59.8
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Failed ❌ | 280.0 | 325.48398318207154 | 6.003526201462839 | 0.0 | 1796 | 0 | 234.56590200004257 | 3690.442290999954 |
| Aggregated | Failed ❌ | 280.0 | 325.48398318207154 | 6.003526201462839 | 0.0 | 1796 | 0 | 234.56590200004257 | 3690.442290999954 |

v1.59.7

25 Jan 06:44

What's Changed

Full Changelog: v1.59.6...v1.59.7

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.59.7
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|------|--------|---------------------------|----------------------------|------------|------------|---------------|---------------|------------------------|------------------------|
| /chat/completions | Passed ✅ | 260.0 | 294.5630730660492 | 6.1254059494010225 | 0.0 | 1832 | 0 | 231.04980300001898 | 2728.9633709999634 |
| Aggregated | Passed ✅ | 260.0 | 294.5630730660492 | 6.1254059494010225 | 0.0 | 1832 | 0 | 231.04980300001898 | 2728.9633709999634 |