Releases: substratusai/helm

vllm-0.5.5 (19 Aug 21:34, commit aca7994)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.
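As a hedged sketch of how a pinned release like this is typically consumed: Helm charts published from a GitHub repository are usually served from a chart repository, and a specific chart version can be selected with `--version`. The repository URL and release name below are assumptions based on common GitHub Pages conventions, not confirmed by this listing; check the repository's README for the actual URL.

```shell
# Illustrative only; the repo URL is an assumed convention, not taken from this page.
helm repo add substratusai https://substratusai.github.io/helm
helm repo update

# Install the vllm chart pinned to the 0.5.5 release tagged above.
helm install my-vllm substratusai/vllm --version 0.5.5
```

Pinning `--version` to a tagged chart release (rather than installing the latest) keeps deployments reproducible across the releases listed here.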

vllm-0.5.4 (04 Aug 18:21, commit 26c20ce)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.

vllm-0.5.3 (25 Jul 16:01, commit 09c0eea)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.

lingo-0.2.1 (08 Jul 05:59, commit 1dd9f2c)

A Helm chart for Lingo, the Kubernetes LLM proxy and autoscaler.

lingo-0.2.0 (06 Jul 06:37, commit 001c596)

A Helm chart for Lingo, the Kubernetes LLM proxy and autoscaler.

vllm-0.4.7 (22 May 20:02, commit 72380ce)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.

vllm-0.4.6 (21 May 03:14, commit 2e17be6)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.

vllm-0.4.5 (21 May 02:29, commit 1bbddc0)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.

vllm-0.4.4 (11 May 16:03, commit 5f3a4c1)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.

vllm-0.4.3 (08 May 05:43, commit 55945db)

A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.