Releases: substratusai/helm

vllm: A Helm chart for deploying vLLM. vLLM is a fast and easy-to-use library for LLM inference and serving.
- vllm-0.5.5
- vllm-0.5.4
- vllm-0.5.3
- vllm-0.4.7
- vllm-0.4.6
- vllm-0.4.5
- vllm-0.4.4
- vllm-0.4.3

lingo: A Helm chart for Lingo, the Kubernetes (K8s) LLM proxy and autoscaler.
- lingo-0.2.1
- lingo-0.2.0
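The releases above are installable through the standard Helm workflow. A minimal sketch, assuming the charts are published at `https://substratusai.github.io/helm` (the conventional GitHub Pages URL for a repo named `helm` under the `substratusai` org; verify before use) and using `my-vllm` / `my-lingo` as placeholder release names:

```shell
# Register the chart repository and refresh the local index.
# NOTE: the repo URL is an assumption; confirm it in the project's README.
helm repo add substratusai https://substratusai.github.io/helm
helm repo update

# Install a pinned vLLM chart version (chart version, not vLLM version).
helm install my-vllm substratusai/vllm --version 0.5.5

# Install the Lingo proxy/autoscaler chart.
helm install my-lingo substratusai/lingo --version 0.2.1

# Inspect the configurable values before overriding any of them.
helm show values substratusai/vllm --version 0.5.5
```

Pinning `--version` to one of the listed chart releases keeps upgrades deliberate; omitting it installs whatever the repository index currently marks as latest.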