diff --git a/docs/content/docs/getting-started/kubernetes.md b/docs/content/docs/getting-started/kubernetes.md
index fb08b0467021..aea28f3ecddc 100644
--- a/docs/content/docs/getting-started/kubernetes.md
+++ b/docs/content/docs/getting-started/kubernetes.md
@@ -10,13 +10,13 @@ ico = "rocket_launch"
 For installing LocalAI in Kubernetes, the deployment file from the `examples` can be used and customized as prefered:
 
 ```
-kubectl apply -f https://raw.githubusercontent.com/mudler/LocalAI/master/examples/kubernetes/deployment.yaml
+kubectl apply -f https://raw.githubusercontent.com/mudler/LocalAI-examples/refs/heads/main/kubernetes/deployment.yaml
 ```
 
 For Nvidia GPUs:
 
 ```
-kubectl apply -f https://raw.githubusercontent.com/mudler/LocalAI/master/examples/kubernetes/deployment-nvidia.yaml
+kubectl apply -f https://raw.githubusercontent.com/mudler/LocalAI-examples/refs/heads/main/kubernetes/deployment-nvidia.yaml
 ```
 
 Alternatively, the [helm chart](https://github.com/go-skynet/helm-charts) can be used as well: