In modern cloud-native development and deployment, Kubernetes has become the most popular container orchestration tool. Ollama, a tool that makes it easy to run large language models locally, pairs well with Kubernetes for efficient, scalable model serving. This article walks you through deploying Ollama on Kubernetes in about 10 minutes.
Before you begin, make sure the following prerequisites are in place:
1. Install Kubernetes: make sure a Kubernetes cluster is installed and configured on your machine. If not, follow the official Kubernetes documentation to set one up.
2. Install kubectl: kubectl is the command-line tool for interacting with a Kubernetes cluster. See the kubectl installation guide.
3. Docker image: make sure the Ollama image your cluster will use is available in a container registry (the manifest below pulls the official ollama/ollama image from Docker Hub; an image you have built and pushed yourself works too).
If you just want a quick trial, microk8s (https://microk8s.io/) or k3s (https://k3s.io/) are easy ways to get a cluster.
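Once your cluster is up, a quick sanity check confirms that kubectl can reach it (a minimal sketch; the exact output depends on your cluster):
# Show the API server address and confirm the nodes are Ready
kubectl cluster-info
kubectl get nodes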
First, write a Kubernetes manifest that defines how the Ollama service is deployed. Create a file named ollama-deployment.yaml and add the following content:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama
spec:
  selector:
    matchLabels:
      app: ollama
  template:
    metadata:
      labels:
        app: ollama
    spec:
      containers:
        - name: ollama
          image: ollama/ollama:latest
          ports:
            - name: http
              containerPort: 11434
              protocol: TCP
---
apiVersion: v1
kind: Service
metadata:
  name: ollama-svc
spec:
  selector:
    app: ollama
  ports:
    - protocol: TCP
      port: 11434
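One thing to note: with this minimal manifest, any model downloaded into the container is lost when the Pod restarts. As an optional extension (a sketch only, not part of the manifest above; the claim name ollama-models is an assumed PersistentVolumeClaim you would create separately), you could mount a volume at /root/.ollama, the directory where the ollama/ollama image stores downloaded models:
# Optional: persist downloaded models across Pod restarts.
# Add under spec.template.spec.containers[0]:
          volumeMounts:
            - name: ollama-data
              mountPath: /root/.ollama   # default model directory in the ollama/ollama image
# Add under spec.template.spec:
      volumes:
        - name: ollama-data
          persistentVolumeClaim:
            claimName: ollama-models     # assumes a PVC with this name already exists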
Apply the manifest to the Kubernetes cluster with kubectl:
kubectl apply -f ollama-deployment.yaml
After this command runs, Kubernetes creates the Ollama Deployment and its Service according to the manifest.
Check the deployment status with the following commands:
kubectl get deployments
kubectl get pods
kubectl get services
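You can also wait for the rollout to finish (the Deployment name ollama comes from the manifest above):
# Blocks until all replicas of the ollama Deployment are available
kubectl rollout status deployment/ollama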
Or check the deployment status in the Kubernetes dashboard.
You should see information about the Ollama deployment, including the replica count, Pod status, and the Service's cluster IP.
Exec into the Pod and run the following command to download the model. Alternatively, you can build an image that already contains the model and use it in the deployment step above, so no manual installation is needed.
ollama run llama3
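If you prefer to run this from your workstation instead of opening an interactive shell in the container, kubectl exec can do it in one step (a sketch; it assumes the Deployment is named ollama as in the manifest above, and uses ollama pull, which downloads the model without starting a chat session):
# Download the llama3 model inside the running Ollama Pod
kubectl exec -it deploy/ollama -- ollama pull llama3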
Once the model is installed, other Pods in the same namespace can call it with curl:
$ curl -L 'http://ollama-svc:11434/api/generate' \
> -H 'Content-Type: application/json' \
> -d '{
> "model": "llama3",
> "prompt": "如何处理职场冲突",
> "format": "json",
> "stream": false
> }'
{"model":"llama3-zh:latest","created_at":"2024-07-09T12:13:05.90114Z",
"response":"{ \"message\": \"在职场中处理冲突是一项重要的技能。以下是一些可能有助于你解决冲突的策略:\\r\\n\\r\\n1. **倾听对方** - 给你的同事一个机会,详细地了解他们的观点和担忧。你可以通过重复、总结或提问来表明你在认真聆听。\\r\\n2. **保持冷静** - 尽量不让情绪影响对话。深呼吸,给自己一点时间冷静下来,这样你就能更有条理地解决问题。\\r\\n3. **寻求共同点** - 尝试找到双方都同意的事情。这可以帮助建立一种合作的气氛,使得继续谈判变得更加容易。\\r\\n4. **用“I”语言** - 使用“我感到…”或“我认为…”而不是“你总是…”,这样能减少指责并保持对话的建设性。\\r\\n5. **提出解决方案** - 当你明白了对方的问题和担忧后,你可以提出一些可能的解决办法。确保这些提议是具体可行的,并且考虑到了双方的利益。\\r\\n6. **寻求第三方帮助** - 如果冲突非常严重或你感到自己无法处理,那么寻找一个中立的第三方(如人力资源代表或职业顾问)可能是一个好主意。他们可以提供专业建议,并在必要时介入调解。\\r\\n7. **保持尊重和诚实** - 保持对话中的尊重与诚实,哪怕你和对方观点不同,也要以同事身份相互尊重。\\r\\n\\r\\n记住,没有任何人喜欢冲突,但有时候它们可以成为成长的机会。通过有效地解决冲突,你可能会发现新的工作方式或增强团队凝聚力。\", \"type\": \"text\", \"is_end_session\": false }",
"done":true,"done_reason":"stop","context":[ ... ],
"total_duration":30680710500,"load_duration":41205916,"prompt_eval_count":33,
"prompt_eval_duration":5121204000,"eval_count":448,"eval_duration":25513774000}
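To try the API from your own machine rather than from another Pod, you can temporarily forward the Service port (a quick sketch reusing the ollama-svc Service defined earlier):
# Forward the Service port to localhost (this command keeps running)
kubectl port-forward svc/ollama-svc 11434:11434
# In a second terminal, query the forwarded port
curl http://localhost:11434/api/generate \
  -H 'Content-Type: application/json' \
  -d '{"model": "llama3", "prompt": "hello", "stream": false}'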
That's it! With the steps above, you have successfully deployed Ollama to a Kubernetes environment. The process shows both the power and flexibility of Kubernetes and how straightforward an Ollama deployment can be. Hopefully this short tutorial makes it easier for you to combine Kubernetes and Ollama and opens up new possibilities for building and deploying LLM applications.