docker pull tugraph/tugraph-runtime-centos7:latest
mkdir -p /tmp/tugraph/data && mkdir -p /tmp/tugraph/log && \
docker run -it -d -p 7001:7001 -p 7070:7070 -p 7687:7687 -p 8000:8000 -p 8888:8888 -p 8889:8889 -p 9090:9090 \
-v /tmp/tugraph/data:/var/lib/lgraph/data -v /tmp/tugraph/log:/var/log/lgraph_log \
--name tugraph_demo tugraph/tugraph-runtime-centos7:latest /bin/bash && \
docker exec -d tugraph_demo bash /setup.sh
pip install "neo4j>=5.20.0"
Set up TuGraph by configuring the following in the .env file:

GRAPH_STORE_TYPE=TuGraph
TUGRAPH_HOST=127.0.0.1
TUGRAPH_PORT=7687
TUGRAPH_USERNAME=admin
TUGRAPH_PASSWORD=xxx
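TuGraph speaks the Bolt protocol on port 7687, which is why the neo4j driver installed above is used as the client. A minimal sketch of building the connection from the same environment variables (the `check_connection` helper is illustrative and assumes the container started above is running):

```python
import os


def tugraph_bolt_uri(host: str = "", port: str = "") -> str:
    """Build the Bolt URI from the same variables set in .env."""
    host = host or os.getenv("TUGRAPH_HOST", "127.0.0.1")
    port = port or os.getenv("TUGRAPH_PORT", "7687")
    return f"bolt://{host}:{port}"


def check_connection() -> None:
    """Attempt a real connection; call this only when TuGraph is up."""
    from neo4j import GraphDatabase  # installed via `pip install "neo4j>=5.20.0"`

    driver = GraphDatabase.driver(
        tugraph_bolt_uri(),
        auth=(os.getenv("TUGRAPH_USERNAME", "admin"),
              os.getenv("TUGRAPH_PASSWORD", "xxx")),
    )
    driver.verify_connectivity()
    driver.close()
```

With the container running, `check_connection()` should return without raising.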
ollama pull qwen:0.5b
ollama pull nomic-embed-text
pip install ollama
Configure the environment in the .env file:

LLM_MODEL=ollama_proxyllm
PROXY_SERVER_URL=http://127.0.0.1:11434
PROXYLLM_BACKEND="qwen:0.5b"
PROXY_API_KEY=not_used
EMBEDDING_MODEL=proxy_ollama
proxy_ollama_proxy_server_url=http://127.0.0.1:11434
proxy_ollama_proxy_backend="nomic-embed-text:latest"
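To sanity-check the Ollama side independently of DB-GPT, the `ollama` package installed above can talk to the local server directly. A minimal sketch (`build_chat_request` and `chat_once` are illustrative helpers; `chat_once` assumes `ollama serve` is listening on the default 127.0.0.1:11434 and the `qwen:0.5b` model has been pulled):

```python
def build_chat_request(model: str, prompt: str) -> dict:
    """A minimal chat payload in the shape the Ollama chat API expects."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def chat_once(prompt: str) -> str:
    """Send one prompt to the local Ollama server; run only with the server up."""
    import ollama  # installed via `pip install ollama`

    req = build_chat_request("qwen:0.5b", prompt)
    resp = ollama.chat(model=req["model"], messages=req["messages"])
    return resp["message"]["content"]
```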
python dbgpt/app/dbgpt_server.py
The Profile module supports creating agent profiles from environment variables, databases, and other implementations. Agent memory now covers sensory memory, short-term memory, long-term memory, and hybrid memory.
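To make the memory tiers concrete, here is a toy sketch (not DB-GPT's actual API) of a hybrid store in which short-term memory keeps the most recent observations and evicted items are consolidated into long-term memory:

```python
from collections import deque


class HybridMemory:
    """Toy hybrid memory: a bounded short-term buffer backed by an
    unbounded long-term store. Illustrative only."""

    def __init__(self, short_term_capacity: int = 3):
        self.short_term = deque(maxlen=short_term_capacity)
        self.long_term = []

    def write(self, observation: str) -> None:
        if len(self.short_term) == self.short_term.maxlen:
            # The oldest short-term item is consolidated into long-term memory
            # before the deque would silently drop it.
            self.long_term.append(self.short_term[0])
        self.short_term.append(observation)

    def read(self) -> list:
        # Retrieval mixes both stores, oldest memories first.
        return self.long_term + list(self.short_term)


mem = HybridMemory(short_term_capacity=2)
for obs in ["a", "b", "c"]:
    mem.write(obs)
```

After the three writes, "a" has aged out of the short-term buffer into long-term memory, while "b" and "c" remain short-term.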
The Agent Resource module refactors the Resource types, including database, knowledge, tool, pack, and more. A collection of resources is itself a special resource type, called a Resource Pack.
Resources can be installed from dbgpts; for example, install a simple calculator tool with:

dbgpt app install simple-calculator-example -U
More examples are available under examples/agents.
ChatKnowledge supports rerank models, and a rerank model can also be deployed as a service.
Set the model parameters in the .env file and restart the service:

## Rerank model
RERANK_MODEL=bge-reranker-base
## If you do not set RERANK_MODEL_PATH, DB-GPT will read the model path from EMBEDDING_MODEL_CONFIG based on the RERANK_MODEL.
# RERANK_MODEL_PATH=
## The number of rerank results to return
RERANK_TOP_K=3
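A toy sketch of what RERANK_TOP_K controls: the reranker rescores the candidates returned by first-stage retrieval and keeps only the top-k. The term-overlap scorer below is a stand-in for a real model such as bge-reranker-base:

```python
def rerank(query: str, candidates: list, top_k: int = 3) -> list:
    """Rescore candidates against the query and return the top_k best."""
    def score(passage: str) -> int:
        # Stand-in relevance score: word overlap with the query.
        return len(set(query.lower().split()) & set(passage.lower().split()))
    return sorted(candidates, key=score, reverse=True)[:top_k]


docs = [
    "DB-GPT supports rerank models",
    "unrelated passage about cooking",
    "rerank models can be served as a service",
    "another passage",
]
top = rerank("rerank models", docs, top_k=3)
```

With RERANK_TOP_K=3, only the three highest-scoring passages survive, with the two rerank-related passages ranked first.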
dbgpt start controller --port 8000
dbgpt start worker --worker_type text2vec \
--rerank \
--model_path /app/models/bge-reranker-base \
--model_name bge-reranker-base \
--port 8004 \
--controller_addr http://127.0.0.1:8000
LLM_MODEL=deepseek_proxyllm
DEEPSEEK_MODEL_VERSION=deepseek-chat
DEEPSEEK_API_BASE=https://api.deepseek.com/v1
DEEPSEEK_API_KEY={your-deepseek-api-key}
Create a test_proxyllm.py using DeepseekLLMClient:

import asyncio

from dbgpt.core import ModelRequest
from dbgpt.model.proxy import DeepseekLLMClient

# You should set DEEPSEEK_API_KEY in your environment variables
client = DeepseekLLMClient()
print(asyncio.run(client.generate(ModelRequest._build("deepseek-chat", "你是谁?"))))
DEEPSEEK_API_KEY={your-deepseek-api-key} python test_proxyllm.py
Configure the model in the .env file:

# [Yi-1.5-6B-Chat](https://huggingface.co/01-ai/Yi-1.5-6B-Chat)
LLM_MODEL=yi-1.5-6b-chat
# [Yi-1.5-9B-Chat](https://huggingface.co/01-ai/Yi-1.5-9B-Chat)
LLM_MODEL=yi-1.5-9b-chat
# [Yi-1.5-34B-Chat](https://huggingface.co/01-ai/Yi-1.5-34B-Chat)
LLM_MODEL=yi-1.5-34b-chat
Other updates: refactored the Chroma vector store module, support for Elasticsearch as a vector store, Excel support in ChatKnowledge, and CrossEncoderRanker support in EmbeddingRetriever.