# DevStar AI+ DevOps
DevStar AI+ DevOps is a complete AI-driven development platform solution. By integrating the DevStar platform, code large language models, the Gitea MCP Server, and AI code tools (Cursor, Claude Code, iFlow, etc.), it provides developers with an intelligent development support system.
## 🚀 Quick Deployment and Configuration Guide
### 1. Deploy the DevStar Code Hosting Platform
Install on Ubuntu 20.04:
```
wget -c https://devstar.cn/assets/install.sh && chmod +x install.sh && sudo ./install.sh
sudo devstar start
```
After the installation completes, we have the URL of the DevStar code hosting platform, e.g. http://172.16.94.26:80
### 2. Privately Deploy a Code LLM with Ollama
> If you are using a third-party API and token, you can skip this section.
Install on Ubuntu 20.04:
```
curl -fsSL https://ollama.com/install.sh | sh

# Verify the installation succeeded
ollama --version

# Download the Qwen2.5-Coder model
ollama pull qwen2.5-coder:32b

# List downloaded models
ollama list

# Test the model
ollama run qwen2.5-coder:32b "Hello, can you help me code?"

# Start the Ollama service (default port 11434)
ollama serve

# Verify the service status
curl http://172.16.94.26:11434/api/tags
```
* Fix Ollama being reachable only from localhost
```
# Add the environment variables OLLAMA_HOST=0.0.0.0 and OLLAMA_ORIGINS=*
sed -i '/\[Service\]/a Environment=OLLAMA_HOST=0.0.0.0' /etc/systemd/system/ollama.service
sed -i '/\[Service\]/a Environment=OLLAMA_ORIGINS=*' /etc/systemd/system/ollama.service

# Reload systemd and restart the service
systemctl daemon-reexec
systemctl daemon-reload
systemctl restart ollama
```
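Equivalently, the same override can be kept out of the unit file itself with a systemd drop-in, which survives reinstalls that rewrite `ollama.service`. A sketch: run `sudo systemctl edit ollama` and add the lines below, then reload and restart as above.

```
[Service]
Environment=OLLAMA_HOST=0.0.0.0
Environment=OLLAMA_ORIGINS=*
```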
After installation, we have the API URL (e.g. http://172.16.94.26:11434/api/tags), a model name (e.g. qwen2.5-coder:32b), and a token (e.g. TOKEN***************).
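To sanity-check the deployment from application code, you can call Ollama's `/api/generate` endpoint directly. A minimal sketch (the host and model are the example values from above; adjust to your environment):

```python
import json
import urllib.request

# Example values from the deployment above; adjust to your environment.
OLLAMA_URL = "http://172.16.94.26:11434/api/generate"
MODEL = "qwen2.5-coder:32b"

def build_request(prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a stream
    }).encode("utf-8")

def generate(prompt: str) -> str:
    """POST the prompt to the Ollama server and return the completion text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage: `generate("Write a binary search in Python.")` returns the model's reply as a string.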
### 3. Use the Code LLM in Your Project
#### Add AI Code Review to the CI/CD Workflow
Add .gitea/workflows/code-review.yml to your project; here we use kekxv/AiReviewPR@v0.0.6 to perform the AI code review:
```
name: ai-reviews

on:
  pull_request:
    types: [opened, synchronize]

jobs:
  review:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Review code
        uses: kekxv/AiReviewPR@v0.0.6
        with:
          model: ${{ vars.MODEL }}
          host: ${{ vars.OLLAMA_HOST }}
          REVIEW_PULL_REQUEST: false
```
Variables such as vars.MODEL and vars.OLLAMA_HOST can be set in the DevStar platform under project settings, user settings, or the site administration panel.
#### Install and Configure the MCP Server
To use it in VS Code, install it quickly with the buttons below.
[](https://insiders.vscode.dev/redirect/mcp/install?name=gitea&inputs=[{%22id%22:%22gitea_token%22,%22type%22:%22promptString%22,%22description%22:%22Gitea%20Personal%20Access%20Token%22,%22password%22:true}]&config={%22command%22:%22docker%22,%22args%22:[%22run%22,%22-i%22,%22--rm%22,%22-e%22,%22GITEA_ACCESS_TOKEN%22,%22docker.gitea.com/gitea-mcp-server%22],%22env%22:{%22GITEA_ACCESS_TOKEN%22:%22${input:gitea_token}%22}}) [](https://insiders.vscode.dev/redirect/mcp/install?name=gitea&inputs=[{%22id%22:%22gitea_token%22,%22type%22:%22promptString%22,%22description%22:%22Gitea%20Personal%20Access%20Token%22,%22password%22:true}]&config={%22command%22:%22docker%22,%22args%22:[%22run%22,%22-i%22,%22--rm%22,%22-e%22,%22GITEA_ACCESS_TOKEN%22,%22docker.gitea.com/gitea-mcp-server%22],%22env%22:{%22GITEA_ACCESS_TOKEN%22:%22${input:gitea_token}%22}}&quality=insiders)
Alternatively, add a .vscode/mcp.json file to your project:
```
{
  "inputs": [
    {
      "type": "promptString",
      "id": "gitea_token",
      "description": "Gitea Personal Access Token",
      "password": true
    }
  ],
  "servers": {
    "gitea-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "GITEA_HOST",
        "-e",
        "GITEA_ACCESS_TOKEN",
        "docker.gitea.com/gitea-mcp-server"
      ],
      "env": {
        "GITEA_HOST": "http://172.16.94.26",
        "GITEA_ACCESS_TOKEN": "${input:gitea_token}"
      }
    }
  }
}
```
#### Configure AI IDEs/CLIs to Use the Private LLM and MCP Server
* Copilot: a brief written description; rather than many screenshots, link to the official configuration docs
* Cursor
* Continue
* ...
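As one concrete example, Continue can point at the private Ollama server. A minimal sketch of a model entry for Continue's YAML config (field names assumed from Continue's Ollama provider documentation; the host and model are the example values from the deployment above):

```
models:
  - name: qwen2.5-coder (private)
    provider: ollama
    model: qwen2.5-coder:32b
    apiBase: http://172.16.94.26:11434
```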
## 🚀 DevStar AI+ DevOps Demo
Building on the deployment and configuration above, we use VS Code with Copilot or Continue to demonstrate AI code generation, triggering the CI/CD workflow, and AI Code Review.
### Create a Project
Create a project from the ai-develops project template.
todo
### AI Code Generation
todo
### Submit a PR
todo
### AI Code Review
todo
### Merge the PR
todo