A Concise Guide to Installing and Running Llama 3 on Linux
1. Environment Preparation and Hardware Recommendations
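Before installing, it is worth taking stock of the CPU, RAM, and GPU memory available for inference. The commands below use standard Linux tools; `nvidia-smi` is only present when an NVIDIA driver is installed, so the sketch falls back to a CPU note otherwise:

```shell
#!/bin/sh
# Inventory the resources that matter for running Llama 3 locally.
lscpu | grep -E '^(Model name|CPU\(s\))'     # CPU model and logical core count
free -h | head -n 2                          # total / available RAM
# GPU VRAM if an NVIDIA card and driver are present; otherwise CPU inference
nvidia-smi --query-gpu=name,memory.total --format=csv,noheader 2>/dev/null \
  || echo "no NVIDIA GPU detected (CPU inference will be used)"
```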
2. Method 1: One-Click Install and Run with Ollama (Recommended)
Install Ollama with the official script, then start the server:

```bash
curl -fsSL https://ollama.com/install.sh | sh
ollama serve
```

To run Ollama as a systemd service instead, create `/etc/systemd/system/ollama.service`:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin"
# Optional: allow remote access
Environment="OLLAMA_HOST=0.0.0.0"
# Optional: allow cross-origin requests
Environment="OLLAMA_ORIGINS=*"
# Optional: custom model storage path
Environment="OLLAMA_MODELS=/opt/ollama/models"

[Install]
WantedBy=multi-user.target
```

Then run:
```bash
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
```

Pull and run a model:

```bash
# Default 8B
ollama run llama3
# Specify 70B
ollama run llama3:70b
```

Check the installed version:

```bash
ollama --version
```
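Beyond one-off curl calls, requests to Ollama's `/api/generate` endpoint (default port 11434) can be scripted. A minimal sketch; `build_payload` is a hypothetical helper, not part of Ollama:

```shell
#!/bin/sh
# Hypothetical helper: build the JSON body for Ollama's /api/generate endpoint.
# "stream": false makes the server return one JSON object instead of chunks.
# NOTE: prompts containing double quotes or backslashes would need real JSON escaping.
build_payload() {
  printf '{"model":"%s","prompt":"%s","stream":false}' "$1" "$2"
}

# Usage (assumes the Ollama server is running locally):
#   build_payload llama3 "Introduce Llama 3 in one sentence" \
#     | curl -s http://localhost:11434/api/generate -d @-
```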
Test the HTTP API:

```bash
curl http://localhost:11434/api/generate -d '{"model":"llama3","prompt":"Introduce Llama 3 in one sentence","stream":false}'
```

Other common commands: `ollama pull|list|rm|show|create` (similar in style to Docker).

3. Method 2: Manually Install Ollama with a Custom Path
Download and unpack the release tarball, and create a model directory:

```bash
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama.tgz
sudo mkdir -p /opt/ollama
sudo tar -xzf ollama.tgz -C /opt/ollama
mkdir -p /opt/ollama/models
```

Point the systemd unit at the custom binary and model path:

```ini
[Service]
ExecStart=/opt/ollama/bin/ollama serve
Environment="OLLAMA_MODELS=/opt/ollama/models"

[Install]
WantedBy=multi-user.target
```

Reload and restart the service:

```bash
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
```
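After setting a custom `OLLAMA_MODELS`, it helps to sanity-check which directory is actually in effect. The helper below is hypothetical; it mirrors Ollama's fallback of using `~/.ollama/models` when the variable is unset (note that the systemd unit's environment applies to the service, not to your interactive shell):

```shell
#!/bin/sh
# Hypothetical helper: resolve the model directory the way Ollama does —
# $OLLAMA_MODELS when set, otherwise the per-user default ~/.ollama/models.
effective_models_dir() {
  if [ -n "${OLLAMA_MODELS:-}" ]; then
    printf '%s\n' "$OLLAMA_MODELS"
  else
    printf '%s\n' "$HOME/.ollama/models"
  fi
}

effective_models_dir
```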
```bash
ollama run llama3
```

4. Visual Interface: Open WebUI (Optional)
Run Open WebUI in Docker, connecting to the local Ollama instance:

```bash
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

To also mount the custom model directory from Method 2 into the container:

```bash
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v /opt/ollama/models:/app/backend/models \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

5. Common Issues and Optimization
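A useful first troubleshooting step is confirming that the Ollama API is reachable at all, both from the host and from inside the Open WebUI container (via `host.docker.internal`). A minimal sketch; `check_ollama` is a hypothetical helper and the URL shown is the default:

```shell
#!/bin/sh
# Hypothetical health check: probe the lightweight /api/tags endpoint,
# which lists installed models and needs no request body.
check_ollama() {
  url="${1:-http://localhost:11434}"
  if curl -fsS --max-time 5 "$url/api/tags" >/dev/null 2>&1; then
    echo "ollama reachable at $url"
  else
    echo "ollama NOT reachable at $url" >&2
    return 1
  fi
}
```

If the probe fails, check `systemctl status ollama` and whether `OLLAMA_HOST=0.0.0.0` is set when remote or container access is needed.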