Installing Llama 3 on Linux
1. Environment Preparation and Hardware Recommendations

Any mainstream 64-bit Linux distribution works; the more RAM (or GPU VRAM) available, the larger the model variant you can run.
2. Method 1: One-Step Install and Run with Ollama (Recommended)
Install with the official one-line script, then verify the version:

curl -fsSL https://ollama.com/install.sh | sh
ollama -v

Alternatively, download and unpack the release tarball manually:

curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama.tgz
sudo mkdir -p /opt/ollama
sudo tar -xzf ollama.tgz -C /opt/ollama
sudo useradd -r -s /bin/false ollama   # service user referenced by the unit below; skip if it already exists

For the manual install, create a systemd unit so the server runs in the background:

sudo tee /etc/systemd/system/ollama.service >/dev/null <<'EOF'
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/opt/ollama/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="OLLAMA_MODELS=/opt/ollama/models"
[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable --now ollama

Run a model (the first run downloads it automatically):

ollama run llama3
ollama run llama3:70b

Other common commands:

ollama pull llama3        # download a model without starting a chat
ollama list               # list locally installed models

Manage and inspect the service:

sudo systemctl start|stop|restart ollama
sudo systemctl status ollama
journalctl -u ollama -f

3. Method 2: Manual Deployment from Source and API Calls
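When scripting against an Ollama host, the output of `ollama list` can be parsed into structured records. The sketch below assumes the typical column layout (NAME, ID, SIZE, MODIFIED); the sample rows are illustrative, not real output, so adjust the parsing if your Ollama version prints differently.

```python
def parse_ollama_list(text: str) -> list[dict]:
    """Parse the tabular output of `ollama list` into a list of dicts.

    Assumes columns NAME, ID, SIZE, MODIFIED, where SIZE is two tokens
    (e.g. "4.7 GB") and MODIFIED is everything after it.
    """
    lines = [ln for ln in text.splitlines() if ln.strip()]
    models = []
    for ln in lines[1:]:  # skip the header row
        parts = ln.split()
        models.append({
            "name": parts[0],
            "id": parts[1],
            "size": " ".join(parts[2:4]),
            "modified": " ".join(parts[4:]),
        })
    return models

# Hypothetical sample output for illustration
sample = """NAME            ID              SIZE      MODIFIED
llama3:latest   365c0bd3c000    4.7 GB    2 days ago
llama3:70b      786f3184aec0    39 GB     5 hours ago"""

for m in parse_ollama_list(sample):
    print(m["name"], m["size"])
```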
sudo apt update
sudo apt install -y python3 python3-pip git-lfs
pip install torch torchvision torchaudio transformers

A minimal generation script:

from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "meta-llama/Meta-Llama-3-8B"  # or a local path; the official repo is gated, so log in with huggingface-cli first
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
prompt = "Hello, Llama 3!"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

4. Web UI and Model Storage Path

Run Open WebUI in Docker:
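The base model above does plain text completion; the Instruct variants expect Llama 3's chat markup instead. The sketch below illustrates that template's special-token layout; in practice `tokenizer.apply_chat_template` builds this string for you, so treat this as an explanatory illustration rather than something to hand-roll in production.

```python
def build_llama3_prompt(messages: list[dict]) -> str:
    """Assemble a prompt in Llama 3's chat format.

    Each message is wrapped in header/eot special tokens, and the prompt
    ends with an assistant header to cue the model's reply.
    """
    out = "<|begin_of_text|>"
    for msg in messages:
        out += (f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
                f"{msg['content']}<|eot_id|>")
    out += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return out

prompt = build_llama3_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, Llama 3!"},
])
print(prompt)
```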
docker run -d -p 3000:8080 \
--add-host=host.docker.internal:host-gateway \
-v open-webui:/app/backend/data \
--name open-webui --restart always \
ghcr.io/open-webui/open-webui:main

To expose the Ollama model directory inside the container, an extra volume mount such as -v /opt/ollama/models:/app/backend/models can be added.

A lighter-weight alternative UI:

git clone https://github.com/ollama-webui/ollama-webui-lite.git
cd ollama-webui-lite
npm install
npm run dev

To relocate model storage, set OLLAMA_MODELS=/your/models/path (e.g. /opt/ollama/models) and restart the service; under systemd this goes in the unit file as Environment="OLLAMA_MODELS=/opt/ollama/models".

5. Common Issues and Optimization
Port conflict: if the mapped host port is already in use, change the mapping (e.g. -p 5000:8080) or stop the process occupying it.

Model not found: run ollama pull llama3 before ollama run llama3; if necessary, place the model files into the OLLAMA_MODELS directory manually.

Remote access: set Environment="OLLAMA_HOST=0.0.0.0" in /etc/systemd/system/ollama.service and restart the service; to allow cross-origin requests, also add Environment="OLLAMA_ORIGINS=*" (use only on a trusted internal network).

Calling the API:

curl http://localhost:11434/api/generate -d '{
"model":"llama3",
"prompt":"请用中文介绍Llama 3。",
"stream":false
}'