
Usage Notes

  • The first account registered becomes the administrator account.

  • When installing the GPU version, 1Panel releases older than v1.10.3-lts overwrite the GPU settings in docker-compose.yml. Check the file after installation finishes; if the GPU settings are missing, replace the file with the compose content below and manually run docker-compose down && docker-compose up -d in the app directory (an example restart sequence follows the compose file).

version: '3.3'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:cuda
    container_name: ${CONTAINER_NAME}
    restart: always
    networks:
      - 1panel-network
    ports:
      - "${PANEL_APP_PORT_HTTP}:8080"
    environment:
      - NVIDIA_VISIBLE_DEVICES=all
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: [gpu]
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - "${DATA_PATH}:/app/backend/data"
    labels:  
      createdBy: "Apps"

networks:  
  1panel-network:  
    external: true
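
The steps below are a minimal recovery sketch, not an official procedure: the app directory path and the .env values are placeholders, and the real values (CONTAINER_NAME, PANEL_APP_PORT_HTTP, OLLAMA_BASE_URL, OPENAI_API_KEY, DATA_PATH) are written by 1Panel when the app is installed.

# Change into the app directory created by 1Panel (example path, adjust to your install).
cd /opt/1panel/apps/open-webui/open-webui

# The variables referenced in docker-compose.yml normally live in the adjacent
# .env file; typical placeholder values look like:
#   CONTAINER_NAME=open-webui
#   PANEL_APP_PORT_HTTP=3000
#   OLLAMA_BASE_URL=http://host.docker.internal:11434   # Ollama's default port
#   OPENAI_API_KEY=
#   DATA_PATH=./data

# After overwriting docker-compose.yml with the GPU version above, recreate the container.
docker-compose down && docker-compose up -d

# Optional check that the GPU is visible inside the container
# (requires the NVIDIA Container Toolkit on the host).
docker exec -it open-webui nvidia-smi

The host.docker.internal entry added via extra_hosts is what lets OLLAMA_BASE_URL point at an Ollama instance running on the host rather than inside the container.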

Original README


Open WebUI (Formerly Ollama WebUI) 👋


Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs. For more information, be sure to check out our Open WebUI Documentation.