
Accessing a Fine-Tuned Local DeepSeek Model Served by vLLM from Open WebUI


Contents
Preface
1. Installing Open WebUI
2. Configuring the Open WebUI environment
3. Installing vLLM
4. Starting vLLM
5. Starting Open WebUI

Preface

First we install vLLM and load the local model, which exposes an API on a port number. Then we install Open WebUI and point it at that port. The concrete steps are demonstrated below.

1. Installing Open WebUI

root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp# mkdir open-webui
root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp# cd open-webui/
root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# git clone https://github.com/open-webui/open-webui.git
Cloning into 'open-webui'...
remote: Enumerating objects: 90389, done.
remote: Counting objects: 100% (121/121), done.
remote: Compressing objects: 100% (64/64), done.
remote: Total 90389 (delta 80), reused 57 (delta 57), pack-reused 90268 (from 2)
Receiving objects: 100% (90389/90389), 174.91 MiB | 16.20 MiB/s, done.
Resolving deltas: 100% (59438/59438), done.
root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# conda create --name open-webui python=3.12
Channels:
 - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
 - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
 - https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch
 - defaults
Platform: linux-64
Collecting package metadata (repodata.json): done
Solving environment: done

## Package Plan ##

2. Configuring the Open WebUI environment

When activating the virtual environment, conda raises the error below telling you to run `conda init` first. Running that doesn't actually help here; just run `source activate` once, after which `conda activate` works. Inside the virtual environment, install Open WebUI with pip.

root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# conda activate open-webui
CondaError: Run 'conda init' before 'conda activate'
root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# source activate
(base) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# conda activate open-webui
(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# pip install open-webui
Looking in indexes: http://mirrors.aliyun.com/pypi/simple
Collecting open-webui
  Downloading http://mirrors.aliyun.com/pypi/packages/76/f7/89777775051feb35049d70b9119e050b7830ed1eb07cfaa7159bd0c52cc0/open_webui-0.5.18-py3-none-any.whl (131.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 131.0/131.0 MB 17.1 MB/s eta 0:00:00
Collecting aiocache (from open-webui)

3. Installing vLLM

Next, install vLLM:

(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# pip install vllm
Looking in indexes: http://mirrors.aliyun.com/pypi/simple
Collecting vllm
  Downloading http://mirrors.aliyun.com/pypi/packages/4f/d2/18246f43ca730bb81918f87b7e886531eda32d835811ad9f4657c54eee35/sentencepiece-0.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 15.0 MB/s eta 0:00:00
  Downloading http://mirrors.aliyun.com/pypi/packages/8d/cf/9b775a1a1f5fe2f6c2d321396ad41b9849de2c76fa46d78e6294ea13be91/vllm-0.7.3-cp38-abi3-manylinux1_x86_64.whl (264.6 MB)
     ━━━━━━━━━━━━━━━━━━━━╸━━━━━━━━━━━━━━━━━━━ 136.6/264.6 MB 14.5 MB/s eta 0:00:09

4. Starting vLLM

Start the model server:

(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# vllm serve /root/autodl-tmp/llm/deepseek-ai/DeepSeek-R1-Distill-Qwen-1___5B-merged
INFO 03-02 10:00:20 __init__.py:207] Automatically detected platform cuda.
INFO 03-02 10:00:20 api_server.py:912] vLLM API server version 0.7.3
INFO 03-02 10:00:20 api_server.py:913] args: Namespace(subparser='serve', model_tag='/root/autodl-tmp/llm/deepseek-ai/DeepSeek-R1-Distill-Qwen-1___5B-merged', config='', host=None, port=8000, uvicorn_log_level='info', allow_credentials=False, allowed_origins=['*'], allowed_methods=['*'], allowed_headers=['*'], api

5. Starting Open WebUI

The fine-tuned model used here comes from blog.csdn.net/weixin_41688410/article/details/145948449. Now configure the Open WebUI environment: Open WebUI connects to Ollama by default, so the Ollama API must be set to False, and the OpenAI-compatible base URL must point at the port vLLM is serving on.
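Before wiring Open WebUI up, you can verify that vLLM answers on its port by calling the OpenAI-compatible chat endpoint directly. The sketch below uses only the standard library; the base URL and model name are taken from the `vllm serve` log above (vLLM registers the model under the path it was launched with), and `temperature` is just an illustrative choice.

```python
import json
import urllib.request

# By default vLLM names the served model after the path given to `vllm serve`.
MODEL = "/root/autodl-tmp/llm/deepseek-ai/DeepSeek-R1-Distill-Qwen-1___5B-merged"
BASE_URL = "http://127.0.0.1:8000/v1"  # vLLM's default host:port

def build_chat_request(prompt: str, model: str = MODEL) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def send_chat_request(payload: dict, base_url: str = BASE_URL) -> dict:
    """POST the payload to the running vLLM server and return the parsed reply."""
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage against the live server started above:
#   reply = send_chat_request(build_chat_request("你好"))
#   print(reply["choices"][0]["message"]["content"])
```

If this round-trips successfully, the same base URL is what Open WebUI needs in the next step.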

(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# export HF_ENDPOINT=https://hf-mirror.com
(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# export ENABLE_OLLAMA_API=False
(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# export OPENAI_API_BASE_URL=http://127.0.0.1:8000/v1
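To make the meaning of these variables concrete: `ENABLE_OLLAMA_API=False` switches off the default Ollama backend, and `OPENAI_API_BASE_URL` is the root that Open WebUI appends OpenAI-style paths to. The sketch below is not Open WebUI's actual parsing code, just an illustration of how boolean-ish env vars and endpoint URLs like these are typically consumed.

```python
import os

# Illustrative defaults mirroring the exports above (assumptions, not
# Open WebUI internals): disable Ollama, point at vLLM's OpenAI API.
os.environ.setdefault("ENABLE_OLLAMA_API", "False")
os.environ.setdefault("OPENAI_API_BASE_URL", "http://127.0.0.1:8000/v1")

def env_flag(name: str, default: bool = True) -> bool:
    """Read a 'True'/'False'-style environment variable as a bool."""
    return os.environ.get(name, str(default)).strip().lower() in ("1", "true", "yes")

def endpoint(path: str) -> str:
    """Join an API path onto the configured base URL."""
    base = os.environ["OPENAI_API_BASE_URL"].rstrip("/")
    return f"{base}/{path.lstrip('/')}"

ollama_enabled = env_flag("ENABLE_OLLAMA_API")   # False with the exports above
models_url = endpoint("models")                  # .../v1/models, used to list models
```

With these values, Open WebUI should discover the fine-tuned model from vLLM's model list instead of asking Ollama.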

With the environment configured, you can start Open WebUI:

(open-webui) root@autodl-container-5fd249bc19-6ec28c15:~/autodl-tmp/open-webui# open-webui serve
Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Converting 'chat' column to JSON
Renaming 'chat' column to 'old_chat'
Adding new 'chat' column of type JSON
Dropping 'old_chat' column
INFO  [alembic.runtime.migration] Running upgrade 242a2047eae0 -> 1af9b942657b, Migrate tags
INFO  [alembic.runtime.migration] Running upgrade 1af9b942657b -> 3ab32c4b8f59, Update tags
Primary Key: {'name': None, 'constrained_columns': []}
INFO  [alembic.runtime.migration] Running upgrade 3ab32c4b8f59 -> c69f45358db4, Add folder table
INFO  [alembic.runtime.migration] Running upgrade c69f45358db4 -> c29facfe716b, Update file table path
WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.
INFO  [open_webui.env] Embedding model set: sentence-transformers/all-MiniLM-L6-v2
/root/miniconda3/envs/open-webui/lib/python3.12/site-packages/pydub/utils.py:170: RuntimeWarning: Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
  warn("Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work", RuntimeWarning)
WARNI [langchain_community.utils.user_agent] USER_AGENT environment variable not set, consider setting it to identify your requests.

 ██████╗ ██████╗ ███████╗███╗   ██╗    ██╗    ██╗███████╗██████╗ ██╗   ██╗██╗
██╔═══██╗██╔══██╗██╔════╝████╗  ██║    ██║    ██║██╔════╝██╔══██╗██║   ██║██║
██║   ██║██████╔╝█████╗  ██╔██╗ ██║    ██║ █╗ ██║█████╗  ██████╔╝██║   ██║██║
██║   ██║██╔═══╝ ██╔══╝  ██║╚██╗██║    ██║███╗██║██╔══╝  ██╔══██╗██║   ██║██║
╚██████╔╝██║     ███████╗██║ ╚████║    ╚███╔███╔╝███████╗██████╔╝╚██████╔╝██║
 ╚═════╝ ╚═╝     ╚══════╝╚═╝  ╚═══╝     ╚══╝╚══╝ ╚══════╝╚═════╝  ╚═════╝ ╚═╝

v0.5.18 - building the best open-source AI user interface.
https://github.com/open-webui/open-webui
gitattributes: 1.23kB [00:00, 9.08MB/s]
1_Pooling/config.json: 100%|████████████| 190/190 [00:00<00:00, 1.30MB/s]
data_config.json: 39.3kB [00:00, 120MB/s]

Once it is up, the web page appears. Click to get started; on the signup screen you can enter anything you like, and even a fake email address works. Create the admin account, click Get Started, and you can begin asking questions.

The question asked here comes from the data that was used to fine-tune this model.

Originally published in the IT Industry (IT业界) section of 讯客互联.