llama-factory || AutoDL platform
- Internet
- 2025-09-17 11:00:02

The error is as follows:

```
root@autodl-container-d83e478b47-3def8c49:~/LLaMA-Factory# llamafactory-cli webui
* Running on local URL: http://0.0.0.0:7860
Could not create share link. Missing file: /root/miniconda3/lib/python3.10/site-packages/gradio/frpc_linux_amd64_v0.3.

Please check your internet connection. This can happen if your antivirus software blocks the download of this file. You can install manually by following these steps:

1. Download this file: cdn-media.huggingface.co/frpc-gradio-0.3/frpc_linux_amd64
2. Rename the downloaded file to: frpc_linux_amd64_v0.3
3. Move the file to this location: /root/miniconda3/lib/python3.10/site-packages/gradio
```

Fix: copy the (already downloaded and renamed) binary into the Gradio package directory, then change into that directory:

```
cp frpc_linux_amd64_v0.3 /root/miniconda3/lib/python3.10/site-packages/gradio
cd /root/miniconda3/lib/python3.10/site-packages/gradio
```
Make the binary executable (a consolidated one-shot version of the whole fix is sketched below):

```
chmod +x frpc_linux_amd64_v0.3
```
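If the binary has not been downloaded yet, the three manual steps from the error message can also be done in one shot on the server. A minimal sketch, assuming the container can reach cdn-media.huggingface.co directly (the original "Could not create share link" error suggests it could not, in which case downloading on a local machine and uploading the file, as above, is the fallback):

```bash
# One-shot version of the manual fix, following the three steps printed
# in the Gradio error message. GRADIO_DIR matches the path from the error.
GRADIO_DIR=/root/miniconda3/lib/python3.10/site-packages/gradio
wget https://cdn-media.huggingface.co/frpc-gradio-0.3/frpc_linux_amd64 \
    -O "$GRADIO_DIR/frpc_linux_amd64_v0.3"   # download + rename in one step
chmod +x "$GRADIO_DIR/frpc_linux_amd64_v0.3" # make it executable
```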
Then re-run the webui, or launch a web chat against a local model:

```
llamafactory-cli webui
CUDA_VISIBLE_DEVICES=0 llamafactory-cli webchat --model_name_or_path /root/autodl-fs/Qwen2.5-1.5B-Instruct --template qwen
```

Result:

```
[INFO|2025-03-02 23:15:05] llamafactory.model.model_utils.attention:157 >> Using torch SDPA for faster training and inference.
[INFO|2025-03-02 23:15:05] llamafactory.model.loader:157 >> all params: 1,543,714,304
* Running on local URL: http://0.0.0.0:7860
* Running on public URL: 35d22b023607f1702a.gradio.live

This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (huggingface.co/spaces)
```

The gradio.live URL in the output is the publicly accessible dynamic link.
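Whether LLaMA-Factory requests a public Gradio share link at all can be controlled explicitly through an environment variable. A minimal sketch, assuming the GRADIO_SHARE switch documented in recent LLaMA-Factory versions:

```bash
# Sketch: request the Gradio share link explicitly (set to 0 to disable).
# GRADIO_SHARE is an assumption based on recent LLaMA-Factory releases;
# verify against the README of the installed version.
GRADIO_SHARE=1 CUDA_VISIBLE_DEVICES=0 llamafactory-cli webchat \
    --model_name_or_path /root/autodl-fs/Qwen2.5-1.5B-Instruct \
    --template qwen
```

If no public link is wanted, the local URL on port 7860 can also be reached from a workstation through an ordinary SSH tunnel (`ssh -L 7860:127.0.0.1:7860 ...` against the instance's SSH endpoint).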