Description
LocalAI version:
3.1.1
Environment, CPU architecture, OS, and Version:
Docker, with a volume mapped to /backends
Describe the bug
Unable to download or install backends.
To Reproduce
Not certain of the exact steps: I clicked Install on cuda12-vllm, and the same thing happens with all backends.
Expected behavior
The backend downloads and is set up successfully.
Logs
cuda12-vllm backend:
4.0 GiB/3.9 GiB (102.26%) ETA: -221.194423
4.2 GiB/3.9 GiB (106.05%) ETA: -2.577751901s
4.5 GiB/3.9 GiB (113.23%) ETA: -15.472750741s
4.8 GiB/3.9 GiB (122.17%) ETA: -34.020138513s
failed to extract image "unexpected EOF"
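For reference, the negative ETA is what a simple remaining-bytes / rate estimate produces once the downloaded size exceeds the advertised total, as in the lines above where the download passes 100% of 3.9 GiB. A minimal Go sketch (illustrative only, not LocalAI's actual code; the ~50 MB/s rate is an assumption):

package main

import (
	"fmt"
	"time"
)

// etaFor estimates time remaining from bytes downloaded, the expected
// total size, and the average download rate in bytes per second.
// Once downloaded > total, "remaining" is negative and so is the ETA,
// matching the "ETA: -2.577751901s" style lines in the log.
func etaFor(downloaded, total int64, rate float64) time.Duration {
	remaining := float64(total - downloaded)
	return time.Duration(remaining / rate * float64(time.Second))
}

func main() {
	const gib int64 = 1 << 30
	total := int64(3.9 * float64(gib)) // size advertised for the backend image
	rate := 50e6                       // assumed ~50 MB/s, for illustration only

	for _, downloaded := range []int64{3 * gib, 4 * gib, 5 * gib} {
		pct := float64(downloaded) / float64(total) * 100
		fmt.Printf("%d GiB/%.1f GiB (%.2f%%) ETA: %v\n",
			downloaded/gib, float64(total)/float64(gib), pct,
			etaFor(downloaded, total, rate))
	}
}

This suggests the advertised image size is smaller than what is actually served, which would also be consistent with the extraction failing with "unexpected EOF".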
Additional context
LLMs download and install fine, but backends do not.
This could be related to #5693.