
error: failed to run custom build command for `llama-cpp-server v0.27.0-dev.0` #4055

Open
adamz01h opened this issue Mar 23, 2025 · 2 comments

Comments

@adamz01h

cargo run --features rocm serve --model TabbyML/DeepSeekCoder-1.3B --chat-model TabbyML/Yi-Coder-9B-Chat --device rocm

Caused by:
  process didn't exit successfully: `/home/adamz01h/tabby/target/debug/build/llama-cpp-server-06cfc11d00511532/build-script-build` (exit status: 101)
  --- stdout
  CMAKE_TOOLCHAIN_FILE_x86_64-unknown-linux-gnu = None
  CMAKE_TOOLCHAIN_FILE_x86_64_unknown_linux_gnu = None
  HOST_CMAKE_TOOLCHAIN_FILE = None
  CMAKE_TOOLCHAIN_FILE = None
  CMAKE_GENERATOR_x86_64-unknown-linux-gnu = None
  CMAKE_GENERATOR_x86_64_unknown_linux_gnu = None
  HOST_CMAKE_GENERATOR = None
  CMAKE_GENERATOR = None
  CMAKE_PREFIX_PATH_x86_64-unknown-linux-gnu = None
  CMAKE_PREFIX_PATH_x86_64_unknown_linux_gnu = None
  HOST_CMAKE_PREFIX_PATH = None
  CMAKE_PREFIX_PATH = None
  CMAKE_x86_64-unknown-linux-gnu = None
  CMAKE_x86_64_unknown_linux_gnu = None
  HOST_CMAKE = None
  CMAKE = None
  running: cd "/home/adamz01h/tabby/target/debug/build/llama-cpp-server-8aa09dc25aaedde8/out/build" && CMAKE_PREFIX_PATH="" "cmake" "/home/adamz01h/tabby/crates/llama-cpp-server/./llama.cpp" "-DGGML_NATIVE=OFF" "-DGGML_NATIVE_DEFAULT=OFF" "-DBUILD_SHARED_LIBS=OFF" "-DGGML_HIPBLAS=ON" "-DCMAKE_C_COMPILER=/opt/rocm/llvm/bin/clang" "-DCMAKE_CXX_COMPILER=/opt/rocm/llvm/bin/clang++" "-DAMDGPU_TARGETS=gfx803;gfx900;gfx906:xnack-;gfx908:xnack-;gfx90a:xnack+;gfx90a:xnack-;gfx940;gfx941;gfx942;gfx1010;gfx1012;gfx1030;gfx1031;gfx1100;gfx1101;gfx1102;gfx1103" "-DCMAKE_INSTALL_PREFIX=/home/adamz01h/tabby/target/debug/build/llama-cpp-server-8aa09dc25aaedde8/out" "-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_ASM_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_ASM_COMPILER=/usr/bin/cc" "-DCMAKE_BUILD_TYPE=Release"

  --- stderr
  CMake Warning:
    Ignoring extra path from command line:

     "/home/adamz01h/tabby/crates/llama-cpp-server/./llama.cpp"


  CMake Error: The source directory "/home/adamz01h/tabby/crates/llama-cpp-server/llama.cpp" does not appear to contain CMakeLists.txt.
  Specify --help for usage, or press the help button on the CMake GUI.
  thread 'main' panicked at /root/.cargo/registry/src/index.crates.io-6f17d22bba15001f/cmake-0.1.50/src/lib.rs:1098:5:
  command did not execute successfully, got: exit status: 1

  build script failed, must exit now
  note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
warning: build failed, waiting for other jobs to finish...
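
The CMake error above says /home/adamz01h/tabby/crates/llama-cpp-server/llama.cpp contains no CMakeLists.txt, i.e. the vendored llama.cpp source tree is missing or empty. Assuming llama.cpp is pulled in as a git submodule (an assumption suggested by the path layout, not confirmed in this thread), a quick way to check:

  # Check for the vendored llama.cpp sources; paths are taken from the log above.
  ls /home/adamz01h/tabby/crates/llama-cpp-server/llama.cpp/CMakeLists.txt
  # A leading '-' in the status output means the submodule was never initialized.
  git -C /home/adamz01h/tabby submodule status crates/llama-cpp-server/llama.cpp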
@adamz01h
Author

I was able to copy the build files from a previous clone into this directory, which let me move forward and complete the compile. I would look at the process for pulling and building llama-server.
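
Copying sources from an older clone works as a stopgap, but if llama.cpp is vendored as a git submodule (again an assumption based on the crates/llama-cpp-server/llama.cpp path), the usual way to pull it is:

  # Fetch the vendored llama.cpp sources (including nested submodules), then rebuild.
  cd ~/tabby
  git submodule update --init --recursive
  cargo run --features rocm serve --model TabbyML/DeepSeekCoder-1.3B --chat-model TabbyML/Yi-Coder-9B-Chat --device rocm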

@wsxiaoys
Member

The ROCm build for llama.cpp has been broken for a while. Feel free to explore and let us know your progress. Thanks!
