Error: Building wheel for llama-cpp-python

error: subprocess-exited-with-error

× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [128 lines of output]

This solution worked in my case. My system is Linux, with no GPU.

pip install --no-cache-dir llama-cpp-python==0.2.85 --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu122
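Note that the index URL above (`.../whl/cu122`) serves prebuilt CUDA 12.2 wheels, which sidesteps the local compile that was failing. For a CPU-only machine like mine, the llama-cpp-python README also documents a CPU wheel index; a sketch of that alternative (same pinned version, index path taken from the project README) would be:

```shell
# Alternative for CPU-only machines: pull a prebuilt CPU wheel
# instead of the CUDA (cu122) index, so nothing is compiled locally.
# (Index path as documented in the llama-cpp-python README.)
pip install --no-cache-dir llama-cpp-python==0.2.85 \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu

# If you do want to build from source instead, make sure a C/C++
# toolchain and CMake are installed first (Debian/Ubuntu example):
sudo apt install -y build-essential cmake
```

The `--no-cache-dir` flag matters in both variants: it stops pip from reusing a previously failed build artifact from its cache.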

GitHub link

If that doesn't work, also consider this solution: GitHub link