sandeep

Reputation: 163

While running `pip install llama-cpp-python` on a Windows PC, the build fails with the error below:

  Creating directory "llava_shared.dir\Release".
  Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
  Creating directory "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\Release".
  Creating directory "llava_shared.dir\Release\llava_shared.tlog".
  InitializeBuildStatus:
    Creating "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/Developer/AppData/Local/Temp/pip-install-zras17p_/llama-cpp-python_665b020e55e347eb88a6a31baa320e9f/vendor/llama.cpp/examples/llava/CMakeLists.txt
  Link:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE /OUT:"C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\Release\llava.dll" /INCREMENTAL:NO /NOLOGO ....\src\Release\llama.lib ....\ggml\src\Release\ggml.lib "....\ggml\src\Release\ggml-cpu.lib" "....\ggml\src\Release\ggml-base.lib" kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed /PDB:"C:/Users/Developer/AppData/Local/Temp/tmpac6hkx1w/build/vendor/llama.cpp/examples/llava/Release/llava.pdb" /SUBSYSTEM:CONSOLE /TLBID:1 /DYNAMICBASE /NXCOMPAT /IMPLIB:"C:/Users/Developer/AppData/Local/Temp/tmpac6hkx1w/build/vendor/llama.cpp/examples/llava/Release/llava.lib" /MACHINE:X64 /machine:x64 /DLL C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
    Creating library C:/Users/Developer/AppData/Local/Temp/tmpac6hkx1w/build/vendor/llama.cpp/examples/llava/Release/llava.lib and object C:/Users/Developer/AppData/Local/Temp/tmpac6hkx1w/build/vendor/llama.cpp/examples/llava/Release/llava.exp
    llava_shared.vcxproj -> C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\Release\llava.dll
  FinalizeBuildStatus:
    Deleting file "llava_shared.dir\Release\llava_shared.tlog\unsuccessfulbuild".
    Touching "llava_shared.dir\Release\llava_shared.tlog\llava_shared.lastbuildstate".
  Done Building Project "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava_shared.vcxproj" (default targets).
  Project "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (1) is building "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (14) on node 1 (default targets).
  PrepareForBuild:
    Creating directory "llava_static.dir\Release".
    Structured output is enabled. The formatting of compiler diagnostics will reflect the error hierarchy. See https://aka.ms/cpp/structured-output for more details.
    Creating directory "llava_static.dir\Release\llava_static.tlog".
  InitializeBuildStatus:
    Creating "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
    Touching "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
  CustomBuild:
    Building Custom Rule C:/Users/Developer/AppData/Local/Temp/pip-install-zras17p_/llama-cpp-python_665b020e55e347eb88a6a31baa320e9f/vendor/llama.cpp/examples/llava/CMakeLists.txt
  Lib:
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.43.34808\bin\HostX64\x64\Lib.exe /OUT:"C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib" /NOLOGO /MACHINE:X64 /machine:x64 C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.dir\Release\llava.obj C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.dir\Release\clip.obj
    llava_static.vcxproj -> C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\Release\llava_static.lib
  FinalizeBuildStatus:
    Deleting file "llava_static.dir\Release\llava_static.tlog\unsuccessfulbuild".
    Touching "llava_static.dir\Release\llava_static.tlog\llava_static.lastbuildstate".
  Done Building Project "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava_static.vcxproj" (default targets).
  Done Building Project "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (default targets) -- FAILED.

  Build FAILED.

  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj" (default target) (6) ->
  (ClCompile target) ->
    C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\ggml\src\ggml-base.vcxproj]


  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\ggml\src\ggml.vcxproj" (default target) (5) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj" (default target) (7) ->
    C:\Program Files (x86)\Windows Kits\10\Include\10.0.22621.0\ucrt\assert.h(21,9): warning C4005: 'static_assert': macro redefinition [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\ggml\src\ggml-cpu.vcxproj]


  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\src\llama.vcxproj" (default target) (8) ->
    C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\VC\Tools\MSVC\14.43.34808\include\type_traits(1706,98): warning C4244: 'argument': conversion from 'unsigned __int64' to 'int', possible loss of data [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\src\llama.vcxproj]


  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj" (default target) (4) ->
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\examples\llava\clip.cpp(1131,9): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\examples\llava\clip.cpp(1586,13): warning C4297: 'clip_model_load': function assumed not to throw an exception but does [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj]
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\examples\llava\clip.cpp(2820,5): warning C4297: 'clip_n_mmproj_embd': function assumed not to throw an exception but does [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\examples\llava\llava.vcxproj]


  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\ALL_BUILD.vcxproj" (default target) (1) ->
  "C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\common\common.vcxproj" (default target) (9) ->
  (ClCompile target) ->
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\common\log.cpp(28,79): error C2039: 'system_clock': is not a member of 'std::chrono' [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\common\common.vcxproj]
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\common\log.cpp(28,79): error C3083: 'system_clock': the symbol to the left of a '::' must be a type [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\common\common.vcxproj]
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\common\log.cpp(28,93): error C2039: 'now': is not a member of 'std::chrono' [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\common\common.vcxproj]
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\common\log.cpp(28,93): error C3861: 'now': identifier not found [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\common\common.vcxproj]
    C:\Users\Developer\AppData\Local\Temp\pip-install-zras17p_\llama-cpp-python_665b020e55e347eb88a6a31baa320e9f\vendor\llama.cpp\common\log.cpp(28,25): error C2672: 'std::chrono::duration_cast': no matching overloaded function found [C:\Users\Developer\AppData\Local\Temp\tmpac6hkx1w\build\vendor\llama.cpp\common\common.vcxproj]

      6 Warning(s)
      5 Error(s)

  Time Elapsed 00:01:22.47


  *** CMake build failed
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)

Upvotes: 0

Views: 183

Answers (1)

Janzert

Reputation: 506

This is apparently due to a breaking change in MSVC (see llama-cpp-python#1942): the 14.43 toolset no longer transitively provides the `<chrono>` declarations that `common/log.cpp` relied on, hence the `std::chrono::system_clock` errors. The fix has already been applied upstream in llama.cpp (llama.cpp#11836), but llama-cpp-python hasn't picked it up yet.
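Until a release with the upstream fix ships, one workaround discussed in the linked issues is to build against the previous MSVC toolset (14.42) instead of the broken 14.43, by passing CMake's Visual Studio toolset selection through the `CMAKE_ARGS` environment variable that llama-cpp-python's build reads. This is a sketch, not a verified recipe: the exact toolset version (`14.42` here) depends on which optional MSVC components you have installed via the Visual Studio Installer.

```shell
:: Windows cmd. First install the optional component
:: "MSVC v143 - VS 2022 C++ x64/x86 build tools (v14.42)" in the VS Installer.
:: Then tell CMake's Visual Studio generator to use that toolset version:
set CMAKE_ARGS=-T v143,version=14.42
:: --no-cache-dir forces a fresh source build instead of reusing a failed one
pip install --no-cache-dir llama-cpp-python
```

Alternatively, if a prebuilt wheel exists for your Python version and platform, pip skips the source build entirely; `pip install llama-cpp-python --only-binary :all:` will fail fast if no such wheel is available, which tells you quickly whether that route is open.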

Upvotes: 0
