
 ====================================================
      HiveMind - Decentralized P2P AI
 ====================================================

 Starting HiveMind...
 Logs: C:\Users\chris\AppData\Local\HiveMind\logs\

18:03:26 [INFO] hivemind: HiveMind started (command=run, config=C:\Users\chris\AppData\Local\HiveMind\config.yaml)
18:03:26 [DEBUG] asyncio: Using proactor: IocpProactor
HiveMind v1.2.1
  Config: C:\Users\chris\AppData\Local\HiveMind\config.yaml
  Model:  models/Qwen2.5-7B-Instruct-Q4_K_M.gguf

18:03:26 [INFO] hivemind.assistant: Projects: 0 loaded.
18:03:26 [INFO] hivemind.node: Starting node: Christian (ffdc2665) v1.2.1
18:03:26 [CRITICAL] hivemind: Uncaught exception:
Traceback (most recent call last):
  File "C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\_ctypes_extensions.py", line 67, in load_shared_library
    return ctypes.CDLL(str(lib_path), **cdll_args)  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\Python312\Lib\ctypes\__init__.py", line 379, in __init__
    self._handle = _dlopen(self._name, mode)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: Could not find module 'C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\lib\llama.dll' (or one of its dependencies). Try using the full path with constructor syntax.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\chris\AppData\Local\HiveMind\hivemind\cli.py", line 359, in <module>
    main()
  File "C:\Users\chris\AppData\Local\HiveMind\hivemind\cli.py", line 353, in main
    asyncio.run(run_node(args))
  File "C:\Program Files\Python312\Lib\asyncio\runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Program Files\Python312\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files\Python312\Lib\asyncio\base_events.py", line 686, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "C:\Users\chris\AppData\Local\HiveMind\hivemind\cli.py", line 103, in run_node
    await node.start()
  File "C:\Users\chris\AppData\Local\HiveMind\hivemind\node.py", line 180, in start
    self.model.load()
  File "C:\Users\chris\AppData\Local\HiveMind\hivemind\model.py", line 62, in load
    from llama_cpp import Llama
  File "C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\llama_cpp.py", line 38, in <module>
    _lib = load_shared_library(_lib_base_name, _base_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\_ctypes_extensions.py", line 69, in load_shared_library
    raise RuntimeError(f"Failed to load shared library '{lib_path}': {e}")
RuntimeError: Failed to load shared library 'C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\lib\llama.dll': Could not find module 'C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\lib\llama.dll' (or one of its dependencies). Try using the full path with constructor syntax.

 [!] HiveMind exited with error code 1.
 Check logs in: C:\Users\chris\AppData\Local\HiveMind\logs\
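One frequent cause of this class of failure on Windows: since Python 3.8, `ctypes` no longer resolves a DLL's dependencies via `PATH`; extra search directories must be registered with `os.add_dll_directory` before the import. A hedged sketch of that workaround (the directory path and helper name are assumptions for illustration, not HiveMind code):

```python
import importlib
import os
import sys

def import_llama_with_dll_dir(lib_dir: str):
    """Register lib_dir for Windows DLL resolution, then retry the
    import that failed in the traceback. Returns the module on
    success, None on failure."""
    if sys.platform == "win32" and os.path.isdir(lib_dir):
        os.add_dll_directory(lib_dir)
    try:
        return importlib.import_module("llama_cpp")
    except (ImportError, OSError, RuntimeError) as exc:
        print(f"llama_cpp still fails to import: {exc}")
        return None

if __name__ == "__main__":
    # Directory taken from the traceback above; adjust to your install.
    import_llama_with_dll_dir(
        r"C:\Users\chris\AppData\Local\HiveMind\.venv\Lib\site-packages\llama_cpp\lib"
    )
```

If the retry still fails, reinstalling the wheel (e.g. `pip install --force-reinstall --no-cache-dir llama-cpp-python`) or installing the Microsoft Visual C++ Redistributable are the usual next steps.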