- cross-posted to:
- aicompanions@lemmy.world
But in all fairness, it’s really llama.cpp that supports AMD.
Now looking forward to the Vulkan support!