I have experience running servers, but I'd like to know if it's feasible. I just need a private, GPT-3.5-like LLM running.
Look into Ollama. It shouldn't be an issue if you stick to 7B-parameter models.
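
For example, here's a minimal sketch of querying a locally running Ollama server from Python over its default REST API (port 11434), assuming a 7B model such as `mistral` has already been pulled with `ollama pull mistral`:

```python
import json
import urllib.request

# Assumes Ollama is running locally on its default port (11434) and that a
# 7B model such as "mistral" has already been pulled with `ollama pull mistral`.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def ask(prompt: str, model: str = "mistral") -> str:
    """Send a single chat message to the local Ollama server and return the reply."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # get the full response back as one JSON object
    }
    req = urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize what a 7B-parameter LLM is good at."))
```

Everything stays on your own machine, so nothing leaves the server unless you expose that port yourself.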