I have experience running servers, but I would like to know if it's possible; I just need a private, GPT-3.5-like LLM running.

  • MasterNerd@lemm.ee · 5 months ago

    Look into Ollama. It shouldn't be an issue if you stick to 7B-parameter models. A sketch of querying a local Ollama server is below.
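
    Once Ollama is installed and a model is pulled (e.g. `ollama pull mistral:7b`), it serves a local HTTP API on port 11434 by default. A minimal sketch of querying it from Python; the model name `mistral:7b` is just an example of a 7B model, not a requirement:

    ```python
    import requests

    # Query a locally running Ollama server (default port 11434).
    # "mistral:7b" is an example 7B model; use whatever you've pulled.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "mistral:7b",
            "prompt": "Explain in one sentence what a self-hosted LLM is.",
            "stream": False,  # return a single JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])
    ```

    With `"stream": False` the whole completion comes back in one response object, which keeps the example simple; for interactive use you'd typically stream tokens instead.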