• z3rOR0ne@lemmy.ml · 5 months ago

    Nice. Thanks. I’ll save this post in case I use ollama in the future. Right now I use a codellama model and a mythomax model, but I’m not running them via a localhost server; I just get the output in the terminal or through LMStudio.
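
    For anyone else curious, this is roughly what querying ollama’s localhost server looks like — a minimal sketch, assuming ollama is serving on its default port (11434) and a codellama model has already been pulled; the prompt here is just a placeholder:

    ```python
    import json
    import urllib.request

    # Assumes `ollama serve` is running on the default port 11434
    # and `ollama pull codellama` has been done beforehand.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({
            "model": "codellama",
            "prompt": "Write a function that reverses a string.",
            "stream": False,  # ask for one JSON object instead of a token stream
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        # The generated text is in the "response" field of the reply.
        print(json.loads(resp.read())["response"])
    ```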

    This looks interesting though. Thanks!