• MysteriousSophon21@lemmy.world
    1 month ago

    Ollama is also a great option for running Mistral models locally. It's super lightweight, and I've been running mistral-7b on my MacBook without issues. It even integrates nicely with Audiobookshelf if you're into that kind of self-hosted setup.
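
    For anyone curious, the setup is basically just this (assumes Ollama is already installed and its daemon is running; the exact model tag may differ, check the Ollama model library):

    ```shell
    # Pull the Mistral 7B weights (downloads a quantized build by default)
    ollama pull mistral

    # Chat with it interactively in the terminal
    ollama run mistral

    # Or hit the local REST API that Ollama serves on port 11434
    curl http://localhost:11434/api/generate \
      -d '{"model": "mistral", "prompt": "Hello", "stream": false}'
    ```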