• Daemon Silverstein@calckey.world
    1 day ago

    @Sunshine@piefed.ca This list is missing many names. To mention some of the platforms I know of, there are also Qwen, QLM, and Kimi (Chinese), as well as Maritaca's Sabiá IA and Amazônia IA (Brazilian). There are also smaller language models (SLMs), often homebrewed by hobbyists, to be found on HuggingFace.

    Online platforms aside, self-hosted (offline) inference is the most private way to run LLMs, no matter who built the model (be it Llama from Meta, Gemma from Google, Mixtral, DeepSeek, or Qwen): the vendors can't really collect data from offline usage, especially if one goes as far as fully air-gapping their computer.