If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?

Thanks for any recommendations in advance.
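For context, here is a minimal sketch of what a scripted setup could look like, assuming the llama-cpp-python bindings are installed (e.g. in a Termux environment) rather than the raw llama.cpp CLI the question mentions. The model path and parameter values are placeholders, not recommendations:

```python
# Illustrative only: drive a local GGUF model through the llama-cpp-python
# bindings. Paths and parameter values are hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="/sdcard/models/model-q4_k_m.gguf",  # placeholder path to a quantized GGUF file
    n_ctx=2048,    # context window; smaller values are kinder to phone RAM
    n_threads=4,   # roughly match the device's performance cores
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a private personal assistant."},
        {"role": "user", "content": "Summarise my day in one sentence."},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```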

  • haui@lemmy.giftedmc.com · 15 days ago

    Running an LLM on a phone will absolutely destroy your battery life. It is also imperative that you understand that the comfort of AI is bought with the killing of innocents (by expediting climate catastrophe and the exploitation of the planet and the poorest on it).

    I think experimenting with AI on a home server that already exists wouldn't be problematic IN A VACUUM, but you would still be normalizing the use of a technology that is morally corrupt.

        • Fisch@discuss.tchncs.de · 15 days ago

          I’m not even trying to argue against you; I’m arguing for veganism. The same arguments you used for why the use of AI is bad can be used for why not being vegan is bad. The production of animal products has an even bigger impact.