

Fedimin | Privacy Advocate | FOSS Lover | GenZ | Human
I ❤ Change
LocalAI is pretty good but resource-intensive. I ran it on a VPS in the past.
It’s fully open source and free (as in beer).
You should try https://cherry-ai.com/ … It’s the most advanced client out there. I personally use Ollama for running the models and the Mistral API for advanced tasks.
Many are switching to Nextcloud Talk.
Time to pin Odysee and PeerTube in my browser, even though I pay for YT Premium (only $2 per month here in BD).