Locally run LLMs
I sometimes get twitchy about how much data I'm sharing with the models of a handful of US-based AI companies.
Enter Ollama.
Ollama makes it easy to get up and running with large language models locally on your computer. Open source, private, and definitely worth exploring.
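As a rough sketch of the workflow (assuming a macOS/Linux machine and using `llama3.2` purely as an example model name):

```shell
# Install Ollama via the official install script (see ollama.com for other options)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and chat with it in the terminal -- everything runs locally
ollama run llama3.2

# Or run Ollama as a background server exposing a local HTTP API
ollama serve
```

Models are downloaded once and then run entirely on your own hardware, so nothing you type leaves your machine.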
Ollama appeared in Issue 21 in the Useful section.