Ollama adopts MLX for faster AI performance on Apple silicon Macs

One of the best tools for running AI models locally on a Mac just got even better. Here’s why, and how to try it out.