Your Macs can do more than you think.
Run open-source AI models on Apple Silicon. Got more than one Mac? mlxstudio combines them — no setup, no cloud.
Free. 10 MB — not 200. Apple Silicon only.
mlxstudio
Today
that segfault in audio
help me grok this codebase
why is my loss plateauing
Yesterday
rewrite the ingestion stuff
compare 8B vs 70B for cod...
Llama-3-70B-4bit
how does it split the model across both my macs?
Each Mac gets a slice of the layers: your Mac Mini handles layers 0–39, your MacBook Pro takes 40–79. They talk over your local network, so each one processes its chunk and passes the result to the next.
Ask anything...
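The split described in the demo answer above can be sketched in a few lines of Python. This is an illustrative even-split policy, not mlxstudio's actual code; the function name and the two-node example are assumptions for the sketch.

```python
# Minimal sketch of layer-wise pipeline partitioning, as in the demo above.
# The even-split policy and names here are illustrative, not mlxstudio's API.

def partition_layers(n_layers, n_nodes):
    """Split n_layers into contiguous (first, last) slices, one per node."""
    base, extra = divmod(n_layers, n_nodes)
    ranges, start = [], 0
    for i in range(n_nodes):
        size = base + (1 if i < extra else 0)  # spread any remainder evenly
        ranges.append((start, start + size - 1))
        start += size
    return ranges

# An 80-layer model split across a Mac Mini and a MacBook Pro:
print(partition_layers(80, 2))  # [(0, 39), (40, 79)]
```

Each node then runs only its slice, sending activations to the next node over the local network after processing its chunk.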