Links
The author connects a 16 GB Mac Mini to a 64 GB MacBook Pro using LM Studio Link’s encrypted mesh VPN, offloading heavy model inference to the more powerful machine without exposing ports or tweaking firewalls. This setup lets you run large LLMs on low-RAM devices as if they were local, with no cloud or API key hassles.
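Because LM Studio serves an OpenAI-compatible HTTP API (port 1234 by default), the Mac Mini can talk to the MacBook Pro over the VPN with plain HTTP. A minimal sketch, assuming a placeholder VPN address (`10.0.0.2`) and a placeholder model name — neither comes from the article:

```python
import json
import urllib.request

# Hedged sketch: LM Studio exposes an OpenAI-compatible chat API
# (default port 1234). With the mesh VPN up, the MacBook Pro is
# reachable by its VPN address; "10.0.0.2" and the model name
# below are placeholders, not values from the article.
REMOTE = "http://10.0.0.2:1234/v1/chat/completions"

def build_request(prompt: str, model: str = "some-large-model") -> urllib.request.Request:
    """Build a chat-completion request aimed at the remote LM Studio server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        REMOTE,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarize this document.")
# No API key needed: the encrypted VPN link is the trust boundary.
# Uncomment to actually send the request from the low-RAM machine:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

From the client's point of view this is indistinguishable from a local model: only the host in `REMOTE` changes, which is what makes the setup feel "local" without any cloud service in the loop.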
Google’s Gemini macOS app runs on Apple Silicon and sits in your menu bar, ready with a global shortcut (Option + Space) for instant AI assistance. It can share your active window for context-aware answers, sync chat history across devices, and offers tools for image and video creation.