
lmstud-yo

Beta · Local AI · LM Studio

Ask your local LM Studio server, right inside Obsidian.

View on GitHub →


A tight modal that talks to a locally running LM Studio server: Granite, Phi-4, Gemma-3, Llama-3.2, or any custom model you're running. Streaming and image-aware, with a system-prompt textarea and a temperature slider for when you want to dial it in.
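Under the hood, LM Studio's local server speaks the OpenAI-compatible chat-completions API (by default on port 1234), so a request like the one the plugin sends can be sketched as below. This is a minimal illustration, not the plugin's actual code: the model name, prompts, and `buildChatRequest` helper are hypothetical, while the field names (`model`, `messages`, `temperature`, `stream`) follow the chat-completions schema LM Studio serves.

```typescript
interface ChatMessage {
  role: "system" | "user";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  temperature: number;
  stream: boolean;
}

// Hypothetical helper: assemble the JSON body for a chat-completions call.
function buildChatRequest(
  model: string,
  systemPrompt: string,
  userPrompt: string,
  temperature: number,
): ChatRequest {
  const messages: ChatMessage[] = [];
  if (systemPrompt.trim() !== "") {
    // A system-prompt textarea maps to a leading "system" message.
    messages.push({ role: "system", content: systemPrompt });
  }
  messages.push({ role: "user", content: userPrompt });
  // stream: true asks the server to send the reply as incremental chunks.
  return { model, messages, temperature, stream: true };
}

const body = buildChatRequest("gemma-3-4b", "Be terse.", "Summarize this note.", 0.4);
// POST to the local server, e.g.:
// fetch("http://localhost:1234/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
```

The temperature slider and model picker in the modal would map straight onto the `temperature` and `model` fields of this body.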

Recent ship notes