lmstud-yo
Ask your local LM Studio server, right inside Obsidian.
A tight modal that talks to a locally running LM Studio server. It works with Granite, Phi-4, Gemma 3, Llama 3.2, or any custom model you're running. Streaming, image-aware, with a system-prompt textarea and a temperature slider for when you want to dial it in.
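The core loop is a sketch away: LM Studio serves an OpenAI-compatible API on `http://localhost:1234/v1` by default, so streaming a reply is a `fetch` with `stream: true` plus a little SSE parsing. The snippet below is a minimal illustration under those assumptions, not the plugin's actual code; the `model` string is a placeholder (LM Studio answers with whichever model is loaded), and `streamChat`/`parseSSE` are hypothetical helper names.

```typescript
type StreamChunk = { choices: { delta: { content?: string } }[] };

// Pull the text deltas out of one SSE buffer: each event is a line of
// the form `data: {...}`, and the stream ends with `data: [DONE]`.
export function parseSSE(buffer: string): string[] {
  const out: string[] = [];
  for (const line of buffer.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;
    const chunk = JSON.parse(payload) as StreamChunk;
    const text = chunk.choices[0]?.delta?.content;
    if (text) out.push(text);
  }
  return out;
}

// Fire a streaming chat request and hand each text delta to `onToken`,
// which the modal can append to the note as tokens arrive.
export async function streamChat(
  prompt: string,
  onToken: (token: string) => void,
  baseUrl = "http://localhost:1234/v1", // LM Studio's default local port
): Promise<void> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // placeholder; the loaded model responds
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const token of parseSSE(decoder.decode(value))) onToken(token);
  }
}
```

In a plugin, `onToken` would write into the active editor so the response lands in the note incrementally instead of all at once.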
Recent ship notes
- LM Studio writes to file — Lmstud-Yo's first complete loop
  Lmstud-Yo went from `init(project)` to `works(response)` in roughly three hours on the night of July 27–28, 2025: the first time a query went out to a locally running LM Studio server, came back, and landed in an Obsidian note as actual text. Logged retrospectively from 2026.
- Created a new Obsidian Community Plugin for LM Studio