1 Matching Annotations
  1. Dec 2024
    1. LM Studio can run LLMs locally (I have Llama and Phi installed). It also exposes an API via a localhost web server. I use that API to make Llama available in Obsidian through the Copilot plugin.

      This is the API documentation. #openvraag Are there other scripts / [[Persoonlijke tools 20200619203600]] I could use this in?
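
      As a starting point for wiring it into other scripts, here is a minimal sketch of calling that localhost API from Python. It assumes LM Studio's OpenAI-compatible server on its default port 1234; the model name is a placeholder for whichever model is loaded.

      ```python
      import json
      import urllib.request

      # LM Studio serves an OpenAI-compatible API on localhost
      # (default port 1234; the model name depends on what is loaded).
      API_URL = "http://localhost:1234/v1/chat/completions"

      def build_payload(prompt, model="llama-3.2-3b-instruct"):
          """Build an OpenAI-style chat-completion request body."""
          return {
              "model": model,
              "messages": [{"role": "user", "content": prompt}],
              "temperature": 0.7,
          }

      def ask(prompt):
          """Send the prompt to the local LM Studio server, return the reply text."""
          req = urllib.request.Request(
              API_URL,
              data=json.dumps(build_payload(prompt)).encode("utf-8"),
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(req) as resp:
              body = json.load(resp)
          return body["choices"][0]["message"]["content"]

      if __name__ == "__main__":
          print(ask("Summarize this note in one sentence."))
      ```

      Because the API mimics OpenAI's, the same pattern should work from any tool that lets you point an OpenAI client at a custom base URL.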