LLM Studio

Go deeper: API calls, streaming, system prompts, and building with local models

Prerequisite: Complete Lab 12: Local LLM Setup first. You need Ollama installed and at least one model pulled.
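As a preview of what this lab covers, here is a minimal sketch of calling a local Ollama server programmatically with a system prompt and streaming enabled. It assumes Ollama is running on its default port (11434) and that the model name "llama3" has been pulled; substitute whichever model you pulled in Lab 12.

```python
# Minimal sketch: streaming a chat completion from a local Ollama server.
# Assumes Ollama is running on its default port and a model named "llama3"
# has been pulled -- substitute your own model name.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"

def build_request(user_prompt, system_prompt="You are a helpful assistant."):
    """Build the JSON payload for Ollama's /api/chat endpoint."""
    return {
        "model": "llama3",   # assumed model name
        "stream": True,      # server replies with newline-delimited JSON
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
    }

def parse_chunk(line):
    """Each streamed line is one JSON object; return its text fragment."""
    chunk = json.loads(line)
    return chunk.get("message", {}).get("content", "")

def stream_chat(user_prompt):
    """Send a chat request and print the reply token-by-token as it arrives."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(user_prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:  # one JSON object per line while streaming
            print(parse_chunk(line), end="", flush=True)
    print()

if __name__ == "__main__":
    stream_chat("Explain what a system prompt does, in one sentence.")
```

Because `stream: true` is set, the server sends the response as newline-delimited JSON objects rather than one final payload, which is why `stream_chat` reads the response line by line.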

Lab Complete

You can now build with local LLMs programmatically. Try integrating Ollama into your smart classroom project!