After creating a Prompt Template, you are taken to its details screen, where you configure the instructions for the LLM. For inspiration, explore the sample Prompt Templates delivered with the Magnet AI Admin Panel, or search online for prompt-writing tips.
The Prompt Template details screen is split into the header and three tabs:
Header:
- Name
- Description
- System name. The system name is used to identify the Prompt Template when configuring RAG Tools or calling it via the API, so it is strongly recommended not to change it after the Prompt Template has been created; see the example call below.
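
For illustration only, a call that runs a Prompt Template by its system name might look like the following sketch. The endpoint path, payload fields, and authentication scheme are assumptions, not the documented Magnet AI API, so check the actual API reference before relying on them.

```python
import requests

# Hypothetical values -- the base URL, endpoint path, auth header, and
# payload shape below are assumptions used purely to illustrate how the
# system name identifies the Prompt Template in an API call.
MAGNET_AI_BASE_URL = "https://your-magnet-ai-host/api"
API_KEY = "YOUR_API_KEY"

response = requests.post(
    f"{MAGNET_AI_BASE_URL}/prompt-templates/execute",  # assumed endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},    # assumed auth scheme
    json={
        # The system name is the stable identifier, which is why changing
        # it after creation would break integrations like this one.
        "systemName": "my-prompt-template",
        "input": "Text for the Prompt Template to process",
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```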

Prompt Template tab:
- Prompt Template body (instructions for the LLM)
- Category

Advanced Settings tab:
- LLM Model (Azure OpenAI GPT-4o by default)
- Upcoming feature - JSON mode (for OpenAI models)
- Temperature (controls the randomness of the LLM output; lower values make responses more focused and deterministic, higher values more varied)
- Top P (controls the diversity of the LLM output via nucleus sampling; the model only samples from the most probable tokens whose cumulative probability stays within Top P)
- Max tokens (limits the length of the output). Useful when the length of the LLM output must be strictly limited, for example, to make it return only "true" or "false". See the sketch after this list for how these settings map onto a direct model API call.
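
Magnet AI applies these settings for you, but as a rough guide to what they control, the sketch below shows the equivalent parameters on a direct Azure OpenAI chat completions call. The endpoint, API key, API version, and deployment name are placeholders, not values from Magnet AI.

```python
from openai import AzureOpenAI

# Placeholder credentials -- substitute your own Azure OpenAI resource values.
client = AzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com",
    api_key="YOUR_AZURE_OPENAI_KEY",
    api_version="2024-06-01",
)

completion = client.chat.completions.create(
    model="gpt-4o",  # deployment name of the GPT-4o model
    messages=[
        {"role": "system", "content": "Answer with only 'true' or 'false'."},
        {"role": "user", "content": "Is 17 a prime number?"},
    ],
    temperature=0.2,  # lower = more deterministic output
    top_p=0.9,        # nucleus sampling cutoff
    max_tokens=2,     # hard cap on output length, forcing a one-word answer
)
print(completion.choices[0].message.content)
```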

Notes tab:
- Notes input - a field for storing input (context) that can be used to test the Prompt Template. Think of it as a notepad where you can keep your test inputs and simply copy them into the Preview panel.
