LLM Element
Element for AI/Language Model integration
The <llm> element enables AI/Language Model integration. It supports the following props:
| Prop | Type | Default |
| --- | --- | --- |
| id? | string | - |
| model | string | - |
| temperature? | number | - |
| includeChatHistory? | boolean | - |
| stopSequences? | string[] | - |
| topP? | number | - |
| toolChoice? | string | - |
| tools? | object[] | - |
| grammar? | object | - |
| repetitionPenalty? | number | - |
| responseFormat? | object \| string | - |
Allowed Children
prompt
: The prompt text
instructions
: System instructions
Examples
Basic Usage
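A minimal sketch, assuming an XML-style syntax; the model name is a placeholder, and only the required model prop and the prompt child are used:

```xml
<llm model="gpt-4o">
  <prompt>Summarize the following article in three sentences.</prompt>
</llm>
```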
With Instructions
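An illustrative sketch adding the instructions child to set system-level behavior; the model name and temperature value are placeholders:

```xml
<llm model="gpt-4o" temperature="0.3">
  <instructions>You are a concise technical assistant. Answer in plain English.</instructions>
  <prompt>Explain what a vector database is.</prompt>
</llm>
```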
Advanced Configuration
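A hedged sketch combining several documented props; the model name and the JSON-style encodings for stopSequences and responseFormat are assumptions, not part of the documented API:

```xml
<llm
  id="planner"
  model="gpt-4o"
  temperature="0.7"
  topP="0.9"
  repetitionPenalty="1.1"
  includeChatHistory="true"
  stopSequences='["END"]'
  responseFormat="json"
>
  <instructions>Return a JSON object with keys "steps" and "risks".</instructions>
  <prompt>Plan a migration from REST to gRPC for an existing service.</prompt>
</llm>
```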
Usage Notes
- Model selection determines available capabilities
- Temperature controls output randomness
- Instructions guide overall model behavior
- Prompts can be built dynamically
- Chat history and additional context can be included
Best Practices
- Model Selection
  - Choose appropriate model
  - Consider model capabilities
  - Balance cost and performance
  - Test model behavior
- Prompt Engineering
  - Write clear prompts
  - Include relevant context
  - Use consistent format
  - Handle edge cases
- Parameter Tuning (see the sketch after this list)
  - Adjust temperature for task
  - Set appropriate top-p
  - Configure stop sequences
  - Test different settings
- Error Handling
  - Handle model errors
  - Provide fallbacks
  - Log issues
  - Monitor performance
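As referenced under Parameter Tuning, a hedged sketch contrasting two sampling configurations built only from the documented props; the model names and attribute encodings are illustrative assumptions:

```xml
<!-- Deterministic extraction: low temperature, tight top-p, explicit stop sequence -->
<llm model="gpt-4o-mini" temperature="0" topP="0.1" stopSequences='["\n\n"]'>
  <prompt>Extract the invoice number from the text below.</prompt>
</llm>

<!-- Creative generation: higher temperature and top-p, no stop sequences -->
<llm model="gpt-4o" temperature="0.9" topP="0.95">
  <prompt>Write three alternative taglines for a hiking app.</prompt>
</llm>
```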