Call init_tracing() once at startup in your app, as the examples do.
Configure the LM
Tell DSRs which model to use. This sets a global default that all predictors will use:
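A minimal sketch of global configuration. The configure entry point and LM builder names here are illustrative assumptions, not verified against the crate:

```rust
use dsrs::{configure, LM};

#[tokio::main]
async fn main() {
    // Provider-prefixed model id; the builder/configure names are assumed.
    configure(LM::builder().model("openai:gpt-4o-mini").build());
}
```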
Set OPENAI_API_KEY in your environment. For other providers, use the appropriate prefix (e.g., anthropic:claude-3-haiku).

Define a signature
A signature declares your task's inputs and outputs. The doc comments become:
- Struct docstring → instruction for the LM
- Field docstrings → field descriptions in the prompt
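For example, a question-answering signature might look like this (a sketch; the struct and field names are illustrative, the attributes follow the text above):

```rust
use dsrs::Signature;

// The struct docstring becomes the LM instruction;
// field docstrings become field descriptions in the prompt.
/// Answer the question in one sentence.
#[derive(Signature)]
struct QA {
    /// The user's question.
    #[input]
    question: String,

    /// A concise, factual answer.
    #[output]
    answer: String,
}
```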
Call the LM
Create a predictor and call it.

The #[derive(Signature)] macro generates QAInput from your #[input] fields. You get back a QA struct with both input and output fields filled in; output.answer is a typed String.

Complete example
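A sketch of the full flow under assumed names (configure, Predict, and the call method are not verified against the crate); treat it as illustrative rather than copy-paste ready:

```rust
use dsrs::{configure, LM, Predict, Signature};

/// Answer the question in one sentence.
#[derive(Signature)]
struct QA {
    /// The user's question.
    #[input]
    question: String,
    /// A concise, factual answer.
    #[output]
    answer: String,
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    configure(LM::builder().model("openai:gpt-4o-mini").build());

    // `Predict::new` and the generated `QAInput` follow the text above;
    // the exact method names are assumptions.
    let predictor = Predict::<QA>::new();
    let output: QA = predictor
        .call(QAInput { question: "What is the capital of France?".into() })
        .await?;
    println!("{}", output.answer); // typed String
    Ok(())
}
```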
Next steps
- Signatures: define task contracts with typed inputs/outputs
- Custom Types: use your own structs and enums in signatures
- Predictors: call LMs with typed signatures, add demos
- Modules: compose multi-step pipelines
Adding complexity
Input formatting and rendering
Use #[format("json" | "yaml" | "toon")] for serialization, or #[render(jinja = "...")] for custom field text.
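For instance (a sketch; the Record type and field layout are made up for illustration):

```rust
/// Summarize the structured record.
#[derive(Signature)]
struct Summarize {
    /// The record to summarize, serialized as YAML in the prompt.
    #[input]
    #[format("yaml")]
    record: Record,

    /// A two-sentence summary.
    #[output]
    summary: String,
}
```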
See the full attribute reference in Signatures and runtime behavior in Adapter.
Custom types
When you need more than primitives, add #[BamlType]:
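A sketch of a custom enum used as a signature output (the type and field names are illustrative):

```rust
#[BamlType]
#[derive(Debug, Clone)]
enum Sentiment {
    Positive,
    Negative,
    Neutral,
}

/// Classify the sentiment of the text.
#[derive(Signature)]
struct Classify {
    /// The text to classify.
    #[input]
    text: String,
    /// The predicted sentiment.
    #[output]
    sentiment: Sentiment,
}
```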
Few-shot demos
Add examples to guide the LM.

Constraints
Validate outputs with #[check] and #[assert]:
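A sketch combining both features: #[check]/#[assert] on an output field, and a few-shot demo attached to the predictor. The constraint expression syntax and the demo-attachment method are assumptions, not verified against the crate:

```rust
/// Count the things mentioned in the question.
#[derive(Signature)]
struct Count {
    /// The question to answer.
    #[input]
    question: String,

    /// The count, as a non-negative integer.
    #[output]
    #[check("this >= 0")]    // soft validation; expression syntax assumed
    #[assert("this < 1000")] // hard validation; expression syntax assumed
    count: i64,
}

// A demo is a fully filled-in signature struct (input and output fields);
// `with_demo` is an illustrative method name.
let demo = Count {
    question: "How many legs does a spider have?".into(),
    count: 8,
};
let predictor = Predict::<Count>::new().with_demo(demo);
```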
