Configuration for LlmSummaryProvider

model
  The model to use for summarization. Can be a model name string or a BaseLlm instance.

prompt (optional)
  Custom prompt for summarization. If not provided, a sensible default is used.

extract (optional)
  What to extract from the session:
    summary?: boolean   Extract a summary (default: true)
    segments?: boolean  Extract topic segments (default: true)
    entities?: boolean  Extract named entities (default: true)
    keyFacts?: boolean  Extract key facts (default: true)

apiKey (optional)
  API key for the model provider. If not provided, environment variables are used.

baseUrl (optional)
  Base URL for the model provider.
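
A minimal sketch of what this configuration might look like in use. The interface shape below is an assumption reconstructed from the property docs above (it is not imported from the library), and the model name `"gpt-4o-mini"` is a placeholder:

```typescript
// Hypothetical shape of the config described above; property and type
// names follow the docs, but this is a sketch, not the library's export.
interface LlmSummaryProviderConfig {
  model: string; // or a BaseLlm instance in the real library
  prompt?: string; // custom summarization prompt; library default otherwise
  extract?: {
    summary?: boolean; // default: true
    segments?: boolean; // default: true
    entities?: boolean; // default: true
    keyFacts?: boolean; // default: true
  };
  apiKey?: string; // falls back to environment variables if omitted
  baseUrl?: string; // override the provider's default endpoint
}

// Example: summarize with a named model, skipping entity extraction.
// Fields left out (summary, segments, keyFacts) keep their defaults.
const config: LlmSummaryProviderConfig = {
  model: "gpt-4o-mini", // placeholder model name
  extract: { entities: false },
};
```

Because every `extract` flag defaults to true, a config only needs to name the extractions it wants to turn off.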