Configuration for LlmSummaryProvider

interface LlmSummaryProviderConfig {
    model: string;
    prompt?: string;
    extract?: {
        summary?: boolean;
        segments?: boolean;
        entities?: boolean;
        keyFacts?: boolean;
    };
    apiKey?: string;
    baseUrl?: string;
}

Properties

model: string

The model to use for summarization, given as a model name string (the declared type is `string`).

prompt?: string

Custom prompt for summarization (optional). If not provided, uses a sensible default.

extract?: {
    summary?: boolean;
    segments?: boolean;
    entities?: boolean;
    keyFacts?: boolean;
}

What to extract from the session.

Type declaration

  • Optional summary?: boolean

    Extract a summary (default: true)

  • Optional segments?: boolean

    Extract topic segments (default: true)

  • Optional entities?: boolean

    Extract named entities (default: true)

  • Optional keyFacts?: boolean

    Extract key facts (default: true)
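Since every field defaults to true, the extract object only needs to name the extractions you want to turn off. A minimal sketch (the field values here are illustrative, not library defaults beyond what is documented above):

```typescript
// Extract only a summary and key facts; skip topic segments
// and named entities. Omitted fields keep their default of true.
const extract = {
    segments: false,
    entities: false,
};

// Fields left undefined are treated as enabled.
const segmentsEnabled = extract.segments !== false;
```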

apiKey?: string

Optional: API key for the model provider. If not provided, the key is read from environment variables.

baseUrl?: string

Optional: Base URL for the model provider.
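Putting the properties together, a complete configuration might look like the following sketch. The model name, prompt, and base URL are hypothetical placeholders, not values taken from the library's documentation; the interface is reproduced from above so the snippet is self-contained.

```typescript
interface LlmSummaryProviderConfig {
    model: string;
    prompt?: string;
    extract?: {
        summary?: boolean;
        segments?: boolean;
        entities?: boolean;
        keyFacts?: boolean;
    };
    apiKey?: string;
    baseUrl?: string;
}

// Hypothetical example configuration. apiKey is omitted, so the
// provider would fall back to environment variables.
const config: LlmSummaryProviderConfig = {
    model: "gpt-4o-mini",                               // placeholder model name
    prompt: "Summarize this session in three sentences.",
    extract: { entities: false },                       // other extractions default to true
    baseUrl: "https://llm.example.com/v1",              // placeholder endpoint
};
```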