AI SDK integration that accepts a pre-configured LanguageModel. Enables ADK to work with any provider supported by Vercel's AI SDK.

Hierarchy

Constructors

Properties

logger: Logger = ...
model: string

The name of the LLM, e.g. gemini-2.5-flash or gemini-2.5-flash-001.

Methods

  • Generates content from the given contents and tools.

    Parameters

    • llmRequest: LlmRequest

      LlmRequest, the request to send to the LLM.

    • Optional stream: boolean

      Defaults to false; whether to make a streaming call.

    Returns AsyncGenerator<LlmResponse, void, unknown>

    An async generator of LlmResponse.

    For a non-streaming call, it yields exactly one LlmResponse.

    For a streaming call, it may yield more than one response; all yielded responses should be treated as a single response by merging their parts lists.
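The merging behavior described above for streaming calls can be sketched as follows. The `Part` and `StreamedResponse` shapes here are simplified assumptions for illustration, not the actual ADK `LlmResponse` type:

```typescript
// Simplified stand-ins for the real ADK response types (assumption).
interface Part {
  text?: string;
}

interface StreamedResponse {
  parts: Part[];
  partial?: boolean;
}

// Treat all streamed chunks as one response by concatenating their
// parts lists, per the contract described in the return-value docs.
function mergeResponses(chunks: StreamedResponse[]): StreamedResponse {
  return {
    parts: chunks.flatMap((c) => c.parts),
    partial: false,
  };
}

// Example: two partial chunks merged into a single final response.
const merged = mergeResponses([
  { parts: [{ text: "Hel" }], partial: true },
  { parts: [{ text: "lo" }], partial: true },
]);
console.log(merged.parts.map((p) => p.text).join("")); // "Hello"
```

A non-streaming call skips this step entirely, since the generator yields a single complete response.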