Class SpringAILLMProvider

java.lang.Object
com.vaadin.flow.component.ai.provider.SpringAILLMProvider
All Implemented Interfaces:
LLMProvider

public class SpringAILLMProvider extends Object implements LLMProvider
Spring AI implementation of LLMProvider.

Supports both streaming and non-streaming Spring AI models. Tool calling is supported through Spring AI's @Tool annotation.

Streaming vs. non-streaming: Streaming is enabled by default. To disable it, call setStreaming(false). Streaming mode pushes partial responses to the UI as they arrive, which requires server push to be enabled. Annotate your UI class or application shell with @Push, or configure push programmatically, before using streaming mode. A warning is logged at runtime if push is not enabled.
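A minimal sketch of enabling push for streaming mode (the shell class name is illustrative; any class implementing AppShellConfigurator can carry the annotation):

```java
import com.vaadin.flow.component.page.AppShellConfigurator;
import com.vaadin.flow.component.page.Push;

// Enable server push at the application-shell level so that
// partial streaming responses can be pushed to the browser
// as they arrive from the LLM.
@Push
public class AppShell implements AppShellConfigurator {
}
```

If push cannot be enabled in your deployment, call setStreaming(false) on the provider so each response is delivered as a single complete message instead.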

Each provider instance maintains its own chat memory. To share conversation history across components, reuse the same provider instance. History restoration (via setHistory(List, Map)) is only supported when using the SpringAILLMProvider(ChatModel) constructor; the SpringAILLMProvider(ChatClient) constructor does not provide access to the internal chat memory.

Note: SpringAILLMProvider is not serializable. If your application uses session persistence, you will need to create a new provider instance after session restore.

Author:
Vaadin Ltd
  • Constructor Details

    • SpringAILLMProvider

      public SpringAILLMProvider(org.springframework.ai.chat.model.ChatModel chatModel)
      Constructor with a chat model.
      Parameters:
      chatModel - the chat model, not null
      Throws:
      NullPointerException - if chatModel is null
    • SpringAILLMProvider

      public SpringAILLMProvider(org.springframework.ai.chat.client.ChatClient chatClient)
      Constructor with a chat client. Note: When using this constructor, conversation memory must be configured externally in the ChatClient.
      Parameters:
      chatClient - the chat client, not null
      Throws:
      NullPointerException - if chatClient is null
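The difference between the two constructors can be sketched as follows (assuming chatModel is an injected Spring AI ChatModel bean; the variable names are illustrative):

```java
import org.springframework.ai.chat.client.ChatClient;
import com.vaadin.flow.component.ai.provider.SpringAILLMProvider;

// With a ChatModel: the provider manages its own chat memory,
// so setHistory(...) is supported.
SpringAILLMProvider fromModel = new SpringAILLMProvider(chatModel);

// With a ChatClient: conversation memory must be configured in the
// client itself (for example via a memory advisor on the builder);
// setHistory(...) is NOT supported for this provider instance.
ChatClient client = ChatClient.builder(chatModel).build();
SpringAILLMProvider fromClient = new SpringAILLMProvider(client);
```

Prefer the ChatModel constructor when you need history restoration or want the provider to own the conversation memory; use the ChatClient constructor when you already have a fully configured client elsewhere in your application.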
  • Method Details

    • stream

      public reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
      Description copied from interface: LLMProvider
      Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
      Specified by:
      stream in interface LLMProvider
      Parameters:
      request - the LLM request containing user message, system prompt, attachments, and tools, not null
      Returns:
      a Flux stream that emits response tokens as strings, never null
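Consuming the stream can be sketched as below. Construction of the LLMProvider.LLMRequest is omitted here; the request variable stands in for a request built by the calling component:

```java
import reactor.core.publisher.Flux;

// 'provider' and 'request' are assumed to exist in scope.
Flux<String> tokens = provider.stream(request);

tokens.subscribe(
        token -> System.out.print(token),  // each partial response chunk
        Throwable::printStackTrace,        // the stream terminated with an error
        () -> System.out.println()         // the stream completed normally
);
```

In a Vaadin view, UI updates made from these callbacks must go through UI.access(...), since the tokens arrive on a reactive scheduler thread rather than the request thread.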
    • setStreaming

      public void setStreaming(boolean streaming)
      Sets whether to use streaming mode. The default is true.
      Parameters:
      streaming - true to use streaming mode, false for non-streaming.
    • setHistory

      public void setHistory(List<ChatMessage> history, Map<String,List<AIAttachment>> attachmentsByMessageId)
      Description copied from interface: LLMProvider
      Restores the provider's conversation memory from a list of chat messages with their associated attachments. Any existing memory is cleared before the new history is applied.

      Providers that support setting chat history should override this method.

      This method must not be called while a streaming response is in progress.

      Specified by:
      setHistory in interface LLMProvider
      Parameters:
      history - the list of chat messages to restore, not null
      attachmentsByMessageId - a map from ChatMessage.messageId() to the list of attachments for that message, not null
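A hedged sketch of restoring history, assuming the provider was created with the ChatModel constructor. The loadPersistedMessages() helper is hypothetical and stands in for whatever persistence mechanism your application uses:

```java
import java.util.List;
import java.util.Map;

// loadPersistedMessages() is a hypothetical application-level helper.
List<ChatMessage> history = loadPersistedMessages();

// No attachments in this example; pass an empty map rather than null.
Map<String, List<AIAttachment>> attachments = Map.of();

// Clears any existing memory, then replays the given history.
// Must not be called while a streaming response is in progress.
provider.setHistory(history, attachments);
```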