Class SpringAILLMProvider
- All Implemented Interfaces:
LLMProvider
Supports both streaming and non-streaming Spring AI models. Tool calling is
supported through Spring AI's @Tool annotation.
Streaming vs. non-streaming: Streaming is enabled by default. To
disable it, call setStreaming(false).
Streaming mode pushes partial responses to the UI as they arrive, which
requires server push to be enabled. Annotate your UI class or application
shell with @Push, or configure push programmatically, before using
streaming mode. A warning is logged at runtime if push is not enabled.
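As a minimal sketch of the push requirement: in a typical Vaadin application, @Push is placed on the application shell class (the class name AppShell below is illustrative, not part of this API):

```java
import com.vaadin.flow.component.page.Push;
import com.vaadin.flow.server.AppShellConfigurator;

// Enables server push for the whole application, which streaming
// mode needs in order to deliver partial responses to the browser.
@Push
public class AppShell implements AppShellConfigurator {
}
```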
Each provider instance maintains its own chat memory. To share conversation
history across components, reuse the same provider instance. History
restoration (via setHistory(List, Map)) is only supported when using
the SpringAILLMProvider(ChatModel) constructor; the
SpringAILLMProvider(ChatClient) constructor does not provide access
to the internal chat memory.
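A sketch of the distinction between the two constructors; the chatModel, chatClient, history, and attachmentsByMessageId variables are placeholders for values obtained elsewhere:

```java
// ChatModel constructor: the provider manages chat memory itself,
// so setHistory(...) can restore a previous conversation.
SpringAILLMProvider provider = new SpringAILLMProvider(chatModel);
provider.setHistory(history, attachmentsByMessageId);

// ChatClient constructor: memory must be configured on the ChatClient
// itself (for example with a memory advisor). setHistory is not
// supported here, because the provider cannot reach the client's
// internal memory.
SpringAILLMProvider clientBacked = new SpringAILLMProvider(chatClient);
```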
Note: SpringAILLMProvider is not serializable. If your application uses session persistence, you will need to create a new provider instance after session restore.
- Author:
- Vaadin Ltd
-
Nested Class Summary
Nested classes/interfaces inherited from interface com.vaadin.flow.component.ai.provider.LLMProvider:
LLMProvider.LLMRequest
-
Constructor Summary
SpringAILLMProvider(org.springframework.ai.chat.client.ChatClient chatClient)
    Constructor with a chat client.
SpringAILLMProvider(org.springframework.ai.chat.model.ChatModel chatModel)
    Constructor with a chat model.
-
Method Summary
void setHistory(List<ChatMessage> history, Map<String, List<AIAttachment>> attachmentsByMessageId)
    Restores the provider's conversation memory from a list of chat messages with their associated attachments.
void setStreaming(boolean streaming)
    Sets whether to use streaming mode.
reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
    Streams a response from the LLM based on the provided request.
-
Constructor Details
-
SpringAILLMProvider
public SpringAILLMProvider(org.springframework.ai.chat.model.ChatModel chatModel)
Constructor with a chat model.
- Parameters:
chatModel - the chat model, not null
- Throws:
NullPointerException - if chatModel is null
-
SpringAILLMProvider
public SpringAILLMProvider(org.springframework.ai.chat.client.ChatClient chatClient)
Constructor with a chat client. Note: When using this constructor, conversation memory must be configured externally in the ChatClient.
- Parameters:
chatClient - the chat client, not null
- Throws:
NullPointerException - if chatClient is null
-
-
Method Details
-
stream
public reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
Description copied from interface: LLMProvider
Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
- Specified by:
stream in interface LLMProvider
- Parameters:
request - the LLM request containing the user message, system prompt, attachments, and tools, not null
- Returns:
a Flux stream that emits response tokens as strings, never null
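As a hedged usage sketch: since stream(...) returns a Reactor Flux<String>, a caller typically subscribes and accumulates each emitted token. The provider and request variables stand in for instances created elsewhere, and the logging call is illustrative:

```java
StringBuilder response = new StringBuilder();
provider.stream(request)
        // Append each token to the partial response as it arrives.
        .doOnNext(response::append)
        .doOnError(e -> System.err.println("LLM call failed: " + e))
        // blockLast() is for illustration only; UI code should stay
        // non-blocking and update the view from the subscription.
        .blockLast();
System.out.println(response);
```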
-
setStreaming
public void setStreaming(boolean streaming)
Sets whether to use streaming mode. The default is true.
- Parameters:
streaming - true to use streaming mode, false for non-streaming
-
setHistory
public void setHistory(List<ChatMessage> history, Map<String, List<AIAttachment>> attachmentsByMessageId)
Description copied from interface: LLMProvider
Restores the provider's conversation memory from a list of chat messages with their associated attachments. Any existing memory is cleared before the new history is applied. Providers that support setting chat history should override this method.
This method must not be called while a streaming response is in progress.
- Specified by:
setHistory in interface LLMProvider
- Parameters:
history - the list of chat messages to restore, not null
attachmentsByMessageId - a map from ChatMessage.messageId() to the list of attachments for that message, not null
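A minimal sketch of building the arguments, assuming a List<ChatMessage> named messages has been loaded from the application's own persistence layer; loadAttachments is a hypothetical helper, not part of this API:

```java
// Attachments are keyed by each message's ChatMessage.messageId().
Map<String, List<AIAttachment>> attachmentsByMessageId = new HashMap<>();
for (ChatMessage message : messages) {
    attachmentsByMessageId.put(message.messageId(),
            loadAttachments(message.messageId())); // hypothetical helper
}
// Clears any existing memory and applies the restored history.
provider.setHistory(messages, attachmentsByMessageId);
```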
-