Class SpringAiLLMProvider
java.lang.Object
com.vaadin.flow.component.ai.provider.SpringAiLLMProvider
- All Implemented Interfaces:
LLMProvider
Spring AI implementation of LLMProvider.
Supports both streaming and non-streaming Spring AI models. Tool calling is supported through Spring AI's @Tool annotation.
Each provider instance maintains its own chat memory. To share conversation history across components, reuse the same provider instance.
Note: SpringAiLLMProvider is not serializable. If your application uses session persistence, you will need to create a new provider instance after session restore.
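The per-instance memory behavior described above can be sketched as follows. This is a minimal illustration, not part of the API on this page; the view class and the injected ChatModel bean are assumed to exist in the application.

```java
// Sketch: one provider instance per shared conversation.
// Assumes a Spring-managed ChatModel bean; class and field names
// here are illustrative.
import org.springframework.ai.chat.model.ChatModel;

public class AssistantViewSketch {

    private final SpringAiLLMProvider provider;

    public AssistantViewSketch(ChatModel chatModel) {
        // One provider instance = one chat memory. Passing this same
        // instance to several components gives them a shared history;
        // constructing a second provider would start a fresh conversation.
        this.provider = new SpringAiLLMProvider(chatModel);
    }
}
```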
- Author:
- Vaadin Ltd
-
Nested Class Summary
Nested classes/interfaces inherited from interface com.vaadin.flow.component.ai.provider.LLMProvider
LLMProvider.LLMRequest -
Constructor Summary
Constructors
- SpringAiLLMProvider(org.springframework.ai.chat.client.ChatClient chatClient): Constructor with a chat client.
- SpringAiLLMProvider(org.springframework.ai.chat.model.ChatModel chatModel): Constructor with a chat model.
-
Method Summary
Methods
- void setStreaming(boolean streaming): Sets whether to use streaming mode.
- reactor.core.publisher.Flux&lt;String&gt; stream(LLMProvider.LLMRequest request): Streams a response from the LLM based on the provided request.
-
Constructor Details
-
SpringAiLLMProvider
public SpringAiLLMProvider(org.springframework.ai.chat.model.ChatModel chatModel)
Constructor with a chat model.
- Parameters:
chatModel - the chat model, not null
- Throws:
NullPointerException - if chatModel is null
-
SpringAiLLMProvider
public SpringAiLLMProvider(org.springframework.ai.chat.client.ChatClient chatClient)
Constructor with a chat client. Note: when using this constructor, conversation memory must be configured externally in the ChatClient.
- Parameters:
chatClient - the chat client, not null
- Throws:
NullPointerException - if chatClient is null
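Since this constructor leaves memory management to the ChatClient, a typical setup attaches a chat-memory advisor before handing the client to the provider. The sketch below uses advisor and builder names from Spring AI 1.0 (MessageChatMemoryAdvisor, MessageWindowChatMemory); verify them against the Spring AI version on your classpath.

```java
// Sketch: configuring conversation memory externally in the ChatClient,
// as required when using the ChatClient constructor. Class names follow
// Spring AI 1.0 and may differ in other versions.
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.advisor.MessageChatMemoryAdvisor;
import org.springframework.ai.chat.memory.ChatMemory;
import org.springframework.ai.chat.memory.MessageWindowChatMemory;
import org.springframework.ai.chat.model.ChatModel;

public class ChatClientProviderSketch {

    public SpringAiLLMProvider createProvider(ChatModel chatModel) {
        // In-memory sliding-window history; the provider itself will not
        // manage history when constructed from a ChatClient.
        ChatMemory memory = MessageWindowChatMemory.builder().build();
        ChatClient chatClient = ChatClient.builder(chatModel)
                .defaultAdvisors(MessageChatMemoryAdvisor.builder(memory).build())
                .build();
        return new SpringAiLLMProvider(chatClient);
    }
}
```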
-
-
Method Details
-
stream
public reactor.core.publisher.Flux&lt;String&gt; stream(LLMProvider.LLMRequest request)
Description copied from interface: LLMProvider
Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
- Specified by:
stream in interface LLMProvider
- Parameters:
request - the LLM request containing the user message, system prompt, attachments, and tools, not null
- Returns:
a Flux that emits response tokens as strings, never null
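Consuming the returned Flux might look like the sketch below. How to construct an LLMProvider.LLMRequest is not documented on this page, so the `request` parameter is assumed to be obtained elsewhere; the subscriber callbacks are illustrative.

```java
// Sketch: subscribing to the token stream returned by stream().
// The LLMRequest instance is assumed to exist; its construction is
// not covered by this page.
import reactor.core.publisher.Flux;

public class StreamConsumerSketch {

    public void printResponse(SpringAiLLMProvider provider,
            LLMProvider.LLMRequest request) {
        Flux<String> tokens = provider.stream(request);
        // Tokens arrive incrementally; UI code would typically append
        // each token to a message component (e.g. inside ui.access(...)).
        tokens.subscribe(
                System.out::print,
                error -> System.err.println("LLM call failed: " + error));
    }
}
```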
-
setStreaming
public void setStreaming(boolean streaming)
Sets whether to use streaming mode. The default is true.
- Parameters:
streaming - true to use streaming mode, false for non-streaming
-