Class LangChain4JLLMProvider
- All Implemented Interfaces:
LLMProvider
An LLMProvider implementation backed by LangChain4j chat models.
Supports both streaming and non-streaming LangChain4j models. Tool calling is
supported through LangChain4j's @Tool annotation.
Streaming vs. non-streaming: The mode is determined by the constructor
used. Pass a StreamingChatModel to
LangChain4JLLMProvider(StreamingChatModel) for streaming, or a
ChatModel to LangChain4JLLMProvider(ChatModel) for
non-streaming. Streaming mode pushes partial responses to the UI as they
arrive, which requires server push to be enabled. Annotate your UI class or
application shell with @Push, or configure push programmatically,
before using a streaming model. A warning is logged at runtime if push is not
enabled.
Each provider instance maintains its own chat memory. To share conversation history across components, reuse the same provider instance.
Note: LangChain4JLLMProvider is not serializable. If your application uses session persistence, you will need to create a new provider instance after session restore.
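The two constructors above can be used as in the following sketch. The OpenAI model builders shown are one possible LangChain4j backend and are an assumption for illustration; any ChatModel or StreamingChatModel implementation works the same way. (This snippet depends on the Vaadin AI and LangChain4j artifacts being on the classpath, so it is not runnable on its own.)

```java
import com.vaadin.flow.component.page.Push;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Non-streaming mode: the full response is delivered at once.
var blockingProvider = new LangChain4JLLMProvider(
        OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .build());

// Streaming mode: partial responses are pushed to the UI as they arrive.
// Requires server push, e.g. @Push on the application shell class.
var streamingProvider = new LangChain4JLLMProvider(
        OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .build());
```

Because each provider instance keeps its own chat memory, components that should share a conversation must be given the same instance rather than each constructing their own.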
- Author:
- Vaadin Ltd
-
Nested Class Summary
Nested classes/interfaces inherited from interface com.vaadin.flow.component.ai.provider.LLMProvider:
LLMProvider.LLMRequest
Constructor Summary
- LangChain4JLLMProvider(dev.langchain4j.model.chat.ChatModel chatModel)
Constructor with a non-streaming chat model.
- LangChain4JLLMProvider(dev.langchain4j.model.chat.StreamingChatModel chatModel)
Constructor with a streaming chat model.
Method Summary
- void setHistory(List&lt;ChatMessage&gt; history, Map&lt;String, List&lt;AIAttachment&gt;&gt; attachmentsByMessageId)
Restores the provider's conversation memory from a list of chat messages with their associated attachments.
- reactor.core.publisher.Flux&lt;String&gt; stream(LLMProvider.LLMRequest request)
Streams a response from the LLM based on the provided request.
-
Constructor Details
-
LangChain4JLLMProvider
public LangChain4JLLMProvider(dev.langchain4j.model.chat.StreamingChatModel chatModel)
Constructor with a streaming chat model.
- Parameters:
chatModel - the streaming chat model, not null
- Throws:
NullPointerException - if chatModel is null
-
LangChain4JLLMProvider
public LangChain4JLLMProvider(dev.langchain4j.model.chat.ChatModel chatModel)
Constructor with a non-streaming chat model.
- Parameters:
chatModel - the non-streaming chat model, not null
- Throws:
NullPointerException - if chatModel is null
-
-
Method Details
-
stream
public reactor.core.publisher.Flux&lt;String&gt; stream(LLMProvider.LLMRequest request)
Description copied from interface: LLMProvider
Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
- Specified by:
stream in interface LLMProvider
- Parameters:
request - the LLM request containing the user message, system prompt, attachments, and tools, not null
- Returns:
a Flux stream that emits response tokens as strings, never null
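Consuming the returned Flux might look like the following sketch. How the LLMProvider.LLMRequest is obtained is not shown here; in typical use it is created by the surrounding Vaadin AI component rather than by application code, so the request variable below is an assumption.

```java
import reactor.core.publisher.Flux;

// 'provider' is a LangChain4JLLMProvider; 'request' is an
// LLMProvider.LLMRequest supplied by the calling component (assumed).
Flux<String> tokens = provider.stream(request);

tokens.subscribe(
        token -> System.out.print(token),      // emitted as each partial token arrives
        error -> error.printStackTrace(),      // the stream terminated with an error
        () -> System.out.println("\n[done]")); // the stream completed normally
```

Note that each call adds to the provider's internal conversation memory, so two successive calls form one ongoing conversation rather than two independent ones.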
-
setHistory
public void setHistory(List&lt;ChatMessage&gt; history, Map&lt;String, List&lt;AIAttachment&gt;&gt; attachmentsByMessageId)
Description copied from interface: LLMProvider
Restores the provider's conversation memory from a list of chat messages with their associated attachments. Any existing memory is cleared before the new history is applied. Providers that support setting chat history should override this method.
This method must not be called while a streaming response is in progress.
- Specified by:
setHistory in interface LLMProvider
- Parameters:
history - the list of chat messages to restore, not null
attachmentsByMessageId - a map from ChatMessage.messageId() to the list of attachments for that message, not null
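A typical use of setHistory is rebuilding a provider after session restore, since the class is not serializable. The sketch below assumes the application has persisted the message list itself (savedMessages); it passes an empty attachment map for messages without attachments.

```java
import java.util.List;
import java.util.Map;

// After session restore: the old provider instance is gone, so create a
// new one around the same (or a freshly built) chat model ...
var provider = new LangChain4JLLMProvider(chatModel);

// ... and replay the previously persisted conversation. 'savedMessages'
// (List<ChatMessage>) is assumed to have been stored by the application.
// Any memory the new provider had is cleared before this history is applied.
provider.setHistory(savedMessages, Map.of());
```

Do not call this while a streaming response is still in flight; restore the history before issuing the next stream(...) call.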
-