Class LangChain4JLLMProvider

java.lang.Object
com.vaadin.flow.component.ai.provider.LangChain4JLLMProvider
All Implemented Interfaces:
LLMProvider

public class LangChain4JLLMProvider extends Object implements LLMProvider
LangChain4j implementation of LLMProvider.

Supports both streaming and non-streaming LangChain4j models. Tool calling is supported through LangChain4j's @Tool annotation.

Streaming vs. non-streaming: The mode is determined by the constructor used. Pass a StreamingChatModel to LangChain4JLLMProvider(StreamingChatModel) for streaming, or a ChatModel to LangChain4JLLMProvider(ChatModel) for non-streaming. Streaming mode pushes partial responses to the UI as they arrive, which requires server push to be enabled. Annotate your UI class or application shell with @Push, or configure push programmatically, before using a streaming model. A warning is logged at runtime if push is not enabled.

Each provider instance maintains its own chat memory. To share conversation history across components, reuse the same provider instance.

Note: LangChain4JLLMProvider is not serializable. If your application uses session persistence, you will need to create a new provider instance after session restore.
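The constructor choice described above can be illustrated with a short sketch. The OpenAI model builders shown here come from LangChain4j's langchain4j-open-ai module; the model name and the environment variable used for the API key are illustrative, not prescribed by this class.

```java
import com.vaadin.flow.component.ai.provider.LangChain4JLLMProvider;
import com.vaadin.flow.component.page.AppShellConfigurator;
import com.vaadin.flow.component.page.Push;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Server push must be enabled before using a streaming model;
// one way is to annotate the application shell with @Push.
@Push
public class AppShell implements AppShellConfigurator {
}

class ProviderFactory {

    // Streaming mode: partial responses are pushed to the UI as they arrive.
    static LangChain4JLLMProvider streamingProvider() {
        return new LangChain4JLLMProvider(
                OpenAiStreamingChatModel.builder()
                        .apiKey(System.getenv("OPENAI_API_KEY"))
                        .modelName("gpt-4o-mini") // illustrative model name
                        .build());
    }

    // Non-streaming mode: the complete response is delivered in one piece.
    static LangChain4JLLMProvider blockingProvider() {
        return new LangChain4JLLMProvider(
                OpenAiChatModel.builder()
                        .apiKey(System.getenv("OPENAI_API_KEY"))
                        .modelName("gpt-4o-mini")
                        .build());
    }
}
```

Because each provider keeps its own chat memory, components that should share one conversation must be handed the same provider instance rather than each calling a factory method like the ones above.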

Author:
Vaadin Ltd
  • Constructor Details

    • LangChain4JLLMProvider

      public LangChain4JLLMProvider(dev.langchain4j.model.chat.StreamingChatModel chatModel)
      Constructor with a streaming chat model.
      Parameters:
      chatModel - the streaming chat model, not null
      Throws:
      NullPointerException - if chatModel is null
    • LangChain4JLLMProvider

      public LangChain4JLLMProvider(dev.langchain4j.model.chat.ChatModel chatModel)
      Constructor with a non-streaming chat model.
      Parameters:
      chatModel - the non-streaming chat model, not null
      Throws:
      NullPointerException - if chatModel is null
  • Method Details

    • stream

      public reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
      Description copied from interface: LLMProvider
      Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
      Specified by:
      stream in interface LLMProvider
      Parameters:
      request - the LLM request containing user message, system prompt, attachments, and tools, not null
      Returns:
      a Flux stream that emits response tokens as strings, never null
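A typical call subscribes to the returned Flux with Project Reactor's standard operators. How an LLMProvider.LLMRequest is obtained depends on the surrounding Vaadin AI API and is not shown by this page, so the request construction below is left as a placeholder; only the Flux handling is concrete.

```java
// Build the request via the Vaadin AI API of your version;
// this placeholder stands in for that step.
LLMProvider.LLMRequest request = /* user message, system prompt,
        attachments and tools */ null;

StringBuilder answer = new StringBuilder();
provider.stream(request)
        .doOnNext(answer::append)                       // accumulate tokens
        .doOnError(e -> System.err.println("LLM error: " + e.getMessage()))
        .doOnComplete(() -> System.out.println(answer)) // full response
        .subscribe();
```

Note that each call adds to the provider's internal conversation history, so issuing the same request twice produces two turns of conversation, not an idempotent retry.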
    • setHistory

      public void setHistory(List<ChatMessage> history, Map<String,List<AIAttachment>> attachmentsByMessageId)
      Description copied from interface: LLMProvider
      Restores the provider's conversation memory from a list of chat messages with their associated attachments. Any existing memory is cleared before the new history is applied.

      Providers that support setting chat history should override this method.

      This method must not be called while a streaming response is in progress.

      Specified by:
      setHistory in interface LLMProvider
      Parameters:
      history - the list of chat messages to restore, not null
      attachmentsByMessageId - a map from ChatMessage.messageId() to the list of attachments for that message, not null
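A restore might look like the following sketch. ChatMessage and AIAttachment are the Vaadin AI types referenced above; the persistence helpers (loadMessages, loadAttachments) and the conversationId variable are hypothetical stand-ins for your own storage layer.

```java
// Hypothetical storage-layer calls: replace with your persistence code.
List<ChatMessage> history = loadMessages(conversationId);
Map<String, List<AIAttachment>> attachments = loadAttachments(conversationId);

// Clears any existing memory, then applies the restored history.
// Must not be called while a streaming response is in progress.
provider.setHistory(history, attachments);
```

Combined with the serialization note above, this is the intended recovery path after a session restore: create a fresh LangChain4JLLMProvider and replay the persisted conversation into it with setHistory.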