Class LangChain4JLLMProvider

java.lang.Object
com.vaadin.flow.component.ai.provider.LangChain4JLLMProvider
All Implemented Interfaces:
LLMProvider, Serializable

public class LangChain4JLLMProvider extends Object implements LLMProvider
LangChain4j implementation of LLMProvider.

Supports both streaming and non-streaming LangChain4j models. Tool calling is supported through LangChain4j's @Tool annotation.

Each provider instance maintains its own chat memory. To share conversation history across components, reuse the same provider instance.
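For instance, the following sketch reuses a single provider so two views share one conversation (the setProvider calls are hypothetical placeholders for whatever component API consumes an LLMProvider; the key point is passing the same instance):

```java
// One provider instance maintains one chat memory, so both views below
// contribute to and read from the same conversation context.
LangChain4JLLMProvider provider = new LangChain4JLLMProvider(chatModel);

chatView.setProvider(provider);     // hypothetical component API
followUpView.setProvider(provider); // shares the conversation history
```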

Author:
Vaadin Ltd
  • Constructor Details

    • LangChain4JLLMProvider

      public LangChain4JLLMProvider(dev.langchain4j.model.chat.StreamingChatModel chatModel)
Creates a provider backed by the given streaming chat model.
      Parameters:
      chatModel - the streaming chat model, not null
      Throws:
      NullPointerException - if chatModel is null
    • LangChain4JLLMProvider

      public LangChain4JLLMProvider(dev.langchain4j.model.chat.ChatModel chatModel)
Creates a provider backed by the given non-streaming chat model.
      Parameters:
      chatModel - the non-streaming chat model, not null
      Throws:
      NullPointerException - if chatModel is null
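Both constructors can be exercised with any LangChain4j model implementation; a minimal sketch using the OpenAI bindings (the model name and API-key handling are illustrative, not prescribed by this class):

```java
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Streaming variant: tokens are emitted as they arrive from the model.
StreamingChatModel streaming = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();
LangChain4JLLMProvider streamingProvider =
        new LangChain4JLLMProvider(streaming);

// Non-streaming variant: the model produces the full response in one step;
// the provider still exposes it through the same Flux-based stream API.
ChatModel blocking = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();
LangChain4JLLMProvider blockingProvider =
        new LangChain4JLLMProvider(blocking);
```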
  • Method Details

    • stream

      public reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
      Description copied from interface: LLMProvider
      Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
      Specified by:
      stream in interface LLMProvider
      Parameters:
      request - the LLM request containing user message, system prompt, attachments, and tools, not null
      Returns:
      a Flux stream that emits response tokens as strings, never null
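A hedged usage sketch, assuming an LLMProvider.LLMRequest instance `request` has already been obtained (its construction is not covered by this class's documentation), and using hypothetical UI callbacks to show where each Flux signal would typically go:

```java
import reactor.core.publisher.Flux;

Flux<String> tokens = provider.stream(request);

// Append each token as it arrives; the Flux completes when the model
// finishes the response, and signals an error if the call fails.
tokens.subscribe(
        token -> messageView.appendText(token),  // hypothetical UI call
        error -> messageView.showError(error),   // hypothetical error handling
        () -> messageView.markComplete());       // hypothetical completion hook
```

Because the provider manages conversation history internally, a subsequent `provider.stream(...)` call continues the same conversation rather than starting a new one.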