Interface LLMProvider

All Superinterfaces:
Serializable

public interface LLMProvider extends Serializable
Framework-agnostic interface for Large Language Model (LLM) providers. This interface enables AI-powered components to communicate with LLMs without being tied to a specific implementation. Implementations are responsible for managing conversation memory, handling streaming responses, processing vendor-specific tool annotations, and handling file attachments.
Author:
Vaadin Ltd.
  • Nested Class Summary

    Nested Classes:

    static interface
        Represents a file attachment that can be sent to the LLM for analysis.

    static interface LLMProvider.LLMRequest
        Represents a request to the LLM containing all necessary context, configuration, and tools.
  • Method Summary

    reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
        Streams a response from the LLM based on the provided request.
  • Method Details

    • stream

      reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request)
      Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
      Parameters:
      request - the LLM request containing user message, system prompt, attachments, and tools, not null
      Returns:
      a Flux stream that emits response tokens as strings, never null
      Throws:
      NullPointerException - if request is null
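
      The contract above can be sketched with a toy implementation. This is an illustrative sketch, not the actual Vaadin API: the LLMRequest here is a simplified stand-in carrying only the user message (the real request also holds the system prompt, attachments, and tools), and EchoProvider is a hypothetical provider that streams the user message back token by token to show the shape of the reactive contract.

      ```java
      import java.io.Serializable;
      import java.util.Objects;
      import reactor.core.publisher.Flux;

      // Simplified stand-in for the documented interface; the nested
      // LLMRequest is reduced to just the user message for illustration.
      interface LLMProvider extends Serializable {
          Flux<String> stream(LLMRequest request);

          record LLMRequest(String userMessage) implements Serializable { }
      }

      // Hypothetical provider that "streams" the user message back,
      // one whitespace-delimited token at a time, mirroring how a real
      // provider would emit response tokens as they arrive from the LLM.
      class EchoProvider implements LLMProvider {
          @Override
          public Flux<String> stream(LLMRequest request) {
              // The interface contract requires a NullPointerException
              // for a null request.
              Objects.requireNonNull(request, "request must not be null");
              // Split while keeping the trailing spaces, so concatenating
              // the emitted tokens reproduces the original text.
              return Flux.fromArray(request.userMessage().split("(?<= )"));
          }
      }

      class Demo {
          public static void main(String[] args) {
              LLMProvider provider = new EchoProvider();
              // Subscribe and print each token as it is emitted.
              provider.stream(new LLMProvider.LLMRequest("Hello world"))
                      .subscribe(System.out::print);
          }
      }
      ```

      A real implementation would forward the request to an LLM client library and adapt its streaming callback or publisher into the returned Flux, while also appending the exchange to the conversation memory it manages internally.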