Interface LLMProvider
- All Superinterfaces:
Serializable
Framework-agnostic interface for Large Language Model (LLM) providers. This
interface enables AI-powered components to communicate with LLMs without
being tied to a specific implementation. Implementations are responsible for
managing conversation memory, handling streaming responses, processing
vendor-specific tool annotations, and handling file attachments.
- Author:
- Vaadin Ltd.
Nested Class Summary
Nested Classes:
- static interface — Represents a file attachment that can be sent to the LLM for analysis.
- static interface LLMProvider.LLMRequest — Represents a request to the LLM containing all necessary context, configuration, and tools.
Method Summary
- reactor.core.publisher.Flux<String> stream(LLMProvider.LLMRequest request) — Streams a response from the LLM based on the provided request.
Method Details
stream
Streams a response from the LLM based on the provided request. This method returns a reactive stream that emits response tokens as they become available from the LLM. The provider manages conversation history internally, so each call to this method adds to the ongoing conversation context.
- Parameters:
request - the LLM request containing the user message, system prompt, attachments, and tools; not null
- Returns:
a Flux stream that emits response tokens as strings, never null
- Throws:
NullPointerException - if request is null
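The contract above can be sketched with a minimal, hypothetical implementation. The EchoLLMProvider class and its token-splitting logic are illustrative assumptions, not part of the documented API; java.util.stream.Stream and a plain String argument are used as stdlib stand-ins for reactor.core.publisher.Flux and LLMProvider.LLMRequest so the example is self-contained.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Stream;

// Hypothetical sketch of an LLMProvider-style implementation. The real
// interface returns reactor.core.publisher.Flux<String> and accepts an
// LLMProvider.LLMRequest; Stream<String> and String stand in here.
class EchoLLMProvider {

    // Conversation memory managed internally, as the contract requires:
    // each call to stream(...) adds to the ongoing conversation context.
    private final List<String> history = new ArrayList<>();

    public Stream<String> stream(String request) {
        // Satisfies the documented NullPointerException contract.
        Objects.requireNonNull(request, "request must not be null");
        history.add(request);
        // Emit the response token by token, mimicking incremental
        // delivery of tokens as they become available from the LLM.
        String response = "Echo: " + request;
        return Arrays.stream(response.split(" "));
    }

    public int conversationLength() {
        return history.size();
    }
}
```

A real implementation would instead bridge the vendor SDK's streaming callback into a Flux (for example via Flux.create or Flux.fromIterable) and forward the request's system prompt, attachments, and tools to the vendor API.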