Sessions
A Session is a Callable used for AI integration: it allows Programs to be executed on AI models. A Session should be thought of as a transient object that represents a single user's session interacting with an AI. Sessions are traditionally written as state machines.
Creating a Session
To create a Session, you write a POJO in service code that extends Session. The base type is imported from the provided roli-runtime library.
Constructor Signature
Sessions require a constructor with a single string argument, traditionally named sessionId, that must be passed to super.
SessionId/PrimaryKey
The sessionId passed to the constructor is an opaque string that, together with the type information, uniquely identifies the Session instance at runtime. SessionIds are unique per Session subtype.
After instantiation, the sessionId value can be retrieved from the primaryKey getter on the Session.
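The shape described above can be sketched as follows. Note that the Session base class here is a minimal local stand-in for the type imported from roli-runtime, and the ChatSession subtype is purely illustrative, so the sketch is self-contained.

```typescript
// Minimal local stand-in for the roli-runtime Session base type.
class Session {
  constructor(private readonly _sessionId: string) {}

  // The sessionId supplied at construction is exposed via primaryKey.
  get primaryKey(): string {
    return this._sessionId;
  }
}

// A hypothetical Session subtype with the required constructor:
// a single string argument, forwarded to super.
class ChatSession extends Session {
  constructor(sessionId: string) {
    super(sessionId);
  }
}

const s = new ChatSession("user-42");
console.log(s.primaryKey); // the sessionId supplied at construction
```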
Using a Session from Client Code
Sessions cannot be instantiated by client code directly. Clients must call an Endpoint method that returns a Session instance.
Use the createSession method in service code to create a new Session instance.
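A sketch of the pattern, assuming an Endpoint class with a method that hands a Session back to the client; the createSession signature, the Endpoint shape, and all names here are assumptions for illustration and stand in for the real roli-runtime API.

```typescript
// Local stand-ins so the sketch is self-contained.
class Session {
  constructor(private readonly _sessionId: string) {}
  get primaryKey(): string {
    return this._sessionId;
  }
}
class ChatSession extends Session {}

// Hypothetical stand-in for the service-side createSession factory.
function createSession<T extends Session>(
  ctor: new (sessionId: string) => T,
  sessionId: string
): T {
  return new ctor(sessionId);
}

// Clients cannot construct Sessions directly; they call an Endpoint
// method like this one, which returns a Session instance.
class ChatEndpoint {
  openChatSession(userId: string): ChatSession {
    return createSession(ChatSession, `chat-${userId}`);
  }
}

const session = new ChatEndpoint().openChatSession("alice");
```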
Calling Models from Sessions
ModelSpecification
A ModelSpecification is configuration information describing how to call a specific Large Language Model API. ModelSpecifications are registered with the Model Registry under a string key that can later be passed to the getModel(key: string) function to retrieve the ModelSpecification at runtime. You can redefine a ModelSpecification at any time by updating its registration, thereby changing what the getModel function returns.
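The registry behavior can be sketched like this; the ModelSpecification fields, the registerModel helper, and the provider/model names are assumptions for illustration, not the actual roli-runtime definitions.

```typescript
// Illustrative ModelSpecification shape; the real fields may differ.
interface ModelSpecification {
  provider: string;
  model: string;
  apiKeyEnvVar: string;
}

// A simple in-memory Model Registry keyed by string.
const registry = new Map<string, ModelSpecification>();

function registerModel(key: string, spec: ModelSpecification): void {
  registry.set(key, spec);
}

function getModel(key: string): ModelSpecification {
  const spec = registry.get(key);
  if (!spec) {
    throw new Error(`No ModelSpecification registered for "${key}"`);
  }
  return spec;
}

registerModel("summarizer", {
  provider: "example-provider-a",
  model: "model-a",
  apiKeyEnvVar: "PROVIDER_A_KEY",
});

// Re-registering the same key changes what getModel returns from then on.
registerModel("summarizer", {
  provider: "example-provider-b",
  model: "model-b",
  apiKeyEnvVar: "PROVIDER_B_KEY",
});
```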
Program
Sessions call models by executing Programs. A Program is an object that contains both a ModelSpecification that describes how to talk to the LLM and a list of Steps that will be executed synchronously.
Steps
Each Step is either an Instruction or a Prompt. The list of steps passed to a Program must contain at least one Prompt, and the last step must be a Prompt.
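The two structural rules can be sketched as a validation pass; the type shapes below are assumptions based on the property lists later in this section, not the actual roli-runtime definitions.

```typescript
// Illustrative Step union: a Prompt carries an .assistant callback,
// an Instruction carries an .assistant string.
type Prompt = {
  system: string;
  user: string;
  assistant: (output: string) => string;
};
type Instruction = {
  system: string;
  user: string;
  assistant: string;
};
type Step = Prompt | Instruction;

function isPrompt(step: Step): step is Prompt {
  return typeof step.assistant === "function";
}

// Enforces both rules: at least one Prompt, and the last step is a Prompt.
function validateSteps(steps: Step[]): void {
  if (!steps.some(isPrompt)) {
    throw new Error("A Program's steps must contain at least one Prompt");
  }
  if (!isPrompt(steps[steps.length - 1])) {
    throw new Error("A Program's last step must be a Prompt");
  }
}
```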
Prompts
Prompts are objects that prompt a model for a response and allow developers to handle the output.
Prompts contain three properties:
- .system - A string that indicates how the LLM should act.
- .user - A string that is typically used to supply a user-defined query.
- .assistant - A function callback that will be called by the Program when the model returns its output.
After a Program executes a Prompt, the model's response is passed to the .assistant callback function. The .assistant callback must return a string. The Prompt object is then turned into an Instruction by replacing its .assistant property with the string returned by the callback.
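That conversion can be sketched as follows; the type shapes and the toInstruction helper are illustrative assumptions, not the actual roli-runtime mechanism.

```typescript
// Illustrative shapes: a Prompt has a callback, an Instruction a string.
type Prompt = { system: string; user: string; assistant: (output: string) => string };
type Instruction = { system: string; user: string; assistant: string };

// The model's output is passed to the .assistant callback, and the
// string it returns replaces the callback, yielding an Instruction.
function toInstruction(prompt: Prompt, modelOutput: string): Instruction {
  return {
    system: prompt.system,
    user: prompt.user,
    assistant: prompt.assistant(modelOutput),
  };
}

const prompt: Prompt = {
  system: "You are a terse assistant.",
  user: "Say hello.",
  assistant: (output) => output.trim(), // developer handles the raw output
};
const instruction = toInstruction(prompt, "  Hello!  ");
```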
Instructions
Instructions are Prompts that have already been run. They demonstrate to the LLM how prompts should be answered.
Instructions contain three properties:
- .system - A string that indicates how the LLM should act.
- .user - A string that is typically used to supply a user-defined query.
- .assistant - A string that is a response the LLM might have given, considering the .system and .user messages as well as all previous Steps.
Putting it all together
Session instances run Programs using the this.execute method, passing the ModelSpecification to call and one or more Steps.
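The pieces above fit together roughly as in this end-to-end sketch. The execute signature, the Step shapes, and the GreeterSession subtype are all assumptions standing in for the real roli-runtime API, and callModel fakes the LLM so the sketch runs offline.

```typescript
// Illustrative shapes from the sections above.
interface ModelSpecification {
  model: string;
}
type Prompt = { system: string; user: string; assistant: (output: string) => string };
type Instruction = { system: string; user: string; assistant: string };
type Step = Prompt | Instruction;

// Fake model call: echoes the user message instead of hitting an API.
function callModel(_spec: ModelSpecification, _prior: Instruction[], prompt: Prompt): string {
  return `echo: ${prompt.user}`;
}

class Session {
  constructor(private readonly _sessionId: string) {}
  get primaryKey(): string {
    return this._sessionId;
  }

  // Runs steps in order. Prompts are sent to the model, their .assistant
  // callbacks handle the output, and each Prompt is folded into an
  // Instruction so later steps see the earlier exchanges.
  protected execute(spec: ModelSpecification, ...steps: Step[]): string {
    const history: Instruction[] = [];
    let last = "";
    for (const step of steps) {
      if (typeof step.assistant === "function") {
        const p = step as Prompt;
        const output = callModel(spec, history, p);
        last = p.assistant(output);
        history.push({ system: p.system, user: p.user, assistant: last });
      } else {
        history.push(step as Instruction);
      }
    }
    return last;
  }
}

// A hypothetical Session subtype running a one-Prompt Program.
class GreeterSession extends Session {
  greet(name: string): string {
    return this.execute(
      { model: "example-model" },
      { system: "Greet the user.", user: name, assistant: (o) => o }
    );
  }
}

const result = new GreeterSession("s-1").greet("Ada");
```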