Liminal Messages (beta)
Effect AI Message Format
Liminal represents messages using Effect AI's provider-agnostic Message
schema.
```ts
import { Message } from "@effect/ai/AiInput"
```
L.messages
To access the messages of the thread, use L.messages.
```ts
import { Effect } from "effect"
import L from "liminal"

Effect.gen(function*() {
  const messages = yield* L.messages
})
```
Serde
We can use the Message schema to encode, persist and decode messages. This encoding/decoding handles various message part types, including images and file parts.
```ts
import { Message } from "@effect/ai/AiInput"
import { Effect, pipe, Schema } from "effect"
import L from "liminal"

const conversation = Effect.gen(function*() {
  yield* L.user`...`
  yield* L.assistant

  const messages = yield* L.messages

  // Encode the thread into a JSON string for persistence.
  const encoded = pipe(
    messages,
    Schema.encodeSync(Schema.Array(Message)),
    JSON.stringify,
  )
  encoded satisfies string

  // Decode the persisted JSON string back into messages.
  const decoded = pipe(
    encoded,
    JSON.parse,
    Schema.decodeUnknownSync(Schema.Array(Message)),
  )
  decoded satisfies ReadonlyArray<Message>
})
```
L.user
Append a user message to the thread.
```ts
import { Effect } from "effect"
import L from "liminal"

Effect.gen(function*() {
  // As a tagged template function call.
  yield* L.user`...`
  // As an ordinary function call.
  yield* L.user("...")
})
```
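As a tagged template, L.user can also interpolate values into the message text. A minimal sketch, assuming template substitutions are stringified into the resulting user message:
```ts
import { Effect } from "effect"
import L from "liminal"

Effect.gen(function*() {
  // Hypothetical value interpolated into the message text.
  const holiday = "Halloween"
  yield* L.user`Tell me three facts about ${holiday}.`
})
```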
L.userJson
Append a JSON-serializable value to the thread as a stringified message. Optionally provide a schema; its annotations are added as JSONC comments to the resulting JSON string in the new message.
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.userJson({
    outer: {
      inner: "value",
    },
  })
})
```
We can optionally pass a schema with description annotations, which are then emitted as JSONC comments alongside the corresponding values.
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

const ExampleSchema = Schema.Struct({
  inner: Schema.String.pipe(
    Schema.annotations({
      description: "Some description for the LLM.",
    }),
  ),
})

Effect.gen(function*() {
  yield* L.userJson({ inner: "value" }, ExampleSchema)
})
```
The resulting message looks as follows.
```jsonc
{
  // Some description for the LLM.
  inner: "value"
}
```
L.assistant
Infer a message from the model and append it to the thread.
```ts
import { Effect } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.user`...`
  const reply = yield* L.assistant
  reply satisfies string
})
```
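Because the reply is appended as well as returned, it shows up in subsequent L.messages reads. A minimal sketch, assuming each L.assistant call grows the thread by exactly one assistant message:
```ts
import { Effect } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.user`...`
  const before = (yield* L.messages).length

  const reply = yield* L.assistant
  reply satisfies string

  // Assumption: the reply is now the latest message of the thread.
  const after = (yield* L.messages).length
  console.log(after - before) // expected: 1
})
```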
L.assistantSchema
Use Effect Schema to describe structured output requirements.
Providing Schemas
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.user`Is Halloween the best holiday?`
  const result = yield* L.assistantSchema(Schema.Boolean)
  result satisfies boolean
})
```
We could of course also provide more complex structures, such as structs.
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.user`When is Halloween?`
  const result = yield* L.assistantSchema(
    Schema.Struct({
      month: Schema.Int,
      day: Schema.Int,
    }),
  )
  result satisfies {
    month: number
    day: number
  }
})
```
Providing Field Schemas
When providing structs inline, we can skip the outer Schema.Struct wrapper and pass the fields directly.
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.user`When is Halloween?`
  const result = yield* L.assistantSchema({
    month: Schema.Int,
    day: Schema.Int,
  })
  result satisfies { month: number; day: number }
})
```
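Field schemas can carry the same description annotations shown for L.userJson above. Whether these descriptions are forwarded to the model as part of the structured-output request is an assumption here, not something this page confirms.
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

Effect.gen(function*() {
  yield* L.user`When is Halloween?`
  const result = yield* L.assistantSchema({
    // Assumption: these descriptions are surfaced to the model.
    month: Schema.Int.pipe(
      Schema.annotations({ description: "Month of the year, 1-12." }),
    ),
    day: Schema.Int.pipe(
      Schema.annotations({ description: "Day of the month, 1-31." }),
    ),
  })
  result satisfies { month: number; day: number }
})
```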
L.clear
Clear the thread.
```ts
import { Effect } from "effect"
import L from "liminal"

// `assertEquals` is assumed to come from your test library of choice.
declare const assertEquals: (actual: unknown, expected: unknown) => void

Effect.gen(function*() {
  // Initial messages.
  yield* L.user`A`
  yield* L.user`B`
  yield* L.user`C`

  // Clear the messages.
  yield* L.clear

  // The thread is now empty.
  const messages = yield* L.messages
  assertEquals(messages, [])
})
```
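One use for L.clear is reusing a single conversation program for independent questions, so earlier exchanges cannot influence later answers. A minimal sketch, assuming L.clear only resets the message list:
```ts
import { Effect, Schema } from "effect"
import L from "liminal"

const askYesNo = (question: string) =>
  Effect.gen(function*() {
    // Start from an empty thread so prior questions can't leak into this one.
    yield* L.clear
    yield* L.user(question)
    return yield* L.assistantSchema(Schema.Boolean)
  })
```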
L.append
Append raw Effect AI messages to the thread.
```ts
import { Message, TextPart, UserMessage } from "@effect/ai/AiInput"
import { Effect } from "effect"
import L from "liminal"

declare const messages: Array<Message>

const conversation = Effect.gen(function*() {
  // Append a list of messages.
  yield* L.append(...messages)

  // Or append a single message.
  yield* L.append(
    UserMessage.make({
      parts: [
        TextPart.make({ text: "..." }),
      ],
    }),
  )
})
```
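Combined with the Serde pattern above, L.append can rehydrate a thread from previously persisted messages. A minimal sketch, assuming `persisted` holds the JSON string produced by the encoding example earlier:
```ts
import { Message } from "@effect/ai/AiInput"
import { Effect, pipe, Schema } from "effect"
import L from "liminal"

// Hypothetical JSON payload produced by the Serde example above.
declare const persisted: string

Effect.gen(function*() {
  const restored = pipe(
    persisted,
    JSON.parse,
    Schema.decodeUnknownSync(Schema.Array(Message)),
  )
  // Rebuild the thread from the decoded messages, then continue it.
  yield* L.append(...restored)
  yield* L.user`Continue where we left off.`
})
```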