# Background Tasks

Use `waitUntil` to run work after responding to the client.

Run tasks after sending a response using `waitUntil`. This keeps response times fast while handling analytics, notifications, or other fire-and-forget work.
## The Pattern

`waitUntil` accepts an async function that runs after the response is sent. Multiple calls run concurrently.
```typescript
import { createAgent } from '@agentuity/runtime';
import { z } from 'zod';

const agent = createAgent('OrderProcessor', {
  schema: {
    input: z.object({
      orderId: z.string(),
      userId: z.string(),
    }),
    output: z.object({
      status: z.string(),
      orderId: z.string(),
    }),
  },
  handler: async (ctx, input) => {
    const { orderId, userId } = input;

    // Process the order synchronously
    const order = await processOrder(orderId);

    // Background: send confirmation email
    ctx.waitUntil(async () => {
      await sendConfirmationEmail(userId, order);
      ctx.logger.info('Confirmation email sent', { orderId });
    });

    // Background: update analytics
    ctx.waitUntil(async () => {
      await trackPurchase(userId, order);
    });

    // Background: notify warehouse
    ctx.waitUntil(async () => {
      await notifyWarehouse(order);
    });

    // Response is sent immediately; background tasks continue
    return {
      status: 'confirmed',
      orderId,
    };
  },
});

export default agent;
```

## With Durable Streams
Create a stream for the client to poll, then populate it in the background:
```typescript
import { createAgent } from '@agentuity/runtime';
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const agent = createAgent('AsyncGenerator', {
  schema: {
    input: z.object({ prompt: z.string() }),
    output: z.object({
      streamId: z.string(),
      streamUrl: z.string(),
    }),
  },
  handler: async (ctx, input) => {
    // Create a durable stream the client can read from
    const stream = await ctx.stream.create('generation', {
      contentType: 'text/plain',
      metadata: { sessionId: ctx.sessionId },
    });

    // Generate content in the background
    ctx.waitUntil(async () => {
      try {
        const { textStream } = streamText({
          model: openai('gpt-5-mini'),
          prompt: input.prompt,
        });
        for await (const chunk of textStream) {
          await stream.write(chunk);
        }
      } finally {
        // Close the stream even if generation throws
        await stream.close();
      }
    });

    // Return the stream URL immediately
    return {
      streamId: stream.id,
      streamUrl: stream.url,
    };
  },
});

export default agent;
```

## Progress Reporting
Write progress updates to a stream as background work proceeds:
```typescript
const agent = createAgent('BatchProcessor', {
  schema: {
    input: z.object({ items: z.array(z.string()) }),
    output: z.object({ progressUrl: z.string() }),
  },
  handler: async (ctx, input) => {
    const progress = await ctx.stream.create('progress', {
      contentType: 'application/x-ndjson',
    });

    ctx.waitUntil(async () => {
      try {
        for (let i = 0; i < input.items.length; i++) {
          await processItem(input.items[i]);
          await progress.write(JSON.stringify({
            completed: i + 1,
            total: input.items.length,
            percent: Math.round(((i + 1) / input.items.length) * 100),
          }) + '\n');
        }
        await progress.write(JSON.stringify({ done: true }) + '\n');
      } finally {
        await progress.close();
      }
    });

    return { progressUrl: progress.url };
  },
});
```

## Key Points
- **Non-blocking** — Response returns immediately; tasks run after it is sent
- **Concurrent** — Multiple `waitUntil` calls run in parallel
- **Error isolation** — Background task failures don't affect the response
- **Always close streams** — Use `finally` blocks to ensure cleanup
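The concurrency and error-isolation behavior can be modeled with plain promises. The sketch below is illustrative only, not the actual runtime implementation: each task starts as soon as it is registered, and a rejected task is caught so it cannot affect the other tasks or the response path.

```typescript
// Illustrative model of waitUntil semantics (not the real runtime):
// tasks start immediately, run concurrently, and failures are isolated.
type Task = () => Promise<void>;

class BackgroundTasks {
  private pending: Promise<void>[] = [];

  waitUntil(task: Task): void {
    // Catch the rejection so one failing task cannot
    // crash the process or the response.
    this.pending.push(
      task().catch((err) => {
        console.error('background task failed:', err);
      }),
    );
  }

  // The runtime waits for all tasks before tearing down the context.
  drain(): Promise<void> {
    return Promise.all(this.pending).then(() => undefined);
  }
}

async function main() {
  const ctx = new BackgroundTasks();
  const results: string[] = [];

  ctx.waitUntil(async () => {
    results.push('email sent');
  });
  ctx.waitUntil(async () => {
    throw new Error('analytics unavailable'); // isolated failure
  });
  ctx.waitUntil(async () => {
    results.push('warehouse notified');
  });

  await ctx.drain();
  console.log(results.length); // 2 — the failing task did not stop the others
}

main();
```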
## See Also

- Durable Streams for stream creation and management
- Webhook Handler for another `waitUntil` example
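As a usage sketch, the NDJSON stream produced by the `BatchProcessor` example above can be consumed incrementally on the client. Network chunks may split or join lines arbitrarily, so the parser buffers until each newline-terminated JSON record is complete; `followProgress` and the event shape are assumptions for illustration, and `fetch`/`ReadableStream`/`TextDecoder` are standard web APIs.

```typescript
// Client-side sketch: parse an NDJSON progress stream incrementally.
type ProgressEvent =
  | { completed: number; total: number; percent: number }
  | { done: true };

function makeNdjsonParser(onEvent: (e: ProgressEvent) => void) {
  let buffer = '';
  return (chunk: string) => {
    buffer += chunk;
    let idx: number;
    // Emit every complete line; keep the trailing partial line buffered.
    while ((idx = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, idx).trim();
      buffer = buffer.slice(idx + 1);
      if (line) onEvent(JSON.parse(line) as ProgressEvent);
    }
  };
}

// progressUrl is whatever the BatchProcessor response returned.
async function followProgress(progressUrl: string) {
  const res = await fetch(progressUrl);
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  const feed = makeNdjsonParser((e) => console.log(e));
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    feed(decoder.decode(value, { stream: true }));
  }
}
```

The buffering matters: without it, a record split across two chunks would fail to parse.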