diff --git a/docs/ff-concepts/advanced/_category_.json b/docs/ff-concepts/advanced/_category_.json index ca70e70f..aab65992 100644 --- a/docs/ff-concepts/advanced/_category_.json +++ b/docs/ff-concepts/advanced/_category_.json @@ -1,4 +1,4 @@ { "label": "Advanced", - "position": 11 + "position": 12 } \ No newline at end of file diff --git a/docs/ff-concepts/agentic-ai/_category_.json b/docs/ff-concepts/agentic-ai/_category_.json new file mode 100644 index 00000000..ce91d2cf --- /dev/null +++ b/docs/ff-concepts/agentic-ai/_category_.json @@ -0,0 +1,4 @@ +{ + "label": "Agentic AI", + "position": 11 +} \ No newline at end of file diff --git a/docs/ff-concepts/agentic-ai/genui/_category_.json b/docs/ff-concepts/agentic-ai/genui/_category_.json new file mode 100644 index 00000000..22bb6cfd --- /dev/null +++ b/docs/ff-concepts/agentic-ai/genui/_category_.json @@ -0,0 +1,4 @@ +{ + "label": "GenUI Chat", + "position": 1 +} \ No newline at end of file diff --git a/docs/ff-concepts/agentic-ai/genui/app-event-integrations.md b/docs/ff-concepts/agentic-ai/genui/app-event-integrations.md new file mode 100644 index 00000000..e584d3dd --- /dev/null +++ b/docs/ff-concepts/agentic-ai/genui/app-event-integrations.md @@ -0,0 +1,83 @@ +--- +slug: /concepts/app-event-integration +title: App Event Integration +description: Feed local app events into GenUI so the conversation can react to live app state and time-sensitive signals. +tags: [AI, Chat, App Events] +sidebar_position: 4 +keywords: [FlutterFlow, App Events, GenUI, Conversational AI, Chat widget, AI agent, A2UI protocol, Component rendering, Tool calling] +--- + +# App Events Integrations + +App Event Integration lets GenUI listen to FlutterFlow **LOCAL** app events and turn them into conversation context. 
This is how GenUI becomes aware of things the user did not explicitly type:

- Cart changes
- Workflow completion
- Alerts
- Navigation context
- Device or sensor updates

GenUI automatically listens for matching local events and converts them into hidden context messages for the conversation.

## Two Integration Modes

- **Context Injection**: Use `auto_respond: false` when the event should enrich future replies without interrupting the user immediately. In this mode, the event message is added to a pending queue that is flushed before the next user message is sent, allowing the model to use the queued messages as hidden context during the next inference.

- **Proactive Response**: Use `auto_respond: true` when the event should trigger an immediate GenUI response. In this mode, the event message is sent directly into the conversation as an InternalMessage, inference starts right away, and the model may respond with text, UI, both, or nothing visible, depending on the prompt and context.

## Message Construction

You can either enter a custom message directly in the **Message Template** field or bind it to a variable for dynamic content.

If the event includes payload data, GenUI automatically appends it to the message. For example, if the **Message Template** is “Your order status is:” and the event fires with payload data such as `pending` or `in transit`, the resulting message reads “Your order status is pending.”

## Pending Context Queue

For `auto_respond: false`, GenUI stores pending event messages in memory until the user sends the next message. The queue has a maximum size of 50; if it overflows, the oldest messages are dropped first. Before the next user request is sent, these messages are injected directly into the conversation history as InternalMessages, allowing the model to use them as context without triggering additional model calls.
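The queue and flush behavior described above can be sketched in a few lines. This is an illustrative Python model of the semantics, with hypothetical names; it is not FlutterFlow's generated code.

```python
from collections import deque

class PendingContextQueue:
    """Illustrative model of GenUI's event handling described above.

    Hypothetical sketch, not FlutterFlow-generated code.
    """

    def __init__(self, max_size: int = 50):
        # deque(maxlen=...) drops the oldest entry when a new one overflows.
        self.pending = deque(maxlen=max_size)
        self.history = []  # conversation history visible to the model

    def on_local_event(self, message: str, auto_respond: bool):
        if auto_respond:
            # Proactive response: inject now; inference starts immediately.
            self.history.append(("internal", message))
        else:
            # Context injection: hold until the next user message.
            self.pending.append(message)

    def on_user_message(self, text: str):
        # Flush queued events as hidden context, then add the user turn.
        while self.pending:
            self.history.append(("internal", self.pending.popleft()))
        self.history.append(("user", text))
```

Note that with `auto_respond: false`, nothing reaches the model until the user speaks again, which is why context-injection events never trigger extra model calls.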
+ +## Best Practices + +#### Use context injection for ambient state + +For example: + +- Updated cart contents +- Current page context +- Background sync results + +These make future replies smarter without causing unsolicited responses. + +#### Use proactive response for time-sensitive events + +For example: + +- Threshold alerts +- Task completion +- Failed jobs +- Incoming high-priority updates + +These are the moments where an immediate assistant response is justified. + +#### Keep event data structured + +If an event carries payload data, use a stable, well-designed data type. The generated message ends up calling `toMap()`, so clearer payload structure produces clearer AI context. + +#### Do not flood the queue + +If a background signal can fire rapidly, consider batching it before triggering the event. The queue has a hard cap of 50 messages. + +## Examples + +#### Cart awareness without interruption + +Use a `CartUpdated` local event with `auto_respond: false`. + +Each cart update quietly enriches the pending context so the next time the user asks, "What's in my cart?" the model already has the latest state. + +#### Immediate alerting + +Use a `TemperatureAlert` local event with `auto_respond: true`. + +When the event fires, GenUI immediately triggers inference and the model can warn the user and render a supporting UI component if the catalog contains one. \ No newline at end of file diff --git a/docs/ff-concepts/agentic-ai/genui/component-catalog.md b/docs/ff-concepts/agentic-ai/genui/component-catalog.md new file mode 100644 index 00000000..a5cab5fc --- /dev/null +++ b/docs/ff-concepts/agentic-ai/genui/component-catalog.md @@ -0,0 +1,137 @@ +--- +slug: /concepts/component-catalog +title: Component Catalog +description: Configure the FlutterFlow components that GenUI is allowed to render inside the chat surface. 
+tags: [AI, Chat, Components, Widgets] +sidebar_position: 2 +keywords: [FlutterFlow, Components, GenUI, Conversational AI, Chat widget, AI agent, A2UI protocol, Component rendering, Tool calling] +--- + +# Component Catalog + +The **Component Catalog** is the list of FlutterFlow components that GenUI can render inline in the conversation. Without a catalog, GenUI can still chat and call tools, but it has no specific UI to render. + +Internally, GenUI creates documentation for each catalog component. That documentation includes: + +- Component name +- Component description +- Parameter names +- Parameter types +- Required or optional status +- Parameter descriptions + +The model's render decisions are only as good as the naming and descriptions you provide. + +## Component Requirements + +#### The component must be serializable at the API boundary + +Catalog components cannot expose **action parameters**. GenUI only knows how to pass structured data into the component, not callbacks or arbitrary closures. + +#### Parameters should use supported types + +Supported parameter categories in the generated catalog pipeline include: + +- `String` +- `int` +- `double` +- `bool` +- `Color` +- `DateTime` +- `TimestampRange` +- `LatLng` +- `GooglePlace` +- `JSON` +- `DataStruct` +- `Enum` +- media-path string types such as `ImagePath`, `VideoPath`, `AudioPath`, and `MediaPath` +- `List` of supported item types + +#### Required complex parameters need explicit defaults + +If a catalog parameter is non-nullable and uses one of these complex types: + +- `Color` +- `DateTime` +- `TimestampRange` +- `LatLng` +- `GooglePlace` +- `DataStruct` +- `JSON` + +then you should either: + +- set an explicit default value, or +- make the parameter optional + +For instance, if your **EventCard** component has a required `eventDate: DateTime` parameter, you must either set a default value in the component editor or make the parameter optional. 
Without this, GenUI validation will reject the component.

GenUI validation enforces this because those types do not have a safe implicit fallback in generated constructor code.

## Runtime Rules

#### One root component per surface

Each GenUI surface renders exactly one catalog component as its root. That root component can be a rich component tree internally, but the model cannot compose arbitrary parent wrappers like `Column`, `Container`, or other widgets that are not in the catalog.

#### The model can only use listed catalog components

If a component is not in the catalog, it does not exist from the model's perspective.

## Best Practices

#### Use list-friendly components

Because a surface has one root component, a component that accepts a `List` is often the right shape for result sets:

- `TransactionList`
- `SearchResultsGrid`
- `CartItemsSummary`

#### Prefer focused components over screen-sized composites

Good catalog components are reusable units, such as:

- `ProductCard`
- `OrderSummary`
- `InvoicePreview`
- `ReviewSummary`
- `AppointmentConfirmation`

These give the model flexible building blocks. A large page-like component is harder to reuse and usually harder for the model to choose well.

#### Use consistent `DataStruct` across tools and components

If a tool returns `ProductStruct`, prefer catalog components that also accept `ProductStruct` or `List<ProductStruct>`. That keeps tool output and rendering input aligned and makes the tool-to-UI handoff more reliable.

#### Describe parameters like you are documenting an API

Good:

- `estimatedDeliveryDate`: "Expected arrival date in ISO 8601 format."
- `inventoryStatus`: "Availability state shown to the user, such as inStock or backOrdered."

Weak:

- `date`
- `status`

#### Keep component names specific

Use clear, descriptive names that reflect the component’s purpose.
+ +Good: + +- `OrderStatusCard` +- `SensorAlertSummary` +- `QuoteBreakdown` + +Weak: + +- `Card1` +- `Summary` +- `Details` + +#### Avoid ambiguous overlap + +If two components do roughly the same thing, the model has to guess. Either merge them, rename them more clearly, or narrow their intended use. \ No newline at end of file diff --git a/docs/ff-concepts/agentic-ai/genui/genui.md b/docs/ff-concepts/agentic-ai/genui/genui.md new file mode 100644 index 00000000..6d5794ed --- /dev/null +++ b/docs/ff-concepts/agentic-ai/genui/genui.md @@ -0,0 +1,430 @@ +--- +slug: /concepts/genui-chat +title: GenUI Chat +description: Add a conversational AI surface to your FlutterFlow app that can render catalog components, call action blocks as tools, and react to local app events. +tags: [AI, Chat, Conversational UI] +sidebar_position: 1 +keywords: [FlutterFlow, GenUI, Conversational AI, Chat widget, AI agent, A2UI protocol, Component rendering, Tool calling +] +--- + +# GenUI Chat + +Usually, applications follow a fixed model: developers design screens, define navigation, and hard-code interactions. Users are limited to these predefined flows, and anything outside those paths simply isn’t supported. + +With GenUI, your app provides agent-driven experiences. Instead of relying on rigid flows, an AI agent can assemble user journeys dynamically in real time. Developers no longer need to predict every scenario. Instead, they define the building blocks and the AI orchestrates them into meaningful, context-aware experiences for the user. + +This represents a fundamental shift, from building fixed applications to building flexible capabilities that an agent can compose on demand. Think of it as building the components, and AI decides when to use them. + +**Traditional App:** The user clicks 'View Order' → navigates to `OrderDetailPage` → sees order info + tracking + items list. The flow is fixed, and every interaction must be pre-built. 
**With GenUI:** Build `OrderSummaryCard`, `TrackingStatusCard`, `OrderItemsList` as separate components. Build `getOrderDetails` as a tool. The AI decides what to show based on what the user asks.

For example, a user asks, “Show my recent orders.” Instead of responding with text, the agent renders **order card components** with details like items, price, and delivery status. The user then asks, “Where is my latest order?” Now, instead of showing another block of text, the agent switches to a **map component** to display the live delivery location. This demonstrates how the agent dynamically selects the most relevant UI component based on the user’s intent.

![personal-shopper.avif](imgs/personal-shopper.avif)

:::tip[GenUI is not a chatbot]

GenUI may look like a chat interface, but it is fundamentally different from traditional chatbots. Instead of responding with text messages, the AI renders real UI components, such as cards, lists, forms, and maps, directly in the interface. Users don’t just read responses; they interact with fully functional UI.

This means GenUI is not about conversations; it’s about dynamically composing application experiences using your actual app components.
:::

:::note
This doesn’t replace traditional UI. Navigation, dashboards, and structured flows still play an important role. GenUI introduces a **new layer** — dynamic, adaptive, and conversational — that handles the long tail of use cases traditional interfaces can’t efficiently cover.
:::

## GenUI Is Built on A2UI

GenUI is FlutterFlow's implementation of [**A2UI (Agent-to-UI)**](https://a2ui.org/), an [**open project by Google**](https://github.com/google/A2UI) that defines a declarative UI protocol for agent-driven interfaces. A2UI allows AI agents to generate rich, interactive UIs that render natively across platforms (web, mobile, desktop) without executing arbitrary code.
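The safety property behind this design is that the agent only sends *data describing* UI, and the client resolves it against a local catalog. The following Python sketch illustrates that idea conceptually; it is not the actual A2UI message schema, and the names are hypothetical.

```python
# Conceptual sketch of the declarative pattern behind A2UI: the agent emits
# a data payload naming a component, and the client looks it up in a local
# catalog. Unknown components are rejected, so no arbitrary code ever runs.
# Illustrative only; this is not the real A2UI schema.

CATALOG = {
    "ProductCard": lambda props: f"<ProductCard name={props['name']!r}>",
}

def render(message: dict) -> str:
    component = message["component"]
    if component not in CATALOG:
        # The agent can only use what the catalog provides.
        raise ValueError(f"unknown component: {component}")
    return CATALOG[component](message["props"])
```

Because rendering is a catalog lookup rather than code execution, the same agent payload can render natively on any platform that ships the catalog.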
## Three Pillars of GenUI

GenUI introduces three core pillars that work together to transform your app into an agent-driven experience:

**1. Component Catalog:** Instead of replying with plain text, the AI uses your FlutterFlow components, such as product cards, booking tiles, or dashboards, to present information directly in the interface. Users don’t just read text; they interact with real UI.

**2. Tools:** Your existing FlutterFlow action blocks become capabilities the AI can use. Whether it’s fetching data, calling APIs, submitting forms, or triggering workflows, the AI can execute these actions and use the results instantly. It moves beyond conversation and starts performing real tasks inside your app.

**3. App Event Integration:** Your app’s events provide real-time context to the AI. Things like user actions, state changes, or backend updates can trigger responses. With auto-response enabled, the AI doesn’t wait for input; it proactively reacts and updates the experience as things happen.

![three-pillars.avif](imgs/three-pillars.avif)

## Adding GenUI

Let’s walk through how to add a GenUI Chat by building a simple product lookup assistant. Follow the steps below:

1. Make sure you’ve completed the [Firebase integration](../../../ff-integrations/firebase/connect-to-firebase-setup.md), including the [initial setup](../../../ff-integrations/authentication/firebase-auth/auth-initial-setup.md) and configuration files.
2. Go to **Firebase Console > AI Logic** and enable it. GenUI is powered by **Google Gemini** via [**Firebase AI Logic**](https://firebase.google.com/products/firebase-ai-logic) and uses a **usage-based pricing model**. You can get started on the **Spark (free)** plan for testing and low usage, but for production or higher usage, you’ll need to upgrade to the **Blaze (pay-as-you-go)** plan, where costs depend on AI requests and token usage.
+ :::tip + We recommend monitoring your usage in the Firebase Console, setting up budget alerts to avoid unexpected charges, and upgrading to Blaze before moving to production. + ::: +3. In your FlutterFlow project, create a **`ProductListCard`** component, which displays product details such as the image, name, and description. This component accepts a parameter of Data Type **`Product`**. +4. Create an Action Block named **`getProductDetails`**, which retrieves the details of a single product and returns it as a **`Product`** data type. +5. Place the **GenUI Chat** widget on a page or component like any other FlutterFlow widget. +6. Go to the Properties panel and define domain instructions to guide how the assistant behaves and communicates in your app. These instructions help the AI understand your app’s context, tone, and what it should prioritize. If left empty, it defaults to a generic assistant that builds UI in response to user requests. + + **Example System Prompt:** + `You are a helpful AI shopping assistant for an e-commerce app. Help users discover products, compare options, track orders, and complete purchases.` + +7. Select the components that the AI is allowed to render in responses. For this example, select the `ProductListCard` component created in step 3. To learn how to configure components for GenUI, refer to the [Component Catalog](component-catalog.md) documentation. +8. If needed, add the [Action Blocks](../../../resources/control-flow/functions/action-blocks.md) that the AI can call. For this example, select the action block named `getProductDetails`, created in step 4. Note that only Action Blocks that return a value can be added. To learn how to configure them for GenUI, refer to the [Tools Configuration](tools-configuration.md) documentation. +9. If needed, choose Local [App Events](../../app-events/app-events.md) to connect to the conversation. 
To learn how to configure app events for GenUI, refer to the [App Events Integrations](app-event-integrations.md) documentation. + +
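The key contract in this walkthrough is that the tool's output type matches the component's input type: `getProductDetails` returns a `Product`, and `ProductListCard` accepts a `Product`. A conceptual Python sketch of that alignment, with hypothetical field names and sample values standing in for the real Action Block and component:

```python
from dataclasses import dataclass

# Illustrative model of the walkthrough's data contract; Python stand-ins
# for FlutterFlow artifacts, not generated code. Field names are assumptions.
@dataclass
class Product:
    name: str
    description: str
    image_url: str

def get_product_details(product_id: str) -> Product:
    # Stand-in for the getProductDetails Action Block's real lookup
    # (API call, database query, etc.). Sample data is hypothetical.
    return Product(
        name="Trail Shoes",
        description="Lightweight hiking shoes",
        image_url="https://example.com/shoes.png",
    )

def product_list_card(product: Product) -> str:
    # Stand-in for the ProductListCard component: it consumes the same type
    # the tool returns, so the AI can pipe one straight into the other.
    return f"{product.name}: {product.description}"
```

Because both sides share one `Product` type, the model has an unambiguous path from "call the tool" to "render the component".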
+ +
+

+ +### Customization + +You can fully customize the chat interface using the following options available in the Properties panel: + +- **Layout & container:** Background, border radius, padding, message spacing, and max message width +- **Header:** Visibility, title, background color, and text color +- **Avatars:** Visibility, size, and image sources for both user and AI +- **Message bubbles:** Background colors, text colors, and border radii for user and AI messages +- **Input field:** Placeholder text, background, border radius, and padding +- **Send button:** Icon and background styling +- **Welcome state:** Visibility, title, and subtitle shown when the chat is empty +- **Scrolling behavior:** Auto-scroll to new messages and animation duration +- **Thinking/status message:** Text displayed while the AI is generating a response + +**Default Behavior:** + +- Header is shown by default +- Avatars are enabled by default +- Auto-scroll is enabled +- Input placeholder defaults to “Type a message…” +- Thinking message defaults to “Thinking…” +- Welcome state is shown when there are no messages + +
+ +
+

+ +## Examples + +#### 1. Customer Support Agent + +**Traditional Approach:** Build a help center with FAQ pages, a ticket form, and a chatbot that matches keywords to canned responses. + +**GenUI Approach:** + +- **Catalog Components:** TicketStatusCard, FAQArticle, EscalationForm, SatisfactionSurvey, AgentContactCard +- **Tools:** `lookupTicket(ticketId)`, `searchKnowledgeBase(query)`, `createTicket(details)`, `getCustomerHistory(customerId)` +- **App Events:** `NewTicketUpdateEvent` (auto-respond) when a support ticket is updated in the backend, the AI proactively informs the user + +A user opens the support chat. They describe their issue in natural language. The AI searches the knowledge base using the tool, finds a relevant article, and renders it as a FAQ Article component. If that does not resolve the issue, the AI creates a ticket using `createTicket`, shows the TicketStatusCard with the new ticket ID, and says it will notify them of updates. Later, when the support team updates the ticket, a `NewTicketUpdateEvent` fires, and the AI proactively shows the updated TicketStatusCard with the resolution. + +The developer did not build a "ticket lookup flow" or a "knowledge base search screen." They built components and tools. The AI composed the journey. + +#### 2. E-Commerce Personal Shopper + +**Traditional Approach:** Build product listing pages, filters, a search bar, a comparison tool, a cart, and a checkout flow. 
+ +**GenUI Approach:** + +- **Catalog Components:** ProductCard, ComparisonTable, PriceHistoryChart, ReviewSummary, CartSummary, PromoCodeBanner +- **Tools:** `searchProducts(query,filters)`, `getProductDetails(productId)`, `getReviews(productId)`, `addToCart(productId,quantity)`, `applyPromoCode(code)`, `getPriceHistory(productId)` +- **App Events:** `CartUpdatedEvent` (context injection) keeps the AI aware of what is already in the cart; `FlashSaleEvent` (auto-respond) alerts the user about time-sensitive deals + +A user says, "I need a gift for my dad who likes woodworking and coffee." The AI searches products, shows a curated set of ProductCards, and when the user shows interest in a specific item, pulls up the ReviewSummary and PriceHistoryChart. The AI knows what is in the cart (via CartUpdatedEvent context) and can suggest complementary items. When a flash sale starts on a relevant product, the AI proactively shows the PromoCodeBanner. + +No search results page. No filter sidebar. No "compare" button. The AI built a personalized shopping experience from the components and tools available to it. + +## Current Limitations + +Here are some important limitations and considerations to keep in mind: + +- The only supported backend today is **Firebase AI Logic**. +- App event listeners currently work only with **LOCAL** app events. +- Catalog components cannot expose action parameters. +- Avatar images must be valid network URLs (local asset paths are not supported). +- Each rendered surface supports only a single catalog component as its root. + +## Best Practices + +#### Describe Everything + +The AI reads your component and parameter descriptions to decide what to render and what values to provide. The quality of your descriptions directly impacts the quality of the AI's responses. 
+ +- Name components clearly: `ProductCard` not `Card1` +- Name parameters descriptively: `estimatedDeliveryDate` not `date` +- Add descriptions to parameters: "The product's price in USD" not just "price" +- Add descriptions to action blocks: "Searches the product catalog and returns matching items with prices and availability" not just "search" + +The AI is only as smart as the vocabulary you give it. + +#### Design for Composition + +Components and tools work best when they are designed to be composed: + +- **Retrieval Tool + Display Component:** `getOrderDetails()` returns an `OrderStruct` -> `OrderStatusCard` accepts an `OrderStruct` as a parameter. The AI calls the tool and passes the result to the component. +- **Granular Over Monolithic:** A `ProductCard`, `ReviewSummary`, and `PriceChart` give the AI three options. A single `ProductDetailPage` component gives the AI one. +- **Consistent Data Types:** Use the same DataStruct across related tools and components. If `searchProducts` returns `ProductStruct`, make `ProductCard` accept `ProductStruct`. + +#### Use Events for Temporal Awareness + +App events give the AI a sense of time and change. Without them, the AI only knows what the user tells it. With them, the AI knows what is happening. + +- Use **auto_respond: false** for continuous state awareness, such as user navigation, preference changes, background data updates. +- Use **auto_respond: true** for time-sensitive signals, such as alerts, completions, threshold breaches, incoming messages. + +#### Write System Prompts Like Onboarding Documents + +The system prompt is the AI's job description. Write it like you are onboarding a new team member: + +- What is their role? +- What domain should they know about? +- What should they prioritize? +- What should they never do? +- What tone should they use? +- What business rules must they follow? + +A great system prompt makes the difference between a useful assistant and a generic chatbot. 
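Applied to the shopping assistant from the walkthrough, an onboarding-style prompt that answers each of those questions might look like the following. This is an illustrative example; adapt the store details and rules to your own app.

```text
Role: You are a shopping assistant for an outdoor-gear store.
Domain: You know the product catalog, order statuses, and the return policy.
Priorities: Help users find, compare, and buy products. Prefer rendering
product components over long text answers.
Never: Quote prices or stock levels from memory. Always call a tool for
current data.
Tone: Friendly and concise.
Business rules: Only offer discounts surfaced by the promotions tool.
```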
## Behind the Scenes

GenUI is powered by [**Firebase AI Logic**](https://firebase.google.com/products/firebase-ai-logic) (Google Gemini) as its LLM backend. At a high level, the system works as follows:

**Your configuration → code generation → runtime widget powered by Firebase AI Logic and the [GenUI](https://pub.dev/packages/genui) package**.

You define components, tools, and events in FlutterFlow, and GenUI automatically generates the necessary code and runtime behavior to render dynamic UI experiences.

## FAQs
+ +The widget builds but the AI only sends text + + +

+Check the catalog first. If no component fits the request, text is the expected fallback. Also, confirm that your system prompt and component descriptions make it clear when each component should be used. +

+
+ +
+ +I can't add a component to the catalog + + +

+The most common causes are: + +- The component has an action parameter. +- A required complex parameter is missing a default value. +- The component was deleted or renamed after being configured. +

+
+ +
+ +I can't add an Action Block as a tool + + +

+The Action Block must return a value, and every parameter, plus the return type, must be supported by the tool serializer. +

+
+ +
+ +My event listener is not firing + + +

+Make sure the following are correctly set: + +- The event is LOCAL scope. +- The right event is being triggered at runtime. +- `auto_respond` is set the way you expect. +

+
+ +
+ +Why does a component fail validation? + + +

+Common reasons include: + +- It has an action parameter. +- It is configured twice in the same catalog. +- A required complex parameter is missing a default value. +- The configured component no longer exists. +

+
+ +
+ +Why is the model choosing the wrong component? + + +

+Usually one of these is true: + +- The names are too generic. +- Parameter descriptions are weak. +- Multiple catalog components overlap too much in purpose. +- The system prompt does not explain how the assistant should prioritize them. +

+
+ +
+ +Can the model render multiple items? + + +

+Yes, but the reliable pattern is to use a single catalog component that accepts a list rather than expecting the model to assemble multiple independent sibling components on its own. +

+
+ +
+ +Why is the model not calling a tool? + + +

+Usually, the issue is not codegen. It is tool discoverability: + +- The name is vague +- The description is weak +- The system prompt does not make it clear when the tool should be used +- The model already has enough context to answer without calling it +

+
+ +
+ +What happens when a tool fails? + + +

+The generated tool code catches the exception, clears the loading state, and sends an error payload back to the model. The UI should remain stable, and the model can decide how to explain or recover. +

+
+ +
+ +Why can't I select my event? + + +

+The event must be LOCAL scope and must still exist in the project or dependency where it was defined. +

+
+ +
+ +Why didn't the assistant respond immediately? + + +

+Check the following: + +- whether `auto_respond` is actually `true` +- whether the event is being triggered +- whether the system prompt tells the model to react visibly + +Note: Even with immediate inference, not every event will result in a visible response. +

+
+ +
+ +Why does the assistant only react on the next user message? + + +

+That is the expected behavior for `auto_respond: false`. The listener queues hidden context instead of triggering a separate inference call. +

+
+ +
+ +Can one GenUI widget listen to the same event twice? + + +

+No. Duplicate listeners for the same event on the same widget are rejected during validation. +

+
+ +
+ +Do conversations persist across app restarts? + + +

+No. Conversations do not persist across app restarts. If a user closes and reopens the app, the chat history is reset. +

+
+ +
+ +Can I choose the Gemini model or adjust parameters like temperature? + + +

+GenUI uses Firebase AI Logic, which manages the underlying Gemini model and its configuration. At the moment, you cannot directly select specific model variants or adjust parameters like temperature or top_p. The system is designed to provide a simplified, managed experience without requiring manual tuning. +

+
+ +
+ +What happens when Firebase AI Logic quota or rate limits are exceeded? + + +

If you exceed Firebase AI Logic or Gemini free-tier limits, requests will fail with a 429 quota-exceeded error. This typically means you’ve hit limits such as requests per minute or free-tier usage caps. In some cases, the error will include a retry time, after which you can try again. While the Spark plan works for testing, it is subject to strict free-tier limits, so for higher usage or production apps, you should expect to upgrade to a paid plan and monitor usage closely.

+
diff --git a/docs/ff-concepts/agentic-ai/genui/imgs/personal-shopper.avif b/docs/ff-concepts/agentic-ai/genui/imgs/personal-shopper.avif new file mode 100644 index 00000000..9bb477e0 Binary files /dev/null and b/docs/ff-concepts/agentic-ai/genui/imgs/personal-shopper.avif differ diff --git a/docs/ff-concepts/agentic-ai/genui/imgs/three-pillars.avif b/docs/ff-concepts/agentic-ai/genui/imgs/three-pillars.avif new file mode 100644 index 00000000..ba19c51a Binary files /dev/null and b/docs/ff-concepts/agentic-ai/genui/imgs/three-pillars.avif differ diff --git a/docs/ff-concepts/agentic-ai/genui/tools-configuration.md b/docs/ff-concepts/agentic-ai/genui/tools-configuration.md new file mode 100644 index 00000000..241940e5 --- /dev/null +++ b/docs/ff-concepts/agentic-ai/genui/tools-configuration.md @@ -0,0 +1,155 @@ +--- +slug: /concepts/tools +title: Tools Configuration +description: Expose Action Blocks to GenUI so the model can fetch data, run workflows, and use real results in its responses. +tags: [AI, Chat, Actions] +sidebar_position: 3 +keywords: [FlutterFlow, Actions, Actions Block, GenUI, Conversational AI, Chat widget, AI agent, A2UI protocol, Component rendering, Tool calling] +--- + +# Tools Configuration + +In GenUI, **Tools** are Action Blocks that the model can call during a conversation. A tool is appropriate when the model needs fresh data or needs to perform work before it can answer. + +Common uses: + +- Query APIs or databases +- Run calculations +- Fetch records by ID +- Transform structured data +- Trigger a workflow that still returns a useful result + +:::warning + +If the Action Block does not return anything, it cannot be used as a GenUI tool. + +::: + +For each tool, GenUI includes: + +- Function name +- Description +- Parameters +- Required or optional status +- Parameter descriptions +- Return type +- Return description + +That means the Action Block name and description matter. They are part of the tool-selection signal the model sees. 
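To make that concrete, the information listed above ends up looking something like a Gemini-style function declaration. The shape below is a hypothetical illustration, not FlutterFlow's exact generated schema; the descriptions are the ones the model reads when deciding whether to call the tool.

```python
# Hypothetical shape of the documentation generated for a tool, loosely
# following Gemini-style function declarations. Illustrative only; not
# FlutterFlow's exact generated schema.
get_order_details_tool = {
    "name": "getOrderDetails",
    "description": (
        "Retrieves the current order status, tracking number, "
        "and ETA for a given order ID."
    ),
    "parameters": {
        "type": "object",
        "properties": {
            "orderId": {
                "type": "string",
                "description": "The ID of the order to look up.",
            },
        },
        "required": ["orderId"],
    },
    "returns": {
        "type": "object",
        "description": "An OrderStruct with status, tracking, and ETA.",
    },
}
```

Everything the model knows about the tool is in this declaration, which is why vague names and weak descriptions directly degrade tool selection.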
+ +:::note +If a tool throws an exception, the error is caught and sent back to the model as a structured error payload. The UI remains stable and the model can explain the failure or suggest alternatives. +::: + +## Tool Requirements + +#### The Action Block must return a value + +Tools are designed around request/response semantics. No return value means nothing meaningful can be sent back to the model. + +#### Parameter and return types must be supported + +Supported tool types include: + +- `String` +- `int` +- `double` +- `bool` +- `Color` +- `DateTime` +- `TimestampRange` +- `LatLng` +- `GooglePlace` +- `JSON` +- `DataStruct` +- `Enum` +- media-path string types such as `ImagePath`, `VideoPath`, `AudioPath`, and `MediaPath` +- list forms of the same supported types + +Unsupported types are rejected during validation. + +#### Duplicate tools are not allowed on the same widget + +Configuring the same Action Block twice on one GenUI widget is treated as an error. + +## Loading Messages + +Each tool can define its own loading message in the widget configuration. + +- If set, that message is shown while the tool runs. +- If omitted, the generated tool uses `Processing...`. + +This is separate from the widget-level thinking message, which defaults to `Thinking...` and is shown before the tool call starts. + +## Serialization Rules + +The generated code serializes common FlutterFlow data types into model-friendly JSON: + +- **Color**: CSS color string. e.g., `Color(0xFF4CAF50)` → `"#4CAF50"` +- **DateTime**: ISO 8601 string. e.g., `DateTime(2024, 3, 15)` → `"2024-03-15T00:00:00.000"` +- **TimestampRange**: start|end milliseconds string. e.g., `TimestampRange(1700000000000, 1700086400000)` → `"1700000000000|1700086400000"` +- **LatLng**: serialized string form. e.g., `LatLng(37.7749, -122.4194)` → `"37.7749,-122.4194"` +- **GooglePlace**: serialized place payload (JSON object with place details) +- **DataStruct**: converted using `toMap()`. 
e.g., `Product(name: "Shoes", price: 99)` → `{ "name": "Shoes", "price": 99 }`
- **Enum**: serialized enum string. e.g., `OrderStatus.delivered` → `"delivered"`


## Best Practices

#### Keep tools focused

Prefer small, specific tools:

- `getOrderDetails`
- `searchProducts`
- `getWeatherForLocation`
- `calculateQuote`

over broad tools like:

- `handleRequest`
- `fetchData`
- `processWorkflow`

#### Write descriptions for model behavior, not just for humans

Good:

`Retrieves the current order status, tracking number, and ETA for a given order ID.`

Weak:

`Looks up an order.`

#### Return structured data when possible

If the output can be represented as a Custom Data Type (`DataStruct`), do that instead of flattening everything into strings. Structured output is easier for the model to feed into catalog components.

#### Match tool output to catalog input

Reliable GenUI setups usually follow this shape:

- A tool returns `OrderStruct`
- A catalog component accepts `OrderStruct`

That gives the model a clean path from retrieval to rendering.

## Common Examples

#### Data lookup

`getOrderDetails(orderId: String) -> OrderStruct`

The model calls the tool, gets a structured order result, and renders an order summary component.

#### Search

`searchProducts(query: String, maxPrice: double?) -> List<ProductStruct>`

The model calls the tool and then renders a list-style catalog component using the returned products.

#### Calculation

`calculateMonthlyPayment(amount: double, rate: double, termMonths: int) -> PaymentQuoteStruct`

The model uses the result to explain the output and optionally render a quote component. \ No newline at end of file
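A calculation tool like `calculateMonthlyPayment` would typically implement the standard amortization formula, M = P·r(1+r)^n / ((1+r)^n − 1). Here is a minimal Python sketch of that math, assuming `rate` is an annual percentage; it illustrates what such an Action Block might compute, not FlutterFlow-generated code.

```python
def calculate_monthly_payment(amount: float, rate: float, term_months: int) -> float:
    """Standard amortization formula. `rate` is the annual rate in percent.

    Illustrative sketch of the math behind such a tool, not generated code.
    """
    monthly_rate = rate / 100 / 12
    if monthly_rate == 0:
        # Zero-interest edge case: principal divided evenly across the term.
        return amount / term_months
    factor = (1 + monthly_rate) ** term_months
    return amount * monthly_rate * factor / (factor - 1)
```

For example, a $100,000 loan at 6% over 360 months works out to roughly $599.55 per month, which the model could then explain or pass into a quote component.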