Conversation
@pinkeshmars I just had a glance and I think this should belong directly under the App Events folder, since it falls under event-driven paradigms. We shouldn't create new folders in the Concepts section for new features unless they don't fit anywhere else.
No problem. I moved it under the App Events folder now.
ayushflow
left a comment
GenUI Chat Documentation Review
Overall this is solid documentation for a major feature — the writing is clear, the best practices section is genuinely excellent, and the FAQ coverage is thorough. The main issues are structural (wrong nesting, audience mismatch in the architecture section) and a gap in the "paradigm shift" narrative that this feature deserves.
Top priorities
- Restructure nesting — GenUI is a widget/capability, not a subcategory of App Events
- Lead with the paradigm shift — the "Build Primitives, Not Paths" framing should open the doc, not be buried in best practices
- Add one end-to-end tutorial — the single biggest gap; both examples are aspirational, not instructional
- Move architecture internals to a separate page or collapsible section
- Fill info gaps — model identity in setup, cost implications, conversation persistence, debugging guidance
Detailed comments on specific lines below.
| # GenUI Chat |
Structure: GenUI shouldn't live under app-events/
GenUI is a widget/capability that uses app events as one of its three pillars — it's not an app event concept itself. This nesting implies GenUI is a subcategory of app events, which undersells the feature and confuses the information architecture. GenUI should be a top-level concept under ff-concepts/ (sibling to app-events, not a child of it).
Done! Moved it under the new location: ff-concepts/agenti-ai/gen-ui
| Usually, applications follow a fixed model: developers design screens, define navigation, and hard-code interactions. Users are limited to these predefined flows, and anything outside those paths simply isn’t supported. |
The paradigm shift is undersold
The intro sketches the vision in two paragraphs then jumps to a mechanical example. The "Build Primitives, Not Paths" framing (currently buried in Best Practices at line 175) is the clearest articulation of the paradigm shift — that before/after comparison should be the opening of this doc, not a best practice.
Also missing: an explicit differentiation from chatbots. Most developers have seen chat widgets. GenUI is fundamentally different — the AI renders real UI components, not text bubbles. A reader skimming might think "oh, another chatbot widget." Even a single callout saying "GenUI is not a chatbot" would immediately reframe expectations.
Done! The "GenUI is not a chatbot" part is really necessary here.
| :::note | ||
| This doesn’t replace traditional UI. Navigation, dashboards, and structured flows still play an important role. GenUI introduces a **new layer,** dynamic, adaptive, and conversational, that handles the long tail of use cases traditional interfaces can’t efficiently cover. |
nit: "dynamic, adaptive, and conversational, that handles" — the comma after "conversational" creates a comma splice. Should be an em-dash:
GenUI introduces a **new layer** — dynamic, adaptive, and conversational — that handles the long tail...
| Follow the steps below to add GenUI Chat to your app: | ||
| 1. Make sure you’ve completed the [Firebase integration](../../../ff-integrations/firebase/connect-to-firebase-setup.md), including the [initial setup](../../../ff-integrations/authentication/firebase-auth/auth-initial-setup.md) and configuration files. | ||
| 2. Go to **Firebase Console > AI Logic** and enable it. |
The user doesn't know what model they're using
Step 2 says "enable AI Logic" but the reader has no idea what LLM powers this until they reach the Architecture section at line 156 (where it's mentioned parenthetically). This is the setup section — it should say upfront:
GenUI is powered by Google Gemini via Firebase AI Logic.
Also missing: any mention of Firebase Blaze plan requirements or cost implications. Users will discover this when they hit billing errors, which is a bad first experience.
| <p></p> | ||
| ### Customization | ||
Customization section has no fallback content
This section is just an Arcade embed with no text explanation of what can be customized. If the embed fails to load (corporate firewalls, reader is offline, Arcade goes down), the reader gets a heading and nothing else. Add at least a bullet list of the customizable properties (colors, avatars, header text, input placeholder, spacing, etc.).
| - Navigation context | ||
| - Device or sensor updates | ||
| The runtime listens on `FFAppEventService.instance.localEventsStream` and converts matching events into hidden `InternalMessage`s. |
Implementation detail in user-facing docs
FFAppEventService.instance.localEventsStream is an internal class/stream name that users never interact with. This line should describe the behavior, not the implementation:
GenUI automatically listens for matching local events and converts them into hidden context messages for the conversation.
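If the page still wants to illustrate the mechanism, a hedged behavior-level sketch could accompany that sentence. Everything in this snippet (`ContextMessage`, `onLocalEvent`, the event names) is hypothetical and not real FlutterFlow API:

```dart
// Hypothetical illustration of the described behavior; none of these
// names are real FlutterFlow APIs.
class ContextMessage {
  final String text;
  final bool hidden; // hidden messages inform the model but are never rendered
  ContextMessage(this.text, {this.hidden = false});
}

final conversation = <ContextMessage>[];

// Events the developer has configured listeners for.
const listenedEvents = {'cart_updated', 'page_changed'};

void onLocalEvent(String name, Map<String, Object?>? payload) {
  if (!listenedEvents.contains(name)) return; // ignore unconfigured events
  // Matching events become hidden context messages, not visible chat bubbles.
  conversation.add(
    ContextMessage('Event "$name" fired with data: $payload', hidden: true),
  );
}
```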
Done! Updated to your suggestion
| ## Message Construction | ||
| Each listener has a `message_template`. GenUI resolves it in this order: |
Message Construction section is confusing
"FlutterFlow variable binding, if configured" and "Literal input value, if configured" are unexplained. How does the user configure a variable binding vs. a literal value? What does this look like in the Properties panel? A screenshot or brief description of the configuration UI would make this actionable.
Also line 39: Event data: ${event.data?.toMap()} — showing Dart template syntax is unhelpful for users. Reframe as behavior:
If the event carries payload data, GenUI automatically appends it to the message sent to the model.
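A behavior-level sketch of the fallback chain might also make the order concrete for readers. The parameter names and the exact precedence here are assumptions that still need verification against the runtime:

```dart
// Assumed resolution order, to be verified: variable binding first, then the
// literal value, with event payload data appended when present.
String resolveMessage({
  String? boundVariableValue, // FlutterFlow variable binding, if configured
  String? literalValue, // literal input value, if configured
  Map<String, Object?>? eventPayload,
}) {
  final base = boundVariableValue ?? literalValue ?? '';
  if (eventPayload == null || eventPayload.isEmpty) return base;
  // Payload data is appended automatically for the model.
  return '$base\nEvent data: $eventPayload';
}
```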
I think this section needs a short rewrite. I didn’t get a chance to fully test the event flow and verify how the message is received. Could you review the updated explanation and let me know if it’s accurate?
| ## Serialization Rules | ||
| The generated code serializes common FlutterFlow data types into model-friendly JSON: |
nit: The serialization rules would be more useful with at least one concrete example showing the actual JSON output:
Color(0xFF4CAF50) → "#4CAF50"
DateTime(2024, 3, 15) → "2024-03-15T00:00:00.000"
Right now a reader knows that Color becomes a CSS string, but not what it looks like.
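The suggested examples could even be backed by a small illustrative snippet in the doc. The helper name `serializeColor` is made up for this sketch; the real generated code may differ:

```dart
// Illustrative only; confirm the exact rules against the generated code.
String serializeColor(int argb) {
  // Drop the alpha byte and format as a CSS-style hex string.
  final rgb = argb & 0xFFFFFF;
  return '#${rgb.toRadixString(16).padLeft(6, '0').toUpperCase()}';
}

void main() {
  print(serializeColor(0xFF4CAF50)); // #4CAF50
  print(DateTime(2024, 3, 15).toIso8601String()); // 2024-03-15T00:00:00.000
}
```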
| List return values are serialized item-by-item using the same rules. | ||
| ## Best Practices | ||
Missing: error handling behavior
What happens when a tool throws an exception at runtime? This is only documented in the main page's FAQ ("What happens when a tool fails?"). Since this is the dedicated tools page, it should mention the error handling behavior here — even a single sentence:
If a tool throws an exception, the error is caught and sent back to the model as a structured error payload. The UI remains stable and the model can explain the failure or suggest alternatives.
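If it helps, that sentence could be paired with a hedged sketch of the behavior. `runTool` is an illustrative name, not the actual runtime API:

```dart
// Illustrative only: a failing tool returns a structured error payload
// to the model instead of crashing the chat UI.
Future<Map<String, Object?>> runTool(Future<Object?> Function() tool) async {
  try {
    return {'result': await tool()};
  } catch (e) {
    // The model receives the error and can explain it or suggest alternatives.
    return {'error': e.toString()};
  }
}
```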
ok, added as a note in the intro section.
| **3. App Event Integration:** Your app’s events provide real-time context to the AI. Things like user actions, state changes, or backend updates can trigger responses. With auto-response enabled, the AI doesn’t wait for input; it proactively reacts and updates the experience as things happen. | ||
|  |
nit: Image filename three-pillers.avif has a typo — should be three-pillars.avif.
Hi @ayushflow, I have addressed all the review comments. You can take a look again!
| ::: | ||
| 3. In your FlutterFlow project, create a **`ProductListCard`** component, which displays product details such as the image, name, and description. This component accepts a parameter of Data Type **`Product`**. | ||
| 4. Create an Action Block named **`getProductDetails`**, which retrieves the details of a single product and returns it as a **`Product`** data type. | ||
| 4. Place the **GenUI Chat** widget on a page or component like any other FlutterFlow widget. |
Step numbering breaks here — two 4.s in a row (getProductDetails and Place the GenUI Chat widget). The subsequent steps should shift up so the list reads 1–8 without duplicates.
| - **Layout & container:** Background, border radius, padding, message spacing, and max message width | ||
| - **Header:** Visibility, title, background color, and text color | ||
| - **Avatars:** Visibility, size, and image sources for both user and AI | ||
| - **Message bubbles: Background c**olors, text colors, and border radii for user and AI messages |
Broken bold formatting: **Message bubbles: Background c**olors, text colors, and border radii.... The closing ** is in the wrong spot — this renders as "Message bubbles: Background c" bolded, then unbolded "olors...". Should be: **Message bubbles:** Background colors, text colors, and border radii for user and AI messages.
| - **Send button:** Icon and background styling | ||
| - **Welcome state:** Visibility, title, and subtitle shown when the chat is empty | ||
| - **Scrolling behavior:** Auto-scroll to new messages and animation duration | ||
| - **Thinking/status message:** Aext displayed while the AI is generating a response |
Typo: "Aext displayed while the AI is generating a response" — should be Text.
| <details> <summary> Can I choose the Gemini model or adjust parameters like temperature? </summary> <p> GenUI uses Firebase AI Logic, which manages the underlying Gemini model and its configuration. At the moment, you cannot directly select specific model variants or adjust parameters like temperature or top_p. The system is designed to provide a simplified, managed experience without requiring manual tuning. </p> </details> | ||
| <details> <summary> What happens when Firebase AI Logic quota or rate limits are exceeded? </summary> <p> If you exceed Firebase AI Logic or Gemini free-tier limits, requests will fail with a 429 quota-exceeded error. This typically means you’ve hit limits such as requests per minute or free-tier usage caps. In some cases, the error will include a retry time, after which you can try again. While the Spark plan works for testing, it is subject to strict free-tier limits, so for higher usage or production apps, you should expect to upgrade to a paid plan and monitor usage closely. </p> </details> |
These last two FAQ entries ("Can I choose the Gemini model..." and "What happens when Firebase AI Logic quota...") are collapsed onto a single line each, while every other FAQ above them uses multi-line <details> / <summary> / <p> blocks. Functionally fine, but please reformat for consistency with the rest of the file.
Hi @ayushflow, I've addressed your latest review comments as well. You can take a look again!
Description
Add GenUI Chat docs
Linear ticket and magic word: Fixes DEVR-1240
Type of change