Getting Started with GenUI: Flutter Dynamic and AI-Driven UIs

What GenUI is, why it matters, and how to get your first AI-driven Flutter interface running.


A smartphone with UI components materializing from flowing light streams against a neural network background

Flutter has always made it easy to build beautiful, fast UIs. But what if the UI itself could be generated on the fly, in response to what a user actually needs? That’s the idea behind GenUI, Flutter’s new SDK for building generative user interfaces.

This article walks you through what GenUI is, why it matters, and how to get your first GenUI-powered app running.

Note: GenUI is highly experimental — expect breaking changes. It’s great for prototyping, so treat it accordingly.

What is GenUI?

Traditional Flutter UIs are declarative and static: you write a Column with a Text and a Button, and that’s exactly what users see every time, regardless of context.

GenUI flips this model. Instead of hard-coding widget trees, you give an AI agent a catalog of your widgets and let it decide which ones to render, and how to compose them, based on the user’s intent.

The difference:

| Traditional approach | GenUI approach |
| --- | --- |
| LLM responds: “Here are three hotels in Tokyo…” | LLM renders an interactive hotel carousel widget |
| LLM responds: “Please answer the following questions…” | LLM renders a form with sliders, date pickers, and checkboxes |
| Screen layout defined at compile time | Screen layout assembled at runtime from your widget catalog |

Rather than replacing walls of text with better text, GenUI replaces them with real, interactive Flutter widgets.

The Three Core Concepts

Before writing any code, it helps to understand how GenUI thinks about your app.

Catalog: A vocabulary of widgets you expose to the AI. Each entry includes a widget name, a JSON schema describing its properties, and a builder function that renders it. The AI can only use widgets in the catalog — it cannot create new ones.

Surface: A region of your UI where AI-generated content appears. You place a GenUiSurface widget in your layout, and GenUI fills it with whatever the agent produces.

Conversation: The stateful interaction loop. GenUiConversation manages the full history of messages between your user and the agent, so each new request has context from everything that came before.

The flow looks like this:

  1. User sends a prompt
  2. Your app forwards it to the AI agent, along with the widget catalog as a set of tools
  3. The agent answers not with text, but with structured JSON describing which widgets to render
  4. GenUI deserializes that JSON and builds the widget tree inside the Surface
  5. The user interacts with those widgets; state changes flow back to the agent as context for the next turn
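
As an illustration, the structured response in step 3 might carry a payload shaped roughly like this. This is not the exact A2UI wire format — the field names here are hypothetical, using the InfoCard widget defined later in this article:

```json
{
  "surfaceId": "main",
  "components": [
    {
      "widget": "InfoCard",
      "data": {
        "title": "Ocean Fact #1",
        "description": "The ocean covers about 71% of Earth's surface."
      }
    }
  ]
}
```

Because the payload names only catalog widgets and matches their schemas, GenUI can validate and render it deterministically.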

The agent decides what appears on screen, but Flutter still does all the rendering.

Prerequisites

You will need:

  • Flutter SDK installed (preferably from the stable channel)
  • Basic Flutter knowledge
  • A free Gemini API key from Google AI Studio, or a FirebaseAIChatModel provided by the dartantic_firebase_ai package

Step 1: Add Dependencies

The fastest path to a working GenUI app uses dartantic_ai directly — no server required, just your API key.

Add the following dependencies to your pubspec.yaml file:

dependencies:
  dartantic_ai: ^3.x.x
  genui: ^0.8.0
  genui_dartantic: ^0.x.x
  json_schema_builder: ^0.1.3

Step 2: Define your Widget Catalog

The catalog is where you tell the AI what it can build. Each CatalogItem has three parts:

  1. A name: what the AI uses to refer to this widget
  2. A JSON schema: the shape of the data the AI must provide
  3. A builder: the Flutter widget that renders from that data

As an example, let’s introduce an InfoCard widget the AI can use to display a title and a description.

import 'package:flutter/material.dart';
import 'package:genui/genui.dart';

final infoCatalog = Catalog(
  components: [
    CatalogItem(
      name: 'InfoCard',
      dataSchema: S.object(
        properties: {
          'title': A2uiSchemas.stringReference(
            description: 'Header text displayed in the card',
          ),
          'description': A2uiSchemas.stringReference(
            description: 'Informative text displayed in the card',
          ),
        },
        required: ['title', 'description'],
      ),
      widgetBuilder: (ctx) {
        final json = ctx.data as Map<String, Object?>;

        // Resolve both values through the data context so the card
        // rebuilds whenever either underlying value changes.
        return BoundString(
          dataContext: ctx.dataContext,
          value: json['title'],
          builder: (context, title) {
            return BoundString(
              dataContext: ctx.dataContext,
              value: json['description'],
              builder: (context, description) => Card(
                margin: const EdgeInsets.all(8),
                child: Padding(
                  padding: const EdgeInsets.all(16),
                  child: Column(
                    crossAxisAlignment: CrossAxisAlignment.start,
                    children: [
                      Text(
                        title ?? '',
                        style: const TextStyle(
                          fontSize: 18,
                          fontWeight: FontWeight.bold,
                        ),
                      ),
                      const SizedBox(height: 8),
                      Text(description ?? ''),
                    ],
                  ),
                ),
              ),
            );
          },
        );
      },
    ),
  ],
);

A few things worth noting:

  • The JSON schema is how the AI knows what properties to supply. It acts as the contract between your agent and your UI.
  • The builder function receives the agent’s data and returns a standard Flutter widget. Any widget you can write in Flutter, you can expose through a CatalogItem.
  • The AI cannot use a widget unless it’s in the catalog. This is a feature, not a limitation — it keeps generated UIs within your design system.
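
For instance, a plain-text fallback widget can be added by following the same pattern. This is a minimal sketch: BasicText is a hypothetical name, and the builder mirrors the InfoCard example above:

```dart
CatalogItem(
  name: 'BasicText',
  dataSchema: S.object(
    properties: {
      'text': A2uiSchemas.stringReference(
        description: 'Plain text to display',
      ),
    },
    required: ['text'],
  ),
  widgetBuilder: (ctx) {
    final json = ctx.data as Map<String, Object?>;
    // Resolve the value through the data context, as with InfoCard.
    return BoundString(
      dataContext: ctx.dataContext,
      value: json['text'],
      builder: (context, text) => Padding(
        padding: const EdgeInsets.all(8),
        child: Text(text ?? ''),
      ),
    );
  },
),
```

Every catalog item follows this name/schema/builder shape, so growing the catalog is mostly a matter of repetition.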

Step 3: Set Up the Conversation

The next step is to wire up the GenUiConversation with your catalog and the model.

genui_dartantic provides DartanticContentGenerator, a ready-made ContentGenerator implementation backed by the dartantic_ai agentic framework. It supports multiple AI providers (Google, OpenAI, Anthropic, Mistral, and more) through a unified API, so you can swap backends without touching the rest of your code.

import 'package:genui/genui.dart';
import 'package:dartantic_ai/dartantic_ai.dart';
import 'package:genui_dartantic/genui_dartantic.dart';

const _apiKey = 'YOUR_GEMINI_API_KEY'; // Only for local development

late final GenUiConversation conversation;

void initConversation() {
  conversation = GenUiConversation(
    contentGenerator: DartanticContentGenerator(
      model: 'gemini-3-flash-preview',
      provider: Providers.google(apiKey: _apiKey),
      catalog: infoCatalog,
      systemInstruction: '''
        You are a helpful assistant. When responding, always use the
        InfoCard widget from your catalog to display information visually.
        Never respond with plain text alone.
      ''',
    ),
  );
}

The systemInstruction plays a key role in the setup. The AI needs explicit direction to use catalog tools rather than defaulting to plain text responses. Be specific about when and how you want it to use your widgets.

Step 4: Build the UI

Now place a GenUiSurface in your widget tree. This is the region where the agent’s generated content will appear.

import 'package:flutter/material.dart';
import 'package:genui/genui.dart';

class GenUiDemo extends StatefulWidget {
  const GenUiDemo({super.key});

  @override
  State<GenUiDemo> createState() => _GenUiDemoState();
}

class _GenUiDemoState extends State<GenUiDemo> {
  final _controller = TextEditingController();
  bool _loading = false;

  @override
  void initState() {
    super.initState();
    initConversation();
  }

  Future<void> _send() async {
    final prompt = _controller.text.trim();
    if (prompt.isEmpty) return;

    setState(() => _loading = true);
    _controller.clear();
    await conversation.sendMessage(prompt);
    if (mounted) setState(() => _loading = false);
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('GenUI Demo')),
      body: Column(
        children: [
          // The surface where AI-generated widgets appear
          Expanded(
            child: GenUiSurface(
              conversation: conversation,
              surfaceId: 'main',
            ),
          ),
          if (_loading) const LinearProgressIndicator(),
          Padding(
            padding: const EdgeInsets.all(8),
            child: Row(
              children: [
                Expanded(
                  child: TextField(
                    controller: _controller,
                    decoration: const InputDecoration(
                      hintText: 'Ask something...',
                      border: OutlineInputBorder(),
                    ),
                    onSubmitted: (_) => _send(),
                  ),
                ),
                const SizedBox(width: 8),
                IconButton.filled(
                  onPressed: _loading ? null : _send,
                  icon: const Icon(Icons.send),
                ),
              ],
            ),
          ),
        ],
      ),
    );
  }
}

Run the app, type a prompt like “Tell me three fun facts about the ocean”, and the AI will respond by composing InfoCard widgets rather than a block of text.

Step 5: Add an Interactive Widget

Static display widgets are a good start, but GenUI becomes compelling when widgets feed data back to the agent. Let’s take a look at a RatingWidget that the user can interact with:

CatalogItem(
  name: 'RatingWidget',
  dataSchema: S.object(
    properties: {
      'label': A2uiSchemas.stringReference(
        description: 'The title text',
      ),
      'value': S.number(
        description: 'Current rating value, must be between 1 and 5',
      ),
    },
    required: ['label', 'value'],
  ),
  widgetBuilder: (ctx) {
    final json = ctx.data as Map<String, Object?>;
    final label = json['label'] as String;
    final value = (json['value'] as num).toDouble();

    return BoundString(
      dataContext: ctx.dataContext,
      value: label,
      builder: (context, title) {
        return Column(
          crossAxisAlignment: CrossAxisAlignment.start,
          children: [
            Text(title ?? label),
            Slider(
              value: value,
              min: 1,
              max: 5,
              divisions: 4,
              label: value.toStringAsFixed(0),
              onChanged: (newValue) {
                // Write the new value back to the DataModel so the
                // agent sees the updated state on the next turn.
                ctx.dataContext.update(json['_id'] as String, {'value': newValue});
                // Note: the Slider's displayed position won't move by
                // itself here; wrap it in BoundNumber, GenUI's numeric
                // binding widget, to rebuild reactively on changes.
              },
            ),
          ],
        );
      },
    );
  },
),

BoundString and BoundNumber are GenUI’s reactive binding widgets. Rather than managing local state with StatefulBuilder and calling setState, you wrap the relevant part of your tree in a bound widget and it rebuilds automatically whenever the underlying value in the DataContext changes. BoundString resolves a label (which may be a literal or a data-model path reference), while BoundNumber does the same for numeric values.
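
Assuming BoundNumber mirrors BoundString’s signature (a sketch, not verified against the current API), the slider inside the same widgetBuilder could be made reactive like this:

```dart
BoundNumber(
  dataContext: ctx.dataContext,
  value: json['value'],
  builder: (context, current) => Slider(
    // Rebuilds automatically when the value in the DataContext changes,
    // so no manual rebuild logic is needed.
    value: (current ?? 1).toDouble(),
    min: 1,
    max: 5,
    divisions: 4,
    onChanged: (newValue) =>
        ctx.dataContext.update(json['_id'] as String, {'value': newValue}),
  ),
)
```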

When the user moves the slider, ctx.dataContext.update writes the new value back to GenUI’s central state store. In the next message, that updated state flows back to the agent as context — enabling true bidirectional interaction between the user and the LLM.

What to Build Next

Once you’re comfortable with the basics, a few directions worth exploring:

More widget types: The catalog can hold anything Flutter can render: carousels, date pickers, charts, maps. The richer your catalog, the more expressive the AI’s responses.

Better system instructions: How you prompt the agent determines the quality of its output. Spend time iterating on your systemInstruction — it’s as important as the catalog itself.
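
For example, an instruction that tells the agent when to prefer each widget tends to work better than a generic one. The wording below is purely illustrative:

```dart
const systemInstruction = '''
You are a travel assistant.
- Use the InfoCard widget for any factual answer.
- Use the RatingWidget when asking the user for a preference.
- Only fall back to plain text if no catalog widget fits.
''';
```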

Switching providers: One of dartantic_ai’s strengths is provider portability. Swapping from Google to OpenAI or Anthropic requires changing only the Providers.* argument and the model name; the rest of your code stays the same.
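
For instance, assuming dartantic_ai exposes an OpenAI provider constructor analogous to Providers.google (check the package documentation for exact names and supported models), the swap might look like this:

```dart
// Before: Google Gemini
// provider: Providers.google(apiKey: geminiKey),
// model: 'gemini-3-flash-preview',

// After: OpenAI — only these two arguments change;
// the catalog, surface, and conversation code stay untouched.
provider: Providers.openai(apiKey: openAiKey),
model: 'gpt-4o-mini',
```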

Firebase AI for production: Embedding an API key directly in the app is for local experimentation only. For a shipped app, switch to dartantic_firebase_ai, which uses Firebase AI Logic for secure, server-side model calls, keeping your credentials off the client.

The A2UI protocol: For server-side agent architectures, genui_a2a connects your Flutter app to any backend that implements the A2UI protocol over WebSockets. Note that this area is still actively evolving — the team is splitting the library into a pure-Dart genui_core package (responsible for A2UI message parsing, JSON Pointer-based state management, and expression evaluation) and a Flutter-only genui renderer on top of it. The goal is to make the core logic fully portable to non-Flutter environments. As of writing, the milestone is roughly 40% complete, so keep an eye on the package’s issue tracker for breaking changes before committing to the current API surface.

Logging: Enable GenUI’s built-in logging during development to see exactly what flows between your app and the agent:

import 'package:flutter/material.dart';
import 'package:logging/logging.dart';
import 'package:genui/genui.dart';

final logger = configureGenUiLogging(level: Level.ALL);

void main() {
  logger.onRecord.listen((record) {
    debugPrint('${record.loggerName}: ${record.message}');
  });
  runApp(const MyApp());
}

Wrapping Up

GenUI changes the question from “what should this screen look like?” to “what widgets should I give the AI to work with?”. Your job shifts from assembling widget trees by hand to designing and curating the system that produces them.

It’s early days — the package is in alpha and the API will change. But the underlying idea is solid, and the Flutter team is investing in it seriously. Now is a good time to experiment, build prototypes, and share feedback.

The full GenUI documentation lives at docs.flutter.dev/ai/genui, and the official examples are worth studying once you’ve got the basics down.
