Add loading scrim. #79
Conversation
Code Review
This pull request introduces a LoadingScrim widget to provide visual feedback during LLM inference, which is a great user experience enhancement. The implementation is clean, wrapping the main UI and controlled by a stream from the GenUiManager. The refactoring of the event handling system from a time-based EventDebouncer to an action-based UiEventManager is a significant architectural improvement, making the event flow more robust and predictable. The introduction of UiActionEvent and UiChangeEvent clarifies the intent of each UI interaction. The tests have been updated accordingly, and a new test for the loading scrim has been added, which is excellent. I've found one critical issue and a couple of medium-severity suggestions for improving maintainability.
I personally think we should avoid this approach of blocking the entire UI, because it prevents the user from properly consuming existing content while they're waiting for new content to load. Instead, can we make the FilterChipGroup show a loading scrim when the "Submit" button is pressed, and then stop showing it when the surface gets regenerated, e.g. in didUpdateWidget if necessary? I'm worried that if we adopt the loading scrim temporarily, it will get in the way of prioritising a better solution to loading, which we need in order to fulfill our first milestone of supporting chat-oriented UIs. Some other options we could explore:
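A minimal sketch of the suggested alternative, assuming a hypothetical `FilterChipGroup` with an `options` list and an `onSubmit` callback (the real widget's API may differ): the group tracks its own "waiting" state after Submit is pressed and clears it in `didUpdateWidget` when the parent rebuilds it with regenerated content.

```dart
import 'package:flutter/material.dart';

/// Hypothetical chip group that shows a local scrim after Submit
/// and clears it when the surface is regenerated with new options.
class FilterChipGroup extends StatefulWidget {
  const FilterChipGroup({
    super.key,
    required this.options,
    required this.onSubmit,
  });

  final List<String> options;
  final VoidCallback onSubmit;

  @override
  State<FilterChipGroup> createState() => _FilterChipGroupState();
}

class _FilterChipGroupState extends State<FilterChipGroup> {
  bool _waiting = false;

  @override
  void didUpdateWidget(FilterChipGroup oldWidget) {
    super.didUpdateWidget(oldWidget);
    // A regenerated surface passes in a fresh options list; stop
    // showing the local scrim. (identical() is a stand-in for
    // whatever change detection the real surface update provides.)
    if (!identical(oldWidget.options, widget.options)) {
      _waiting = false;
    }
  }

  @override
  Widget build(BuildContext context) {
    return Stack(
      children: [
        Column(
          mainAxisSize: MainAxisSize.min,
          children: [
            Wrap(
              children: [
                for (final option in widget.options)
                  Chip(label: Text(option)),
              ],
            ),
            ElevatedButton(
              onPressed: _waiting
                  ? null // disabled while inference is in flight
                  : () {
                      setState(() => _waiting = true);
                      widget.onSubmit();
                    },
              child: const Text('Submit'),
            ),
          ],
        ),
        if (_waiting)
          const Positioned.fill(
            child: ColoredBox(
              color: Colors.black12, // translucent, content stays visible
              child: Center(child: CircularProgressIndicator()),
            ),
          ),
      ],
    );
  }
}
```

This keeps the scrim scoped to the one widget the user just interacted with, so the rest of the page stays readable and scrollable.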
Okay, makes sense. The scrim is fairly transparent so you can still see what's there, but I get it: I was partly doing this as an experiment to see how it looked and behaved, and I'm not thrilled with it, although it does do what I set out to do.

The problem I'm trying to solve is that while the LLM is working on an answer, the user can send more events its way before it's ready. I considered letting the events accumulate until the LLM responded and then sending all the accumulated interactions at once, but I suspect that would end in chaos: the user would click on a bunch of things and make a mess. We could lock down the conversation and disable everything except the most recent turn, but we'd still have to disable that last turn during inference.

How about a wrapper for catalog widgets that puts a small scrim over each of them while the LLM is thinking? Or maybe we can segment the chat into separate surfaces for each "turn", let the LLM update an existing turn, and disable the UI only for the turn it's updating? In any case, we should probably disable the prompt input field while the LLM thinks.
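The per-widget wrapper idea could be sketched roughly like this (`ThinkingScrim` is a hypothetical name, not part of flutter_genui): each catalog widget is wrapped so that it dims and stops absorbing input while inference is in flight, leaving everything else interactive.

```dart
import 'package:flutter/material.dart';

/// Hypothetical per-widget scrim: dims and blocks input on the
/// wrapped catalog widget while the LLM is thinking, instead of
/// covering the whole surface.
class ThinkingScrim extends StatelessWidget {
  const ThinkingScrim({
    super.key,
    required this.thinking,
    required this.child,
  });

  final bool thinking;
  final Widget child;

  @override
  Widget build(BuildContext context) {
    return AbsorbPointer(
      absorbing: thinking, // swallow taps while inference is in flight
      child: AnimatedOpacity(
        duration: const Duration(milliseconds: 150),
        opacity: thinking ? 0.5 : 1.0, // "tiny scrim" effect per widget
        child: child,
      ),
    );
  }
}
```

A catalog builder could wrap each interactive widget it produces in `ThinkingScrim(thinking: isInferring, child: widget)`, driven by the same loading signal the full-screen scrim uses.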
@jacobsimionato and I spoke and decided that the best choice is probably to separate the chat into "turns", each with its own surface. When the user interacts with a turn's surface, that turn becomes disabled, but the other turns remain interactive. We also thought it would be worth looking at an internal project that faces some similar UI challenges.
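A rough sketch of the "turns" direction, with hypothetical names (`ChatTurn` is illustrative only): each turn renders its own surface and, once the user has interacted with it, ignores further input while the rest of the conversation stays live.

```dart
import 'package:flutter/material.dart';

/// Hypothetical per-turn wrapper: once the user interacts with this
/// turn, it is disabled and dimmed, while other turns in the chat
/// remain fully interactive.
class ChatTurn extends StatelessWidget {
  const ChatTurn({
    super.key,
    required this.interacted,
    required this.surface,
  });

  /// Whether the user has already interacted with this turn.
  final bool interacted;

  /// The generated UI surface for this turn.
  final Widget surface;

  @override
  Widget build(BuildContext context) {
    return IgnorePointer(
      ignoring: interacted, // disable only this turn's UI
      child: Opacity(
        opacity: interacted ? 0.6 : 1.0,
        child: surface,
      ),
    );
  }
}
```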
Adds a `LoadingScrim` widget to the `flutter_genui` package to provide a visual indicator and disable UI interaction while an LLM inference is in flight. The `travel_app` example has been updated to use this new widget, wrapping the main UI surface and the prompt input field. The `loadingStream` from the `GenUiManager` is used to control the visibility of the scrim. A new widget test has been added to verify that the scrim is displayed correctly when an inference is in progress.
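The usage described above might look roughly like the following sketch, assuming `LoadingScrim` takes the `Stream<bool>` exposed as `loadingStream` on `GenUiManager` (parameter names here are guesses, not the package's actual API):

```dart
import 'package:flutter/material.dart';

/// Sketch of a scrim that overlays a translucent barrier and a
/// progress indicator over [child] while [loading] emits true.
class LoadingScrim extends StatelessWidget {
  const LoadingScrim({
    super.key,
    required this.loading,
    required this.child,
  });

  final Stream<bool> loading;
  final Widget child;

  @override
  Widget build(BuildContext context) {
    return StreamBuilder<bool>(
      stream: loading,
      initialData: false,
      builder: (context, snapshot) {
        final isLoading = snapshot.data ?? false;
        return Stack(
          children: [
            child,
            if (isLoading) ...[
              const ModalBarrier(
                dismissible: false,
                // Translucent, so existing content remains visible.
                color: Colors.black26,
              ),
              const Center(child: CircularProgressIndicator()),
            ],
          ],
        );
      },
    );
  }
}
```

In the travel_app example this would wrap the main surface and the prompt field, e.g. `LoadingScrim(loading: genUiManager.loadingStream, child: surface)`.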
I'm going to close this and make another PR later with the solution we want.