
UI should visualize how much of the context window is used by the agent #7554

Open
c3-ali opened this issue Mar 27, 2025 · 9 comments
Labels: enhancement (New feature or request)

Comments

@c3-ali

c3-ali commented Mar 27, 2025

What problem or use case are you trying to solve?
To better understand how much of the context window is used and how much is available.

Describe the UX of the solution you'd like
A UI widget to show "% of context size used"

Additional context
LLMs follow instructions better when their context is small. Seeing how much of the context window is used gives us a better sense of how to prompt the LLM so it uses the context window efficiently.

@c3-ali c3-ali added the enhancement New feature or request label Mar 27, 2025
@rbren
Collaborator

rbren commented Mar 27, 2025

@OpenHands please implement this feature request. The modal triggered by the "Display Cost" button should show:

  • total input tokens for the conversation
  • total output tokens for the conversation
  • total input tokens in the most recent prompt
  • total output tokens in the most recent prompt
  • percent of context window used in most recent prompt
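For illustration, here is a minimal TypeScript sketch of the data such a modal could consume; the type and field names are assumptions made for the sketch, not the actual OpenHands frontend state:

```typescript
// Illustrative only: a possible shape for the data behind the metrics modal.
// Type and field names are assumptions, not actual OpenHands types.
interface TokenUsage {
  inputTokens: number;
  outputTokens: number;
}

interface MetricsModalData {
  conversationTotals: TokenUsage;   // accumulated over the whole conversation
  mostRecentPrompt: TokenUsage;     // the latest prompt/response only
  contextWindowUsedPercent: number; // 0-100, for the most recent prompt
}
```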


openhands-ai bot commented Mar 27, 2025

I'm on it! @rbren can track my progress at all-hands.dev


openhands-ai bot commented Mar 27, 2025

I've successfully implemented the feature request from issue #7554 to enhance the metrics modal with context window usage information. Here's a summary of the changes made:

Changes Implemented

  1. Enhanced Metrics State:

    • Added mostRecentUsage and modelName fields to the metrics state
    • Created a mapping of model names to their context window sizes
  2. Updated Actions Handler:

    • Modified the action handler to include token usage history and model name in the metrics data
  3. Enhanced Metrics Modal UI:

    • Added a new section for "Most Recent Prompt" showing:
      • Total input tokens for the most recent prompt
      • Total output tokens for the most recent prompt
      • Total tokens in the most recent prompt
    • Added context window usage visualization:
      • Percentage of context window used
      • Visual progress bar showing the usage
  4. Added Tests:

    • Created comprehensive tests for the new metrics display functionality
    • Verified that all metrics are displayed correctly
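
As a rough sketch of how the model-to-context-window mapping and the percentage calculation described above might look (the model names, window sizes, and function names below are placeholders, not the values used in the PR):

```typescript
// Placeholder mapping of model names to context window sizes (tokens).
// The real mapping in the PR may differ in both names and sizes.
const CONTEXT_WINDOWS: Record<string, number> = {
  "gpt-4o": 128_000,
  "claude-3-7-sonnet": 200_000,
};

const DEFAULT_CONTEXT_WINDOW = 8_000;

// Percent of the context window consumed by the most recent prompt,
// clamped to 100 so a progress bar never overflows.
function contextWindowUsage(modelName: string, promptTokens: number): number {
  const windowSize = CONTEXT_WINDOWS[modelName] ?? DEFAULT_CONTEXT_WINDOW;
  return Math.min(100, (promptTokens / windowSize) * 100);
}
```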

Results

The implementation successfully addresses all requirements from the feature request:

  • Shows total input tokens for the conversation
  • Shows total output tokens for the conversation
  • Shows total input tokens in the most recent prompt
  • Shows total output tokens in the most recent prompt
  • Shows percent of context window used in most recent prompt

I created a pull request (#7556) that includes all these changes and links back to the original issue. The PR includes appropriate tests and follows the project's coding standards.

@AutoLTX
Contributor

AutoLTX commented Mar 29, 2025

@rbren Are you currently working on this, or just letting openhands-agent do the work? It seems the PR failed. I'll have bandwidth in about a week, so you can assign it to me if you don't plan to finish it before then.

@neubig
Contributor

neubig commented Mar 31, 2025

Hey @AutoLTX , we're still working on some backend changes that should make this easier: https://github.com/All-Hands-AI/OpenHands/pulls/csmith49

But that should be in very soon, and once it is we'd love some help on the frontend!

@neubig
Contributor

neubig commented Apr 7, 2025

Hey @AutoLTX , the backend should be fixed up now, so if you want to take a look at the CondensationAction, we could find ways to display it in the frontend. Would you like to take a look?

@AutoLTX
Contributor

AutoLTX commented Apr 7, 2025

ACK. Thanks for the reminder, @neubig. Let me understand the CondensationAction first and then take a look at the implementation.

@neubig
Contributor

neubig commented Apr 7, 2025

Thanks a lot!

@c3-ali
Author

c3-ali commented Apr 8, 2025

@AutoLTX thanks for looking into this one. For the UX, if possible, I think visualizing the % used as a progress bar would be nice so we visually get a sense of how much context length is remaining.
Take a look at https://mintlify.s3.us-west-1.amazonaws.com/factory/images/tutorial/step-8.webp to see how Factory AI is doing it.
This is how claude.ai visualizes it for projects:
[screenshot attached]
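
A minimal sketch of a progress-bar component along those lines (plain React/TSX with inline styles; the component name, props, and colors are assumptions, not the component that ends up in OpenHands):

```tsx
import React from "react";

// Minimal illustrative progress bar for context window usage.
// Name, props, and styling are assumptions, not the shipped component.
export function ContextWindowBar({ usedPercent }: { usedPercent: number }) {
  const clamped = Math.max(0, Math.min(100, usedPercent));
  return (
    <div>
      <span>{clamped.toFixed(1)}% of context window used</span>
      <div style={{ width: "100%", height: 8, background: "#333", borderRadius: 4 }}>
        <div
          style={{
            width: `${clamped}%`,
            height: "100%",
            background: clamped > 80 ? "#e5534b" : "#2da44e",
            borderRadius: 4,
          }}
        />
      </div>
    </div>
  );
}
```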
