Add summarize conversation feature #7697
Conversation
@OpenHands help me merge from main and resolve conflicts for this PR
I'm on it! @xingyaoww can track my progress at all-hands.dev
I've successfully merged the main branch into the feature/add-summarize-conversation-button branch and resolved all conflicts for PR #7697. Here's a summary of what I did:
The PR should now be up-to-date with the main branch, and all conflicts have been properly resolved. The changes maintain the functionality from both branches, ensuring that the summarize conversation feature works correctly while incorporating the latest improvements from the main branch.
@OpenHands please fix CI here: #7697
I'm on it! @xingyaoww can track my progress at all-hands.dev
I successfully fixed the CI issues in PR #7697 for the "Add Summarize Conversation Button" feature. Here's a summary of what I did:
The CI issues have been successfully addressed. The PR should now pass CI checks as the linting errors have been fixed and the missing file has been added. Key files modified:
End-user friendly description of the problem this fixes or functionality that this introduces.
This PR adds a new "Summarize conversation" button to the chat interface, allowing users to quickly get a summary of their conversation with the AI assistant. The summary is generated using the existing LLM summarizing condenser and is displayed as a system message in the chat.
Give a summary of what the PR does, explaining any non-trivial design decisions.
This PR implements a summarize conversation feature in OpenHands by:
The implementation reuses the existing condenser without modifying the agent controller or runtime, keeping the change clean and minimal.
Note: This implementation does not accumulate the LLM cost in the agent controller, which can be addressed in a future iteration.
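The flow described above can be sketched roughly as follows. This is an illustrative stand-in only, assuming hypothetical names (`SummarizingCondenser`, `summarize_conversation`, `Message`); it is not OpenHands' actual condenser API:

```python
# Hypothetical sketch: run a summarizing condenser over the conversation
# history and surface the result as a system message in the chat.
from dataclasses import dataclass


@dataclass
class Message:
    role: str
    content: str


class SummarizingCondenser:
    """Stand-in for the existing LLM summarizing condenser.

    A real condenser would call the LLM; here we just build a short
    preview string so the sketch is self-contained and runnable.
    """

    def condense(self, history: list[Message]) -> str:
        joined = "; ".join(f"{m.role}: {m.content}" for m in history)
        return f"Summary of {len(history)} messages: {joined[:120]}"


def summarize_conversation(history: list[Message]) -> Message:
    """Condense the full history and return it as a system message."""
    condenser = SummarizingCondenser()
    return Message(role="system", content=condenser.condense(history))


history = [
    Message("user", "Fix the failing test"),
    Message("assistant", "Done, see the latest commit"),
]
print(summarize_conversation(history).role)
```

Note that, as in the PR itself, nothing here accumulates the condenser's LLM cost back into the agent controller; that bookkeeping would live outside this function.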
Link of any specific issues this addresses.
N/A
To run this PR locally, use the following command: