
[Bug]: How to calculate the cost of llm api token used in headless mode #7649

Open
krishgcek opened this issue Apr 2, 2025 · 4 comments
Labels
troubleshooting/help User requires help

Comments

@krishgcek

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

I am using Docker headless mode and setting the cost-related environment variables like this: -e LLM_INPUT_COST_PER_TOKEN=0.015 -e LLM_OUTPUT_COST_PER_TOKEN=0.020. If I do not set them, I see: llm.py:670 - Error getting cost from litellm: This model isn't mapped yet. model=gpt-4, custom_llm_provider=litellm_proxy. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json. If I do set these environment variables, the error goes away, but I am not shown the cost incurred during the operation. Where can I find it?
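
For reference, this is roughly how I pass them on the command line. The image name and the remaining flags below are placeholders for the headless command from the README, not the exact command; only the two cost variables and their values are taken from the description above.

    docker run -it \
        -e LLM_INPUT_COST_PER_TOKEN=0.015 \
        -e LLM_OUTPUT_COST_PER_TOKEN=0.020 \
        <other -e and -v flags from the README headless command> \
        <OpenHands image from the README> \
        <headless entrypoint from the README>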

OpenHands Installation

Docker command in README

OpenHands Version

No response

Operating System

None

Logs, Errors, Screenshots, and Additional Context

No response

@krishgcek krishgcek added the bug Something isn't working label Apr 2, 2025
@krishgcek krishgcek changed the title [Bug]: How to calculate the of llm used via proxy in headless mode [Bug]: How to calculate the cost of llm api token used in headless mode Apr 2, 2025
@krishgcek
Author

I am able to see the cost in the log once I enable the DEBUG environment variable.
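
For example (a sketch only: the exact truthy value that DEBUG expects is an assumption on my part, and the rest of the command is the same README placeholder as above):

    docker run -it \
        -e DEBUG=true \
        -e LLM_INPUT_COST_PER_TOKEN=0.015 \
        -e LLM_OUTPUT_COST_PER_TOKEN=0.020 \
        <rest of the README headless command>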

@mamoodi
Collaborator

mamoodi commented Apr 2, 2025

Does that resolve your question, @krishgcek?

@mamoodi mamoodi added troubleshooting/help User requires help and removed bug Something isn't working labels Apr 2, 2025
@krishgcek
Author

Can I write that information to a file?

@mamoodi
Collaborator

mamoodi commented Apr 2, 2025

So you want the command-line output that shows the costs to be written to a file?
That's likely possible, but I don't know whether OpenHands supports it out of the box.
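
One generic workaround that doesn't rely on anything in OpenHands itself: since those cost lines go to the container's stdout/stderr, plain shell redirection can capture them. This is a standard Docker/shell sketch, not an OpenHands feature, and the exact wording of the cost log lines may differ:

    # Keep the normal console output and also save a copy to a file
    docker run <the README headless command> 2>&1 | tee openhands-headless.log

    # Afterwards, pull out just the lines that mention cost
    grep -i "cost" openhands-headless.log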
