
[Bug]: why ollama/gemma3:27b cannot execute cmd #7613

Open
1 task done
jjman889 opened this issue Apr 1, 2025 · 6 comments
Labels
bug Something isn't working

Comments

jjman889 commented Apr 1, 2025

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

Prompt:

```
creat a new branch from /workspace/test name test_branch
```

Response:

```
Okay, let's create a new branch.

<function_call>
execute_command

{
"command": "git checkout -b test_branch"
}
```

It cannot execute the command.
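
For context, the in-prompt tool-call format OpenHands parses appears to use `<function=...>`/`<parameter=...>` tags (judging from the fragments quoted later in this thread), so a well-formed call would look roughly like the sketch below; the exact tag names are inferred rather than taken from the OpenHands source. gemma3's `<function_call>` block above does not match that shape, which would explain why nothing gets executed.

```
<function=execute_bash>
<parameter=command>
git checkout -b test_branch
</parameter>
</function>
```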

OpenHands Installation

Docker command in README

OpenHands Version

No response

Operating System

None

Logs, Errors, Screenshots, and Additional Context

No response

enyst (Collaborator) commented Apr 1, 2025

Could you perhaps tell us what the error is?

Alternatively, you may want to try a similarly sized LLM that was tuned on OpenHands histories:
https://huggingface.co/all-hands/openhands-lm-32b-v0.1

OhBobb commented Apr 2, 2025

> Could you perhaps tell us what the error is?
>
> Alternatively, you may want to try a similarly sized LLM that was tuned on OpenHands histories: https://huggingface.co/all-hands/openhands-lm-32b-v0.1

Can you help me? I want to download this GGUF (q4_k_m): https://huggingface.co/bartowski/all-hands_openhands-lm-32b-v0.1-GGUF/tree/main

How do I load this into Ollama, and what is the correct template to use in the Modelfile so that functions/tools work successfully? Ollama sometimes doesn't like a template unless it's its own from an "ollama pull <model>" command. Maybe this is the wrong repo to ask these questions?
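
For what it's worth, here is a minimal Modelfile sketch for loading a local GGUF into Ollama. The ChatML-style template is an assumption based on the `<|im_start|>` tokens visible in the outputs later in this thread, and the GGUF file name is illustrative:

```
# Minimal sketch: create an Ollama model from a local GGUF file.
# ASSUMPTION: the model uses ChatML-style <|im_start|>/<|im_end|> tokens;
# the file name below is illustrative, not the exact download name.
FROM ./all-hands_openhands-lm-32b-v0.1-Q4_K_M.gguf

TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}<|im_end|>
"""

PARAMETER stop "<|im_end|>"
```

You would then build it with `ollama create openhands-lm-32b -f Modelfile` and refer to it as `ollama/openhands-lm-32b` in OpenHands.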

mrdev023 commented Apr 3, 2025

> Could you perhaps tell us what the error is?
> Alternatively, you may want to try a similarly sized LLM that was tuned on OpenHands histories: https://huggingface.co/all-hands/openhands-lm-32b-v0.1

> Can you help me? I want to download this GGUF (q4_k_m): https://huggingface.co/bartowski/all-hands_openhands-lm-32b-v0.1-GGUF/tree/main
>
> How do I load this into Ollama, and what is the correct template to use in the Modelfile so that functions/tools work successfully?

https://ollama.com/eramax/openhands-lm-32b-v0.1
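
For anyone following along, here is a hedged sketch of wiring that model into OpenHands; the base URL assumes OpenHands runs in Docker while Ollama runs on the host (11434 is Ollama's default port):

```bash
# Pull the prebuilt model (the tag matches what worked later in this thread).
ollama pull eramax/openhands-lm-32b-v0.1:q4_K_M

# In the OpenHands UI, set the model string with litellm's "ollama/" provider prefix:
#   ollama/eramax/openhands-lm-32b-v0.1:q4_K_M
# and set the base URL to the Ollama server, e.g.:
#   http://host.docker.internal:11434   (ASSUMPTION: OpenHands in Docker, Ollama on host)
```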

mrdev023 commented Apr 3, 2025

I get the following result in chat:

```
Let's create a new Rust project using Cargo and add the necessary dependencies manually.
<function> <parameter=command</parameter> <parameter=is_input</parameter> <value>true</value> <parameter=command</parameter> <parameter=value>cargo new --bin vulkan_ecs_app && cd vulkan_ecs_app && cargo add vulkano = "0.32" bevy_ecs = "0.6"</parameter>
```

With https://ollama.com/eramax/openhands-lm-32b-v0.1

OhBobb commented Apr 3, 2025

> Can you help me? I want to download this GGUF (q4_k_m): https://huggingface.co/bartowski/all-hands_openhands-lm-32b-v0.1-GGUF/tree/main
> How do I load this into Ollama, and what is the correct template to use in the Modelfile so that functions/tools work successfully?
>
> https://ollama.com/eramax/openhands-lm-32b-v0.1

Thanks, I managed to download it, but then in the OpenHands UI it was not connecting. I had to try several options for the name of the model; it ended up working using:

ollama/eramax/openhands-lm-32b-v0.1:q4_K_M

I'm assuming it was getting confused because of the extra slash '/'.

OhBobb commented Apr 3, 2025

> I get the following result in chat:
>
> ```
> Let's create a new Rust project using Cargo and add the necessary dependencies manually.
> <function> <parameter=command</parameter> <parameter=is_input</parameter> <value>true</value> <parameter=command</parameter> <parameter=value>cargo new --bin vulkan_ecs_app && cd vulkan_ecs_app && cargo add vulkano = "0.32" bevy_ecs = "0.6"</parameter>
> ```
>
> With https://ollama.com/eramax/openhands-lm-32b-v0.1

I'm getting a similar error.

Prompt:

```
I want to create a Hello World app in Javascript that: Displays Hello World in the middle. Has a button that when clicked, changes the greeting with a bouncing animation to fun versions of Hello. Has a counter for how many times the button has been clicked. Has another button that changes the app's background color.
```

Response:

```
Task Creating I'll will I SOLUTION <function I </ <|im_start|> assistant ###() <|im_start|> <function call> <parameter=path>/workspace/h//</parameter> <parameter=command>str_replace_editor></ <parameter=old_str()>return '<table>'++'</parameter> <parameter=new_str>return '<h1>Hello World</h1><button onclick="changeGreeting()'>Click me!</button><p>You ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ###() ### ### ### ### ### ### ### ### ### ### ### ### <parameter I'm ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### <function=call <parameter=path>/workspace/appapp.py</parameter></</parameter> <assistant < ### ### ### ### ### ### ### ### ### ### ### ### ### ### <|im_start|> ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ###() <parameter=old_str>return '<table>'td</parameter> <parameter=new_str>return <'<</</parameter </parameter ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### <function=execute_bbash <parameter=path>/workspace/app.js</parameter> <parameter=new_str> html <!DOCTYPE html> <html lang="en"> <head> <meta charset=" <utf-88- <titlehttp() <title><bodyapp <title> Hello ### <script <style <### <### <### <body </<body <h1>Hello World</h1> <p <h <button onclick=" <### <### <### <### <### <### <### << ### < < ### < < <script>() < I < </parameter ###() <parameter I <function <parameter I <parameter <function <parameter= <parameter ###() <parameter I <parameter < ###() < </ < ###assistant I ###() <function>call=
```

I also saw this in the OpenHands console window; not sure what it means:

```
23:10:35 - openhands:ERROR: llm.py:670 - Error getting cost from litellm: OllamaError: Error getting model info for eramax/openhands-lm-32b-v0.1:q4_K_M. Set Ollama API Base via 'OLLAMA_API_BASE' environment variable. Error: [Errno 111] Connection refused
23:10:35 - ACTION [Agent Controller 26c786d764484cad891add7cd807715d] **MessageAction** (source=EventSource.AGENT) CONTENT: ### Task Creating I'll will I SOLUTION ##... ... (continued same as above)
```
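
As a side note, the `[Errno 111] Connection refused` in that log means litellm's cost lookup could not reach Ollama. Below is a hedged sketch of the fix the error message itself suggests; the URL is an assumption for a Docker setup:

```bash
# Point litellm's model-info lookup at the Ollama server before starting OpenHands.
# ASSUMPTION: http://host.docker.internal:11434 (OpenHands in Docker, Ollama on host);
# use http://localhost:11434 when everything runs directly on the host.
export OLLAMA_API_BASE=http://host.docker.internal:11434
```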

I then told it to 'continue', but I think it started to hallucinate or something similar, because GPU temps went sky-high for a while and it never returned any output. I left it for about 5 minutes and then killed Ollama to save my GPU's life.

Edits: sorry, I had to edit this like 10 times and this is the best I could do. Please don't hate me, I'm new to GitHub :(
