add capabilities to show response. #511
Conversation
Thanks for this! Didn't realize that we didn't have a show example. Instead of displaying just the capabilities, it would be nicer to have the example showcase the full response. Something like this maybe:

```python
response: ShowResponse = show("llama3.2:3b")
print(f"Modified at: {response.modified_at}\n")
print(f"Template: {response.template}\n")
print(f"Modelfile: {response.modelfile}\n")
print(f"License: {response.license}\n")
print(f"Details: {response.details}\n")
print(f"Model Info: {response.modelinfo}\n")
print(f"Parameters: {response.parameters}\n")
print(f"Capabilities: {response.capabilities}\n")
```
I understand. I updated the show example to display all properties of the show response.
@ParthSareen In https://github.com/taketwo/llm-ollama we'd benefit if the capabilities were exposed in the Python API. Is there anything we can do to support merging this PR?
Added capabilities to show response.
This will allow filtering models based on capabilities such as embedding or vision.
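A minimal sketch of the filtering use case described above. This is not code from the PR itself; the helper name `filter_by_capability` and the sample catalog are hypothetical, and it assumes the capability names (`completion`, `vision`, `embedding`, etc.) reported by a client version that includes this change. In practice the catalog would be built by calling `show(name).capabilities` for each installed model.

```python
def filter_by_capability(catalog, capability):
    """Return names of models whose capability list includes `capability`.

    `catalog` maps a model name to its list of capabilities, e.g. as
    collected from show(name).capabilities for each local model.
    """
    return [name for name, caps in catalog.items() if capability in caps]


# Hypothetical example data; real capability lists come from the show response.
catalog = {
    "llama3.2:3b": ["completion", "tools"],
    "llava:7b": ["completion", "vision"],
    "nomic-embed-text": ["embedding"],
}

print(filter_by_capability(catalog, "vision"))     # ['llava:7b']
print(filter_by_capability(catalog, "embedding"))  # ['nomic-embed-text']
```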