
docs: Add release notes #1141


Open

mikemckiernan wants to merge 1 commit into develop

Conversation

mikemckiernan
Member

Description

Add release notes so that recent additions to this large set of HTML pages are easier to find.

Related Issue(s)

Checklist

  • I've read the CONTRIBUTING guidelines.
  • I've updated the documentation if applicable.
  • I've added tests if applicable.
  • @mentions of the person or team responsible for reviewing proposed changes.

@mikemckiernan mikemckiernan added the documentation Improvements or additions to documentation label Apr 23, 2025
@mikemckiernan mikemckiernan added this to the v0.14.0 milestone Apr 23, 2025
@mikemckiernan mikemckiernan self-assigned this Apr 23, 2025

Documentation preview

https://nvidia.github.io/NeMo-Guardrails/review/pr-1141

@cparisien (Collaborator) left a comment


Thanks Mike. This should also link to the GitHub CHANGELOG.md that sits at the top level of the repo. And @Pouyanpi, @tgasser-nv, when we update that changelog on a release, we should link to this release-notes.md.
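(One way to wire that up, assuming release-notes.md lives under docs/ so the relative paths hold: add a line such as `[Full changelog](../CHANGELOG.md)` near the top of release-notes.md, and a matching `[Release notes](docs/release-notes.md)` entry in CHANGELOG.md when cutting a release.)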

models:
  - type: main
    engine: nim
    model_name: meta/llama-3.1-8b-instruct
Collaborator

Mike, please correct me if I'm wrong, but I think this was never supported. I'll try to reproduce it using v0.13.0 and let you know.

@Pouyanpi (Collaborator) commented Apr 24, 2025

@mikemckiernan It was not supported, but we could call it a bug.

git checkout v0.13.0

then

from nemoguardrails import RailsConfig, LLMRails

config = RailsConfig.from_path("./examples/configs/gs_content_safety/config")

# `model` is not set in the config, and the `model_name` key does not survive parsing
assert config.models[0].model is None
assert not hasattr(config.models[0], "model_name")

rails = LLMRails(config, verbose=False)

# meta/llama-3.3-70b-instruct is defined in the config with `model_name`,
# but it is not used as the main LLM
assert rails.llm.model != "meta/llama-3.3-70b-instruct"

# instead, the main LLM falls back to the default provided by ChatNVIDIA()
assert rails.llm.model == "meta/llama3-8b-instruct"

# generation works only because of that ChatNVIDIA default
rails.generate(messages=[{"role": "user", "content": "what can you do?"}])
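For comparison, a minimal sketch of the config shape that v0.13.0 does recognize for the main model, assuming `model` (rather than `model_name`) is the supported field, as the asserts above suggest:

models:
  - type: main
    engine: nim
    model: meta/llama-3.3-70b-instruct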

@mikemckiernan (Member, Author) commented Apr 25, 2025

> @mikemckiernan It was not supported, but we could call it a bug.

Thanks for the help with this, Pouyan. I can't wait for Tim to get started on refactoring some of the config work. This is not the kind of surprise that I like.

I moved the changes to the GS pages to PR #1146 to keep the changes a little more manageable.
