
feat: make training config fields optional #1861

Open · wants to merge 1 commit into main
Conversation

@cdoern (Contributor) commented on Apr 2, 2025

What does this PR do?

Today, both supervised_fine_tune itself and the TrainingConfig class require a number of fields that a provider implementation may not need.

For example, if a provider handles hyperparameters in its own configuration, along with dataset retrieval, the optimizer, and LoRA settings, a user must still pass in virtually empty DataConfig, OptimizerConfig, and AlgorithmConfig objects in some cases.
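For illustration, here is a rough sketch of the boilerplate this forces today. The field names below are hypothetical stand-ins, not the actual llama-stack models:

```python
from pydantic import BaseModel

# Hypothetical pre-PR shape: every sub-config field is required.
class DataConfig(BaseModel):
    dataset_id: str
    batch_size: int
    shuffle: bool

class OptimizerConfig(BaseModel):
    optimizer_type: str
    lr: float

class TrainingConfig(BaseModel):
    n_epochs: int
    data_config: DataConfig            # required by pydantic
    optimizer_config: OptimizerConfig  # required by pydantic

# Even if the provider fetches its own dataset and manages its own
# optimizer, the caller must fabricate placeholder values:
cfg = TrainingConfig(
    n_epochs=1,
    data_config=DataConfig(dataset_id="unused", batch_size=1, shuffle=False),
    optimizer_config=OptimizerConfig(optimizer_type="adamw", lr=0.0),
)
```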

Many of these fields are specific to Llama models and to knobs for customizing the inline post-training provider.

Adding remote post_training providers will require loosening these arguments; otherwise users are forced to pass in empty objects just to satisfy the pydantic models.
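A minimal sketch of the loosening this PR aims for, again with hypothetical field names: the sub-configs gain None defaults so a remote provider can be driven entirely by its own provider config.

```python
from typing import Optional
from pydantic import BaseModel

# Illustrative stand-ins for the real sub-configs.
class DataConfig(BaseModel):
    dataset_id: str

class OptimizerConfig(BaseModel):
    lr: float = 1e-5

class TrainingConfig(BaseModel):
    n_epochs: int = 1
    # Previously required; now optional, so a remote provider that handles
    # dataset retrieval and the optimizer itself can be called without them.
    data_config: Optional[DataConfig] = None
    optimizer_config: Optional[OptimizerConfig] = None

# The caller no longer needs placeholder sub-configs:
cfg = TrainingConfig()
assert cfg.data_config is None and cfg.optimizer_config is None
```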
