Commit d749620

Update default model reference from GPT-4 to o3-mini and improve model configuration docs
Parent: e692735

1 file changed (+14, -12 lines)

docs/docs/usage-guide/changing_a_model.md

````diff
@@ -1,15 +1,18 @@
 ## Changing a model in PR-Agent
 
 See [here](https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/algo/__init__.py) for a list of available models.
-To use a different model than the default (GPT-4), you need to edit in the [configuration file](https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/settings/configuration.toml#L2) the fields:
+To use a different model than the default (o3-mini), you need to edit in the [configuration file](https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/settings/configuration.toml#L2) the fields:
 ```
 [config]
 model = "..."
 fallback_models = ["..."]
 ```
 
 For models and environments not from OpenAI, you might need to provide additional keys and other parameters.
-You can give parameters via a configuration file (see below for instructions), or from environment variables. See [litellm documentation](https://litellm.vercel.app/docs/proxy/quick_start#supported-llms) for the environment variables relevant per model.
+You can give parameters via a configuration file, or from environment variables.
+
+!!! note "Model-specific environment variables"
+    See [litellm documentation](https://litellm.vercel.app/docs/proxy/quick_start#supported-llms) for the environment variables needed per model, as they may vary and change over time. Our documentation per-model may not always be up-to-date with the latest changes.
 
 ### Azure
 
````
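For readers applying this change by hand, a filled-in version of the `[config]` fields might look like the sketch below; the model names are illustrative placeholders chosen here, not values mandated by this commit:

```toml
[config]  # in pr_agent/settings/configuration.toml
# Primary model used by PR-Agent (illustrative value).
model = "o3-mini"
# Models tried in order if calls to the primary model fail (illustrative value).
fallback_models = ["gpt-4o"]
```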

@@ -158,25 +161,24 @@ And also set the api key in the .secrets.toml file:
158161
KEY = "..."
159162
```
160163

164+
See [litellm](https://docs.litellm.ai/docs/providers/anthropic#usage) documentation for more information about the environment variables required for Anthropic.
165+
161166
### Amazon Bedrock
162167

163168
To use Amazon Bedrock and its foundational models, add the below configuration:
164169

165170
```
166171
[config] # in configuration.toml
167-
model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
168-
fallback_models=["bedrock/anthropic.claude-v2:1"]
169-
```
172+
model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0"
173+
fallback_models=["bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0"]
170174
171-
Note that you have to add access to foundational models before using them. Please refer to [this document](https://docs.aws.amazon.com/bedrock/latest/userguide/setting-up.html) for more details.
172-
173-
If you are using the claude-3 model, please configure the following settings as there are parameters incompatible with claude-3.
174-
```
175-
[litellm]
176-
drop_params = true
175+
[aws]
176+
AWS_ACCESS_KEY_ID="..."
177+
AWS_SECRET_ACCESS_KEY="..."
178+
AWS_REGION_NAME="..."
177179
```
178180

179-
AWS session is automatically authenticated from your environment, but you can also explicitly set `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` and `AWS_REGION_NAME` environment variables. Please refer to [this document](https://litellm.vercel.app/docs/providers/bedrock) for more details.
181+
See [litellm](https://docs.litellm.ai/docs/providers/bedrock#usage) documentation for more information about the environment variables required for Amazon Bedrock.
180182

181183
### DeepSeek
182184
