## Changing a model in PR-Agent

See [here](https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/algo/__init__.py) for a list of available models.
To use a different model than the default (o3-mini), you need to edit the following fields in the [configuration file](https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/settings/configuration.toml#L2):
```
[config]
model = "..."
fallback_models = ["..."]
```

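For example, a filled-in version for an OpenAI deployment might look like the following (the model names are illustrative; pick any entry from the list linked above):

```
[config]
model = "gpt-4o"
fallback_models = ["gpt-4o-mini"]
```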
For models and environments not from OpenAI, you might need to provide additional keys and other parameters.
You can provide parameters via a configuration file, or from environment variables.

!!! note "Model-specific environment variables"
    See the [litellm documentation](https://litellm.vercel.app/docs/proxy/quick_start#supported-llms) for the environment variables required per model, as they may vary and change over time. Our per-model documentation may not always be up to date with the latest changes.

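As an illustration, provider keys can be exported as environment variables before launching PR-Agent; the variable names below follow the litellm convention for OpenAI and Anthropic, and are examples only:

```shell
# Example only: export the key for whichever provider your chosen model uses
export OPENAI_API_KEY="..."
export ANTHROPIC_API_KEY="..."
```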
### Azure

And also set the api key in the .secrets.toml file:
```
KEY = "..."
```

See the [litellm](https://docs.litellm.ai/docs/providers/anthropic#usage) documentation for more information about the environment variables required for Anthropic.

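For reference, pointing PR-Agent at Anthropic directly would combine the pieces above roughly as follows (the model identifier is an example; consult the litellm page linked above for the names it currently accepts):

```
[config]  # in configuration.toml
model = "anthropic/claude-3-5-sonnet-20240620"
fallback_models = ["anthropic/claude-3-5-sonnet-20240620"]
```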
### Amazon Bedrock

To use Amazon Bedrock and its foundational models, add the below configuration:

```
[config] # in configuration.toml
model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0"
fallback_models=["bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0"]

[aws]
AWS_ACCESS_KEY_ID="..."
AWS_SECRET_ACCESS_KEY="..."
AWS_REGION_NAME="..."
```

See the [litellm](https://docs.litellm.ai/docs/providers/bedrock#usage) documentation for more information about the environment variables required for Amazon Bedrock.

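Alternatively, the same AWS credentials can be supplied as environment variables instead of the `[aws]` section (same variable names as above):

```shell
# Example only: AWS credentials for Bedrock, provided via the environment
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION_NAME="..."
```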
### DeepSeek
