.Net: New Feature: Add Truncation Parameter to OpenAIPromptExecutionSettings for Context Management #11507

Open
RuizhangZhou opened this issue Apr 11, 2025 · 0 comments
Labels: .NET (Issue or Pull requests regarding .NET code), triage

@RuizhangZhou

name: Feature request - Add Truncation Parameter to OpenAIPromptExecutionSettings for Context Management

about: I would like to request the addition of a truncation parameter in the OpenAIPromptExecutionSettings class within the Semantic Kernel project. The idea is to let developers control how the input conversation history is handled when it exceeds the model's context window. This would mirror the OpenAI API, where the truncation parameter can be set to "auto" (automatically truncate the conversation by dropping input items from the middle) or "disabled" (disable truncation and fail the request if the context is too large).
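
A rough sketch of how the requested setting might look from the caller's side; the `Truncation` property shown here is hypothetical (it does not exist on OpenAIPromptExecutionSettings today), its values are assumed to mirror the OpenAI API's "auto"/"disabled" options, and `chatService`/`history` are assumed to already exist:

```csharp
// Hypothetical sketch only: OpenAIPromptExecutionSettings has no Truncation property today.
// The proposed property is assumed to take the same values as the OpenAI API ("auto"/"disabled").
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var settings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 1024,
    Truncation = "auto", // proposed: drop input items from the middle instead of failing on overflow
};

// `chatService` is assumed to be an IChatCompletionService resolved from the kernel,
// and `history` an existing ChatHistory that may exceed the model's context window.
var reply = await chatService.GetChatMessageContentAsync(history, settings);
```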

This would be particularly useful in scenarios with long conversation histories, as it would provide a built-in mechanism to avoid hitting the context token limit without manually pre-processing the conversation history. I believe a parameter that controls the truncation strategy would improve usability and flexibility for developers integrating AI functionality via Semantic Kernel.
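
For comparison, here is a minimal sketch of the manual pre-processing that is needed today, under a simple "keep the system message plus the most recent N messages" policy; a real implementation would typically budget by tokens rather than by message count:

```csharp
// Minimal manual workaround today: trim the ChatHistory before sending it.
using System.Linq;
using Microsoft.SemanticKernel.ChatCompletion;

static ChatHistory TrimHistory(ChatHistory history, int maxMessages)
{
    var trimmed = new ChatHistory();

    // Preserve a leading system message so instructions are not dropped.
    var system = history.FirstOrDefault(m => m.Role == AuthorRole.System);
    if (system is not null)
    {
        trimmed.Add(system);
    }

    // Keep only the most recent messages; everything older is discarded.
    foreach (var message in history.Where(m => m.Role != AuthorRole.System).TakeLast(maxMessages))
    {
        trimmed.Add(message);
    }

    return trimmed;
}
```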

For reference, please see the OpenAI API documentation on the truncation parameter:
OpenAI API Truncation Parameter
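
For context, a rough, self-contained illustration (not Semantic Kernel code) of how the truncation field appears on a raw OpenAI Responses API request; the model name is only an example:

```csharp
// Raw OpenAI Responses API request carrying the "truncation" field:
// "auto" drops items from the middle of the context on overflow, "disabled" fails the request.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
    "Bearer", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

var body = JsonSerializer.Serialize(new
{
    model = "gpt-4o-mini",                        // example model
    input = "Summarize our conversation so far.",
    truncation = "auto"                           // the parameter this issue asks to surface
});

var response = await http.PostAsync(
    "https://api.openai.com/v1/responses",
    new StringContent(body, Encoding.UTF8, "application/json"));
Console.WriteLine(await response.Content.ReadAsStringAsync());
```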

@markwallace-microsoft added the .NET and triage labels on Apr 11, 2025
@github-actions bot changed the title from "New Feature: Add Truncation Parameter to OpenAIPromptExecutionSettings for Context Management" to ".Net: New Feature: Add Truncation Parameter to OpenAIPromptExecutionSettings for Context Management" on Apr 11, 2025