Add dependency injection to ChatCompletionStream for improved testability #1011
Conversation
Hi @sashabaranov 👋, first-time contributor here. Fixing up those lint issues now.
**Describe the change**

This PR refactors the `ChatCompletionStream` to use dependency injection by introducing a `ChatStreamReader` interface. This allows injecting custom stream readers, primarily for testing purposes, making the streaming functionality more testable and maintainable.

**Provide OpenAI documentation link**

https://platform.openai.com/docs/api-reference/chat/create

**Describe your solution**

The changes include:

- Added a `ChatStreamReader` interface that defines the contract for reading chat completion streams
- Refactored `ChatCompletionStream` to use composition with a `ChatStreamReader` instead of embedding `streamReader`
- Added a `NewChatCompletionStream()` constructor function to enable dependency injection
- Implemented explicit delegation methods (`Recv()`, `Close()`, `Header()`, `GetRateLimitHeaders()`) on `ChatCompletionStream`
- Added an interface compliance check via `var _ ChatStreamReader = (*streamReader[ChatCompletionStreamResponse])(nil)`

This approach maintains backward compatibility while enabling easier mocking and testing of streaming functionality.

**Tests**

Added comprehensive tests demonstrating the new functionality:

- `TestChatCompletionStream_MockInjection`: tests basic mock injection with the new constructor
- `mock_streaming_demo_test.go`: a complete demonstration file showing how to create mock clients and stream readers for testing, including:
  - `MockOpenAIStreamClient`: full mock client implementation
  - `mockStreamReader`: custom stream reader for controlled test responses
  - `TestMockOpenAIStreamClient_Demo`: demonstrates assembling multiple stream chunks
  - `TestMockOpenAIStreamClient_ErrorHandling`: shows error handling patterns

**Additional context**

This refactoring improves the testability of code that depends on go-openai streaming without introducing breaking changes. The existing public API remains unchanged, but now supports dependency injection for testing scenarios. The new demo test file serves as documentation for users who want to mock streaming responses in their own tests.

Lint fix
(force-pushed from `370f947` to `857bb0d`)
**Codecov Report**

```diff
@@            Coverage Diff             @@
##           master    #1011      +/-   ##
==========================================
+ Coverage   85.77%   85.83%   +0.06%
==========================================
  Files          43       43
  Lines        2291     2315      +24
==========================================
+ Hits         1965     1987      +22
- Misses        304      306       +2
  Partials       22       22
```
```go
// ChatStreamReader is an interface for reading chat completion streams.
type ChatStreamReader interface {
	Recv() (ChatCompletionStreamResponse, error)
	Close() error
}
```
```go
// ChatCompletionStream
// Note: Perhaps it is more elegant to abstract Stream using generics.
type ChatCompletionStream struct {
	reader ChatStreamReader
}

// NewChatCompletionStream allows injecting a custom ChatStreamReader (for testing).
func NewChatCompletionStream(reader ChatStreamReader) *ChatCompletionStream {
	return &ChatCompletionStream{reader: reader}
}
```
@sashabaranov Does this approach fulfill your intent in the Note comment?
**Pull Request Overview**

This PR refactors `ChatCompletionStream` to use a `ChatStreamReader` interface, enabling dependency injection for better testability without breaking existing APIs.

- Introduces `ChatStreamReader` and `NewChatCompletionStream` for injecting custom stream readers.
- Refactors `ChatCompletionStream` to delegate streaming operations to the interface.
- Adds demo and unit tests showing how to mock streaming behavior.

**Reviewed Changes**

Copilot reviewed 4 out of 4 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| `stream_reader.go` | Added `var _ ChatStreamReader = (*streamReader[ChatCompletionStreamResponse])(nil)` to assert interface compliance. |
| `chat_stream.go` | Defined `ChatStreamReader`, `NewChatCompletionStream`, and delegation methods (`Recv`, `Close`, `Header`, `GetRateLimitHeaders`). |
| `chat_stream_test.go` | Added `TestChatCompletionStream_MockInjection` using a `mockStream` to verify custom injection. |
| `mock_streaming_demo_test.go` | Demo tests (`MockOpenAIStreamClient`, `mockStreamReader`) illustrating mock client usage. |
Comments suppressed due to low confidence (1)

`chat_stream.go:133`: The newly added methods `Header()` and `GetRateLimitHeaders()` are not currently covered by unit tests; consider adding tests to validate their delegated behavior.

```go
func (s *ChatCompletionStream) Header() http.Header {
```
```go
// MockOpenAIStreamClient demonstrates how to create a full mock client for go-openai.
type MockOpenAIStreamClient struct {
	// Configure canned responses
	ChatCompletionResponse openai.ChatCompletionResponse
}
```
The field `ChatCompletionResponse` in `MockOpenAIStreamClient` is declared but never used; consider removing it to eliminate dead code.
```diff
@@ -16,6 +16,8 @@ var (
 	errorPrefix = regexp.MustCompile(`^data:\s*{"error":`)
 )
+
+var _ ChatStreamReader = (*streamReader[ChatCompletionStreamResponse])(nil)
```
[nitpick] Consider moving the interface compliance assertion closer to the `ChatStreamReader` interface definition (e.g., in `chat_stream.go`) to improve code readability.
Re-rolled PR from the Khan org: #1016