Add batch-based streaming support for OpenAI Agents#1335
Draft
tconley1428 wants to merge 2 commits into main from
Conversation
Implements streaming API using a list-based approach where:

- Stream events are collected during activity execution
- Complete list is returned when activity finishes
- Workflows can iterate over events deterministically
- No real-time signaling, to maintain workflow determinism

Changes:

- Add batch_stream_model activity for collecting streaming events
- Implement stream_response method in _TemporalModelStub
- Update TemporalOpenAIRunner to support run_streamed
- Add streaming documentation and update feature support
- Refactor common code into helper functions

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
- Add EventBuilders class for creating test streaming events
- Update TestModel to support streaming with streaming_fn parameter
- Add streaming factory methods: streaming_events and streaming_events_with_ending
- Add test_batch_streaming to verify streaming works end-to-end
- Update testing module exports to include EventBuilders

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Summary
This PR implements streaming support for OpenAI Agents using a batch-based approach that maintains workflow determinism. Based on a subset of changes from PR #1246, this implementation focuses solely on collecting streaming events in a list and returning them when the activity completes.
- Add `batch_stream_model` activity for collecting streaming events into a list
- Implement `stream_response` method in `_TemporalModelStub` using the batch activity
- Update `TemporalOpenAIRunner` to support the `run_streamed` method

Key Design Decisions
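The core of the batch activity can be sketched as follows. This is a minimal illustration of the pattern, not the PR's actual code: `fake_stream` is a hypothetical stand-in for the OpenAI Agents streaming call, and the Temporal activity decorator and serialization details are omitted.

```python
import asyncio
from typing import Any, AsyncIterator

# Hypothetical stand-in for the model's streaming API; the real
# batch_stream_model activity would consume OpenAI Agents stream events.
async def fake_stream() -> AsyncIterator[dict[str, Any]]:
    for token in ["Hello", ", ", "world"]:
        yield {"type": "token", "delta": token}

# The batch approach: drain the entire stream inside the activity and
# return one complete list, so the workflow never observes partial state.
async def batch_stream_model() -> list[dict[str, Any]]:
    events: list[dict[str, Any]] = []
    async for event in fake_stream():
        events.append(event)
    return events

events = asyncio.run(batch_stream_model())
print(len(events))  # 3
```

Because the activity returns a single value, the full event list is recorded once in workflow history, which is what keeps replay deterministic.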
Batch-only approach: Unlike PR #1246 which included real-time signaling, this implementation only supports batch streaming where all events are collected during activity execution and returned as a complete list. This ensures deterministic workflow execution.
No signal mechanisms: Excludes the signal-based streaming and callback options from the original PR to keep the implementation focused and simple.
Deterministic iteration: Stream events are only delivered to the workflow after the entire LLM response is complete, allowing workflows to iterate over them deterministically.
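The workflow-side consequence of this design can be sketched as below. `consume_events` is a hypothetical helper, not part of the PR: it shows that once the activity result is a plain list, the workflow iterates with ordinary deterministic Python, and replay sees the same events in the same order.

```python
from typing import Any

# Hypothetical workflow-side view: the activity result is a plain list
# recorded in history, so iteration is ordinary and replay-safe.
def consume_events(events: list[dict[str, Any]]) -> str:
    text = ""
    for event in events:  # deterministic: no timing or signaling involved
        if event["type"] == "token":
            text += event["delta"]
    return text

result = consume_events([
    {"type": "token", "delta": "Hi"},
    {"type": "done"},
])
print(result)  # Hi
```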
Test plan
- `Runner.run_streamed`

Testing
Added comprehensive test coverage:
- `EventBuilders` class for creating test streaming events
- `TestModel` to support streaming with `streaming_fn` parameter
- `test_batch_streaming` to verify end-to-end streaming functionality

🤖 Generated with Claude Code
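The `streaming_fn` idea can be illustrated with a simplified sketch. `FakeStreamingModel` and `make_events` are hypothetical names standing in for the PR's `TestModel` and `EventBuilders`; the point is that the test injects a function producing the event stream, and the model materializes it into a list with batch semantics.

```python
from typing import Any, Callable, Iterable

# Hypothetical mirror of the TestModel/streaming_fn design: the test
# supplies a function that yields the stream of events.
class FakeStreamingModel:
    def __init__(self, streaming_fn: Callable[[], Iterable[dict[str, Any]]]):
        self.streaming_fn = streaming_fn

    def stream(self) -> list[dict[str, Any]]:
        # Batch semantics: materialize the whole stream before returning.
        return list(self.streaming_fn())

# Hypothetical event factory, analogous in spirit to EventBuilders.
def make_events() -> Iterable[dict[str, Any]]:
    yield {"type": "token", "delta": "a"}
    yield {"type": "token", "delta": "b"}

model = FakeStreamingModel(make_events)
deltas = [e["delta"] for e in model.stream()]
print(deltas)  # ['a', 'b']
```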