
Add batch-based streaming support for OpenAI Agents #1335

Draft

tconley1428 wants to merge 2 commits into main from streaming-list-only

Conversation

@tconley1428
Contributor

Summary

This PR implements streaming support for OpenAI Agents using a batch-based approach that maintains workflow determinism. Based on a subset of changes from PR #1246, this implementation focuses solely on collecting streaming events in a list and returning them when the activity completes.

  • Adds batch_stream_model activity for collecting streaming events into a list (see the sketch after this list)
  • Implements stream_response method in _TemporalModelStub using the batch activity
  • Updates TemporalOpenAIRunner to support run_streamed method
  • Refactors common code into helper functions to reduce duplication
  • Updates documentation and feature support matrix
  • Adds comprehensive tests for streaming functionality
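
The sketch below illustrates the batch-collection idea at the core of these changes. It is not the PR's actual code: the input type, event types, and the `resolve_model` helper are placeholders standing in for whatever the real `batch_stream_model` activity uses.

```python
from typing import Any, AsyncIterator, Protocol

from temporalio import activity


class StreamingModel(Protocol):
    """Anything that can stream response events for a request."""

    def stream_response(self, request: Any) -> AsyncIterator[Any]: ...


def resolve_model(request: Any) -> StreamingModel:
    # Placeholder: the real activity would build the configured OpenAI Agents
    # model from the serialized request.
    raise NotImplementedError


@activity.defn
async def batch_stream_model(request: Any) -> list[Any]:
    """Drain the model's event stream and return it as a single list."""
    events: list[Any] = []
    async for event in resolve_model(request).stream_response(request):
        events.append(event)
    # The workflow only ever sees this completed, recorded activity result,
    # which is what keeps replay deterministic.
    return events
```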

Key Design Decisions

Batch-only approach: Unlike PR #1246, which included real-time signaling, this implementation only supports batch streaming, where all events are collected during activity execution and returned as a complete list. This ensures deterministic workflow execution.

No signal mechanisms: Excludes the signal-based streaming and callback options from the original PR to keep the implementation focused and simple.

Deterministic iteration: Stream events are only delivered to the workflow after the entire LLM response is complete, allowing workflows to iterate over them deterministically.
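
From the workflow's point of view, usage could look roughly like the sketch below. This is a hedged illustration, not code from the PR: the agent definition and event handling are made up, and it assumes the model stub only delivers events once the batch activity has completed.

```python
from agents import Agent, Runner
from temporalio import workflow


@workflow.defn
class StreamingAgentWorkflow:
    @workflow.run
    async def run(self, prompt: str) -> list[str]:
        agent = Agent(name="assistant", instructions="Answer briefly.")
        result = Runner.run_streamed(agent, input=prompt)
        seen: list[str] = []
        # By the time iteration starts, every event has already been recorded
        # in workflow history as the activity's result, so this loop replays
        # deterministically.
        async for event in result.stream_events():
            seen.append(type(event).__name__)
        return seen
```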

Test plan

  • Test basic streaming functionality with Runner.run_streamed
  • Verify stream events are collected and returned correctly
  • Confirm workflow determinism is maintained
  • Test with various model configurations and tools

Testing

Added comprehensive test coverage:

  • EventBuilders class for creating test streaming events
  • Updated TestModel to support streaming with streaming_fn parameter
  • Added test_batch_streaming to verify end-to-end streaming functionality (a rough sketch follows this list)
  • All tests pass and verify that streaming events are collected and delivered correctly
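
The following is only a rough, self-contained sketch of the kind of check described above; it does not use the PR's actual EventBuilders or TestModel interfaces, just a stand-in streaming function and a batch collector.

```python
import asyncio
from typing import Any, AsyncIterator


async def fake_stream(prompt: str) -> AsyncIterator[str]:
    # Stand-in for a test model configured with a streaming function.
    for chunk in ("Hel", "lo ", "world"):
        yield chunk


async def collect_batch(stream: AsyncIterator[Any]) -> list[Any]:
    # Mirrors what the batch activity does: drain the stream into a list.
    return [event async for event in stream]


def test_batch_streaming_sketch() -> None:
    events = asyncio.run(collect_batch(fake_stream("hi")))
    assert events == ["Hel", "lo ", "world"]
```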

🤖 Generated with Claude Code

tconley1428 and others added 2 commits February 19, 2026 09:20
Implements streaming API using a list-based approach where:
- Stream events are collected during activity execution
- Complete list is returned when activity finishes
- Workflows can iterate over events deterministically
- No real-time signaling to maintain workflow determinism

Changes:
- Add batch_stream_model activity for collecting streaming events
- Implement stream_response method in _TemporalModelStub
- Update TemporalOpenAIRunner to support run_streamed
- Add streaming documentation and update feature support
- Refactor common code into helper functions

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

- Add EventBuilders class for creating test streaming events
- Update TestModel to support streaming with streaming_fn parameter
- Add streaming factory methods: streaming_events and streaming_events_with_ending
- Add test_batch_streaming to verify streaming works end-to-end
- Update testing module exports to include EventBuilders

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>