How do you use Sentry?
Sentry SaaS (sentry.io)
Version
sentry-sdk 2.56.0
Steps to Reproduce
import sentry_sdk
import litellm

sentry_sdk.init(dsn="...")  # OpenAIIntegration auto-enables

# This fails:
response = litellm.completion(
    model="gpt-4.1-nano",
    messages=[{"role": "user", "content": "hello"}],
    stream=True,
)
for chunk in response:
    print(chunk)

Versions:
- sentry-sdk==2.56.0
- openai==2.30.0
- litellm==1.82.6
Expected Result
Streaming works normally. litellm iterates chunks from the OpenAI response.
Actual Result
litellm.InternalServerError: OpenAIException - 'LegacyAPIResponse' object has no attribute '_iterator'
LiteLLM Retried: 3 times
The OpenAIIntegration monkey-patches the openai client and wraps the streaming response in a LegacyAPIResponse object. That wrapper lacks the _iterator attribute that litellm expects to find on openai Stream objects. The OpenAI API itself returns 200 OK every time; the error only occurs when litellm tries to iterate the wrapped response.
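To make the failure mode concrete, here is a minimal, self-contained sketch of the duck-typing mismatch described above. The classes are hypothetical stand-ins, not the real sentry-sdk, openai, or litellm code: a wrapper can forward iteration correctly yet still not expose the wrapped object's private attributes.

```python
class Stream:
    """Stand-in for openai's Stream: iterable, with a private _iterator."""
    def __init__(self, chunks):
        self._iterator = iter(chunks)

    def __iter__(self):
        return self._iterator


class WrappedResponse:
    """Stand-in for the wrapping response object: it forwards iteration
    to the underlying stream but does not copy its _iterator attribute."""
    def __init__(self, stream):
        self._stream = stream

    def __iter__(self):
        return iter(self._stream)


wrapped = WrappedResponse(Stream(["chunk-1", "chunk-2"]))

# Plain iteration still works through the wrapper:
print(list(wrapped))

# But any caller that reaches for the private attribute fails,
# which is analogous to litellm's AttributeError on _iterator:
print(hasattr(wrapped, "_iterator"))
```

This is why the HTTP layer looks healthy (200 OK) while iteration breaks: the wrapper is a valid iterable, but code that depends on the wrapped class's internals no longer finds them.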
Workaround:
import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    dsn="...",
    disabled_integrations=[OpenAIIntegration],
)

Disabling only OpenAIIntegration fixes the issue. All other auto-enabled integrations (HttpxIntegration, LiteLLMIntegration, etc.) work fine.