fix: speed up batch queue processing by disabling cooloff and fixing retry race #3079
Conversation
🦋 Changeset detected (latest commit: ba3a29e). The changes in this PR will be included in the next version bump. This PR includes changesets to release 28 packages.
Walkthrough
Updates in the redis-worker and related packages:
- Removed the per-queue cool-off increment when availableCapacity === 0.
- Changed the retry path to pass the updated message JSON into the release/visibility Lua path instead of issuing a separate HSET.
- Extended VisibilityManager.release and the releaseMessage Redis/Lua call to accept an optional updatedData argument and use it to replace the in-flight payload.
- Added a test ensuring concurrency blocks do not trigger cooloff.
- Disabled cooloff in a FairQueue config.
- Bumped BATCH_CONCURRENCY_LIMIT_DEFAULT from 1 to 5 in env.server.ts.
No exported/public API removals.
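The retry-race fix is easier to see in code. The sketch below is a minimal illustration under stated assumptions: QueueMessage, the release signature, and retryMessage are hypothetical stand-ins for illustration, not the actual redis-worker types or Lua script.

```ts
// Sketch only: hypothetical types approximating the queue message shape.
type QueueMessage = {
  id: string;
  queue: string;
  attempt: number;
  payload: unknown;
};

interface VisibilityManager {
  // updatedData is optional; when present, the Lua script that moves the
  // message out of the in-flight set also overwrites the stored payload,
  // so the re-queue and the attempt bump happen in one atomic round trip.
  release(messageId: string, updatedData?: string): Promise<void>;
}

// Before (racy): release the message, then bump the attempt count with a
// separate HSET. A concurrent dequeue between the two calls could read the
// stale attempt count.
//
// After (atomic): serialize the bumped message and hand it to release(),
// which applies it inside the same Lua script.
async function retryMessage(visibility: VisibilityManager, message: QueueMessage) {
  const updated: QueueMessage = { ...message, attempt: message.attempt + 1 };
  await visibility.release(message.id, JSON.stringify(updated));
}
```

The key design point is that the attempt bump travels inside the same atomic call that re-queues the message, so no concurrent dequeue can observe the old attempt count in between.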
Fix slow fair queue processing by removing spurious cooloff on concurrency blocks and fixing a race condition where retry attempt counts were not atomically updated during message re-queue.
Removed cooloff entirely from the batch queue
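For context, here is a sketch of what a cooloff-disabled queue configuration might look like. The option names (cooloff, enabled) and config types are illustrative assumptions, not the actual FairQueue config shape:

```ts
// Sketch only: illustrative option names, not the real FairQueue config type.
type FairQueueCooloffConfig = {
  enabled: boolean;
  // Other cooloff tuning knobs (thresholds, back-off period) omitted.
};

type FairQueueConfig = {
  name: string;
  cooloff?: FairQueueCooloffConfig;
};

// Batch queue configuration with cooloff disabled: a queue that is blocked
// only by its concurrency limit is retried on the next scheduling pass
// instead of being placed into a cool-off period.
const batchQueueConfig: FairQueueConfig = {
  name: "batch",
  cooloff: { enabled: false },
};
```

With the spurious cooloff on concurrency blocks removed, a blocked batch queue becomes eligible again as soon as capacity frees up rather than waiting out a back-off window.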