Fix Flux2 DreamBooth prior preservation prompt repeats #13415

Open
azolotenkov wants to merge 1 commit into huggingface:main from azolotenkov:fix-flux2-prior-preservation-repeat

Conversation

@azolotenkov
Contributor

What does this PR do?

Fixes a prior-preservation batch size mismatch in the Flux2 DreamBooth LoRA scripts.

When custom instance prompts are not used and --with_prior_preservation is enabled, prompt_embeds already contains the concatenated instance and class embeddings. However, collate_fn also doubles the prompt list, so repeating the text embeddings by len(prompts) over-expands them by 2x and they no longer match the latent batch size.

This applies the same fix pattern as #13396 to:

  • train_dreambooth_lora_flux2.py
  • train_dreambooth_lora_flux2_klein.py
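
A minimal sketch of the mismatch and the fix (hypothetical helper names; the real scripts operate on torch tensors inside the training loop, and only the resulting counts are illustrated here):

```python
def collate_fn(instance_prompts, class_prompts, with_prior_preservation):
    # Under --with_prior_preservation the collate_fn appends the class
    # prompts after the instance prompts, doubling the prompt list.
    if with_prior_preservation:
        return instance_prompts + class_prompts
    return list(instance_prompts)


def repeat_prompt_embeds(prompt_embeds, prompts, with_prior_preservation):
    # With prior preservation on, prompt_embeds already holds the
    # concatenated instance + class embeddings, so the repeat count
    # must be halved to match the latent batch size.
    if with_prior_preservation:
        num_repeat_elements = len(prompts) // 2
    else:
        num_repeat_elements = len(prompts)
    # List repetition stands in for the tensor repeat in the scripts;
    # element ordering is not modeled here.
    return prompt_embeds * num_repeat_elements


instance = ["a photo of sks dog"] * 2     # train_batch_size = 2
cls = ["a photo of a dog"] * 2            # class prompts for prior preservation
prompts = collate_fn(instance, cls, True) # len(prompts) == 4

prompt_embeds = ["instance_embed", "class_embed"]  # already concatenated
batch_embeds = repeat_prompt_embeds(prompt_embeds, prompts, True)

# The latent batch also has 4 samples (2 instance + 2 class), so the
# counts now line up; repeating by len(prompts) would have produced 8.
print(len(batch_embeds))  # 4
```

Before the fix, the buggy path repeated by the full (doubled) len(prompts), yielding 8 text embeddings against 4 latents.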

Before submitting

Who can review?

@sayakpaul

Copilot AI review requested due to automatic review settings April 4, 2026 18:08

Copilot AI left a comment


Pull request overview

Fixes a prior-preservation batch size mismatch in the Flux2 DreamBooth LoRA training example scripts by adjusting how many times static text embeddings are repeated when --with_prior_preservation is enabled.

Changes:

  • Adjust num_repeat_elements to use len(prompts) // 2 under --with_prior_preservation (since collate_fn doubles prompts).
  • Apply the same fix to both Flux2 DreamBooth LoRA variants (standard + klein).

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

Files reviewed:

  • examples/dreambooth/train_dreambooth_lora_flux2.py: Fixes the prompt-embed repeat count for prior preservation when using static (non-custom) prompts.
  • examples/dreambooth/train_dreambooth_lora_flux2_klein.py: Same repeat-count fix in the klein variant.


@azolotenkov azolotenkov force-pushed the fix-flux2-prior-preservation-repeat branch from 27c01ea to b6cb7b1 Compare April 4, 2026 18:23
