path: root/modules/textual_inversion/textual_inversion.py
Commit message (Author, Date, Files changed, Lines -/+)
* Merge branch 'master' into gradient-clipping (AUTOMATIC1111, 2023-01-04, 1 file, -162/+251)
|\
| * use shared function from processing for creating dummy mask when training inp... (AUTOMATIC, 2023-01-04, 1 file, -24/+9)
| * fix the merge (AUTOMATIC, 2023-01-04, 1 file, -9/+5)
| * Merge branch 'master' into inpaint_textual_inversion (AUTOMATIC1111, 2023-01-04, 1 file, -160/+244)
| |\
| | * Merge pull request #6253 from Shondoit/ti-optim (AUTOMATIC1111, 2023-01-04, 1 file, -8/+32)
| | |\
| | | * Save Optimizer next to TI embedding (Shondoit, 2023-01-03, 1 file, -8/+32)
| | * | add job info to modules (Vladimir Mandic, 2023-01-03, 1 file, -0/+1)
| | |/
| | * feat(api): return more data for embeddings (Philpax, 2023-01-02, 1 file, -4/+4)
| | * fix the issue with training on SD2.0 (AUTOMATIC, 2023-01-01, 1 file, -2/+1)
| | * changed embedding accepted shape detection to use existing code and support t... (AUTOMATIC, 2022-12-31, 1 file, -24/+6)
| | * validate textual inversion embeddings (Vladimir Mandic, 2022-12-31, 1 file, -5/+38)
| | * fix F541 f-string without any placeholders (Yuval Aboulafia, 2022-12-24, 1 file, -1/+1)
| | * Fix various typos (Jim Hays, 2022-12-15, 1 file, -8/+8)
| | * Merge branch 'master' into racecond_fix (AUTOMATIC1111, 2022-12-03, 1 file, -148/+186)
| | |\
| | | * Use devices.autocast instead of torch.autocast (brkirch, 2022-11-30, 1 file, -1/+1)
| | | * Merge remote-tracking branch 'flamelaw/master' (AUTOMATIC, 2022-11-27, 1 file, -141/+182)
| | | |\
| | | | * set TI AdamW default weight decay to 0 (flamelaw, 2022-11-26, 1 file, -1/+1)
| | | | * small fixes (flamelaw, 2022-11-22, 1 file, -1/+1)
| | | | * fix pin_memory with different latent sampling method (flamelaw, 2022-11-21, 1 file, -6/+1)
| | | | * Gradient accumulation, autocast fix, new latent sampling method, etc (flamelaw, 2022-11-20, 1 file, -137/+183)
| | | * | Add support Stable Diffusion 2.0 (AUTOMATIC, 2022-11-26, 1 file, -4/+3)
| | | |/
| | | * change StableDiffusionProcessing to internally use sampler name instead of sa... (AUTOMATIC, 2022-11-19, 1 file, -2/+2)
| | * | Fixes race condition in training when VAE is unloaded (Fampai, 2022-11-04, 1 file, -0/+5)
| | |/
| * / fixed textual inversion training with inpainting models (Nerogar, 2022-11-01, 1 file, -1/+26)
| |/
* | Simplify grad clip (Muhammad Rizqi Nur, 2022-11-05, 1 file, -9/+7)
* | Merge branch 'master' into gradient-clipping (Muhammad Rizqi Nur, 2022-11-02, 1 file, -0/+10)
|\|
| * Fixed minor bug (Fampai, 2022-10-31, 1 file, -0/+1)
| * Merge branch 'master' of https://github.com/AUTOMATIC1111/stable-diffusion-we... (Fampai, 2022-10-31, 1 file, -25/+62)
| |\
| * | Added TI training optimizations (Fampai, 2022-10-31, 1 file, -0/+9)
* | | Merge master (Muhammad Rizqi Nur, 2022-10-30, 1 file, -30/+66)
|\ \ \
| | |/
| |/|
| * | Fix dataset still being loaded even when training will be skipped (Muhammad Rizqi Nur, 2022-10-29, 1 file, -1/+1)
| * | Add missing info on hypernetwork/embedding model log (Muhammad Rizqi Nur, 2022-10-29, 1 file, -13/+26)
| * | Revert "Add cleanup after training" (Muhammad Rizqi Nur, 2022-10-29, 1 file, -95/+90)
| * | Add cleanup after training (Muhammad Rizqi Nur, 2022-10-29, 1 file, -90/+95)
| * | Add input validations before loading dataset for training (Muhammad Rizqi Nur, 2022-10-29, 1 file, -12/+36)
| |/
* | Merge branch 'master' into gradient-clipping (Muhammad Rizqi Nur, 2022-10-29, 1 file, -12/+12)
|\|
| * Fix log off by 1 (Muhammad Rizqi Nur, 2022-10-28, 1 file, -12/+12)
* | Learning rate sched syntax support for grad clipping (Muhammad Rizqi Nur, 2022-10-28, 1 file, -3/+9)
* | Gradient clipping for textual embedding (Muhammad Rizqi Nur, 2022-10-28, 1 file, -1/+10)
|/
* typo: cmd_opts.embedding_dir to cmd_opts.embeddings_dir (DepFA, 2022-10-26, 1 file, -1/+1)
* Implement PR #3625 but for embeddings. (timntorres, 2022-10-26, 1 file, -1/+1)
* Implement PR #3309 but for embeddings. (timntorres, 2022-10-26, 1 file, -1/+8)
* Implement PR #3189 but for embeddings. (timntorres, 2022-10-26, 1 file, -5/+5)
* enable creating embedding with --medvram (AUTOMATIC, 2022-10-26, 1 file, -0/+3)
* Merge branch 'ae' (AUTOMATIC, 2022-10-21, 1 file, -0/+1)
|\
| * Merge branch 'master' into test_resolve_conflicts (MalumaDev, 2022-10-18, 1 file, -0/+1)
| |\
| * \ Merge branch 'master' into test_resolve_conflicts (MalumaDev, 2022-10-16, 1 file, -0/+1)
| |\ \
| * \ \ Merge branch 'master' into test_resolve_conflicts (MalumaDev, 2022-10-15, 1 file, -4/+13)
| |\ \ \
| * \ \ \ Merge branch 'master' into test_resolve_conflicts (MalumaDev, 2022-10-15, 1 file, -25/+54)
| |\ \ \ \
| * | | | | init (MalumaDev, 2022-10-14, 1 file, -10/+25)