Commit message | Author | Date | Files | Lines (-/+)
---|---|---|---|---
... | | | |
Include model in log file. Exclude directory. | timntorres | 2023-01-05 | 1 | -13/+9
Clean up ti, add same behavior to hypernetwork. | timntorres | 2023-01-05 | 1 | -5/+9
Add option to save ti settings to file. | timntorres | 2023-01-05 | 1 | -3/+27
allow loading embeddings from subdirectories | Faber | 2023-01-05 | 1 | -11/+12
typo in TI | Kuma | 2023-01-05 | 1 | -1/+1
Merge branch 'master' into gradient-clipping | AUTOMATIC1111 | 2023-01-04 | 1 | -162/+251
use shared function from processing for creating dummy mask when training inpainting model | AUTOMATIC | 2023-01-04 | 1 | -24/+9
fix the merge | AUTOMATIC | 2023-01-04 | 1 | -9/+5
Merge branch 'master' into inpaint_textual_inversion | AUTOMATIC1111 | 2023-01-04 | 1 | -160/+244
Merge pull request #6253 from Shondoit/ti-optim: Save Optimizer next to TI embedding | AUTOMATIC1111 | 2023-01-04 | 1 | -8/+32
Save Optimizer next to TI embedding. Also add check to load only .PT and .BIN files as embeddings (since we add .optim files in the same directory). | Shondoit | 2023-01-03 | 1 | -8/+32
add job info to modules | Vladimir Mandic | 2023-01-03 | 1 | -0/+1
feat(api): return more data for embeddings | Philpax | 2023-01-02 | 1 | -4/+4
fix the issue with training on SD2.0 | AUTOMATIC | 2023-01-01 | 1 | -2/+1
changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149 | AUTOMATIC | 2022-12-31 | 1 | -24/+6
validate textual inversion embeddings | Vladimir Mandic | 2022-12-31 | 1 | -5/+38
fix F541 f-string without any placeholders | Yuval Aboulafia | 2022-12-24 | 1 | -1/+1
Fix various typos | Jim Hays | 2022-12-15 | 1 | -8/+8
Merge branch 'master' into racecond_fix | AUTOMATIC1111 | 2022-12-03 | 1 | -148/+186
Use devices.autocast instead of torch.autocast | brkirch | 2022-11-30 | 1 | -1/+1
Merge remote-tracking branch 'flamelaw/master' | AUTOMATIC | 2022-11-27 | 1 | -141/+182
set TI AdamW default weight decay to 0 | flamelaw | 2022-11-26 | 1 | -1/+1
small fixes | flamelaw | 2022-11-22 | 1 | -1/+1
fix pin_memory with different latent sampling method | flamelaw | 2022-11-21 | 1 | -6/+1
Gradient accumulation, autocast fix, new latent sampling method, etc | flamelaw | 2022-11-20 | 1 | -137/+183
Add support Stable Diffusion 2.0 | AUTOMATIC | 2022-11-26 | 1 | -4/+3
change StableDiffusionProcessing to internally use sampler name instead of sampler index | AUTOMATIC | 2022-11-19 | 1 | -2/+2
Fixes race condition in training when VAE is unloaded: set_current_image can attempt to use the VAE when it is unloaded to the CPU while training | Fampai | 2022-11-04 | 1 | -0/+5
fixed textual inversion training with inpainting models | Nerogar | 2022-11-01 | 1 | -1/+26
Simplify grad clip | Muhammad Rizqi Nur | 2022-11-05 | 1 | -9/+7
Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | 2022-11-02 | 1 | -0/+10
Fixed minor bug: when unloading vae during TI training, generating images after training will error out | Fampai | 2022-10-31 | 1 | -0/+1
Merge branch 'master' of https://github.com/AUTOMATIC1111/stable-diffusion-webui into TI_optimizations | Fampai | 2022-10-31 | 1 | -25/+62
Added TI training optimizations: option to use xattention optimizations when training; option to unload vae when training | Fampai | 2022-10-31 | 1 | -0/+9
Merge master | Muhammad Rizqi Nur | 2022-10-30 | 1 | -30/+66
Fix dataset still being loaded even when training will be skipped | Muhammad Rizqi Nur | 2022-10-29 | 1 | -1/+1
Add missing info on hypernetwork/embedding model log. Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513. Also group the saving into one. | Muhammad Rizqi Nur | 2022-10-29 | 1 | -13/+26
Revert "Add cleanup after training". This reverts commit 3ce2bfdf95bd5f26d0f6e250e67338ada91980d1. | Muhammad Rizqi Nur | 2022-10-29 | 1 | -95/+90
Add cleanup after training | Muhammad Rizqi Nur | 2022-10-29 | 1 | -90/+95
Add input validations before loading dataset for training | Muhammad Rizqi Nur | 2022-10-29 | 1 | -12/+36
Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | 2022-10-29 | 1 | -12/+12
Fix log off by 1 | Muhammad Rizqi Nur | 2022-10-28 | 1 | -12/+12
Learning rate sched syntax support for grad clipping | Muhammad Rizqi Nur | 2022-10-28 | 1 | -3/+9
Gradient clipping for textual embedding | Muhammad Rizqi Nur | 2022-10-28 | 1 | -1/+10
typo: cmd_opts.embedding_dir to cmd_opts.embeddings_dir | DepFA | 2022-10-26 | 1 | -1/+1
Implement PR #3625 but for embeddings. | timntorres | 2022-10-26 | 1 | -1/+1
Implement PR #3309 but for embeddings. | timntorres | 2022-10-26 | 1 | -1/+8
Implement PR #3189 but for embeddings. | timntorres | 2022-10-26 | 1 | -5/+5
enable creating embedding with --medvram | AUTOMATIC | 2022-10-26 | 1 | -0/+3
Merge branch 'ae' | AUTOMATIC | 2022-10-21 | 1 | -0/+1