Age | Commit message | Author | Lines | |
---|---|---|---|---|
2023-01-08 | Move batchsize check | dan | -2/+2 | |
2023-01-08 | Add checkbox for variable training dims | dan | -4/+4 | |
2023-01-08 | Allow variable img size | dan | -9/+13 | |
2023-01-07 | CLIP hijack rework | AUTOMATIC | -1/+0 | |
2023-01-06 | rework saving training params to file #6372 | AUTOMATIC | -20/+27 | |
2023-01-06 | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-revised: Save hypernet and textual inversion settings to text file, revised. | AUTOMATIC1111 | -1/+25 | |
2023-01-06 | allow loading embeddings from subdirectories | Faber | -11/+12 | |
2023-01-05 | typo in TI | Kuma | -1/+1 | |
2023-01-05 | Include model in log file. Exclude directory. | timntorres | -13/+9 | |
2023-01-05 | Clean up ti, add same behavior to hypernetwork. | timntorres | -5/+9 | |
2023-01-05 | Add option to save ti settings to file. | timntorres | -3/+27 | |
2023-01-04 | Merge branch 'master' into gradient-clipping | AUTOMATIC1111 | -219/+354 | |
2023-01-04 | use shared function from processing for creating dummy mask when training inpainting model | AUTOMATIC | -24/+9 | |
2023-01-04 | fix the merge | AUTOMATIC | -9/+5 | |
2023-01-04 | Merge branch 'master' into inpaint_textual_inversion | AUTOMATIC1111 | -285/+439 | |
2023-01-04 | Merge pull request #6253 from Shondoit/ti-optim: Save Optimizer next to TI embedding | AUTOMATIC1111 | -8/+32 | |
2023-01-03 | add job info to modules | Vladimir Mandic | -0/+2 | |
2023-01-03 | Save Optimizer next to TI embedding. Also add a check to load only .PT and .BIN files as embeddings, since .optim files are now saved in the same directory. | Shondoit | -8/+32 | |
2023-01-02 | feat(api): return more data for embeddings | Philpax | -4/+4 | |
2023-01-02 | fix the issue with training on SD2.0 | AUTOMATIC | -2/+1 | |
2022-12-31 | changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149 | AUTOMATIC | -24/+6 | |
2022-12-31 | validate textual inversion embeddings | Vladimir Mandic | -5/+38 | |
2022-12-24 | fix F541 f-string without any placeholders | Yuval Aboulafia | -1/+1 | |
2022-12-14 | Fix various typos | Jim Hays | -13/+13 | |
2022-12-03 | Merge branch 'master' into racecond_fix | AUTOMATIC1111 | -274/+381 | |
2022-12-03 | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes: Use devices.autocast() and fix MPS randn issues | AUTOMATIC1111 | -3/+3 | |
2022-12-02 | Fix divide-by-zero error. Fixes the edge case where a weight of 0 occasionally appears in specific situations, which was crashing the script. | PhytoEpidemic | -3/+3 | |
2022-11-30 | Use devices.autocast instead of torch.autocast | brkirch | -3/+3 | |
2022-11-27 | Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords: resolve [name] after resolving [filewords] in training | AUTOMATIC1111 | -1/+1 | |
2022-11-27 | Merge remote-tracking branch 'flamelaw/master' | AUTOMATIC | -189/+274 | |
2022-11-27 | set TI AdamW default weight decay to 0 | flamelaw | -1/+1 | |
2022-11-26 | Add support for Stable Diffusion 2.0 | AUTOMATIC | -4/+3 | |
2022-11-23 | small fixes | flamelaw | -1/+1 | |
2022-11-21 | fix pin_memory with different latent sampling method | flamelaw | -10/+20 | |
2022-11-20 | moved deepdanbooru to pure pytorch implementation | AUTOMATIC | -8/+4 | |
2022-11-20 | fix random sampling with pin_memory | flamelaw | -1/+1 | |
2022-11-20 | remove unnecessary comment | flamelaw | -9/+0 | |
2022-11-20 | Gradient accumulation, autocast fix, new latent sampling method, etc | flamelaw | -185/+269 | |
2022-11-19 | Merge pull request #4812 from space-nuko/feature/interrupt-preprocessing: Add interrupt button to preprocessing | AUTOMATIC1111 | -1/+1 | |
2022-11-19 | change StableDiffusionProcessing to internally use sampler name instead of sampler index | AUTOMATIC | -2/+2 | |
2022-11-17 | Add interrupt button to preprocessing | space-nuko | -1/+1 | |
2022-11-13 | resolve [name] after resolving [filewords] in training | parasi | -1/+1 | |
2022-11-11 | Merge pull request #4117 from TinkTheBoush/master: Adding optional tag shuffling for training | AUTOMATIC1111 | -1/+6 | |
2022-11-11 | Update dataset.py | KyuSeok Jung | -1/+1 | |
2022-11-11 | Update dataset.py | KyuSeok Jung | -1/+1 | |
2022-11-11 | adding tag drop out option | KyuSeok Jung | -4/+4 | |
2022-11-09 | Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | -69/+93 | |
2022-11-08 | move functions out of main body for image preprocessing for easier hijacking | AUTOMATIC | -69/+93 | |
2022-11-05 | Simplify grad clip | Muhammad Rizqi Nur | -9/+7 | |
2022-11-04 | change option position to Training setting | TinkTheBoush | -5/+4 | |