path: root/modules/textual_inversion
Age        | Commit message | Author | Lines
2023-01-07 | CLIP hijack rework | AUTOMATIC | -1/+0
2023-01-06 | rework saving training params to file #6372 | AUTOMATIC | -20/+27
2023-01-06 | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-rev... | AUTOMATIC1111 | -1/+25
2023-01-06 | allow loading embeddings from subdirectories | Faber | -11/+12
2023-01-05 | typo in TI | Kuma | -1/+1
2023-01-05 | Include model in log file. Exclude directory. | timntorres | -13/+9
2023-01-05 | Clean up ti, add same behavior to hypernetwork. | timntorres | -5/+9
2023-01-05 | Add option to save ti settings to file. | timntorres | -3/+27
2023-01-04 | Merge branch 'master' into gradient-clipping | AUTOMATIC1111 | -219/+354
2023-01-04 | use shared function from processing for creating dummy mask when training inp... | AUTOMATIC | -24/+9
2023-01-04 | fix the merge | AUTOMATIC | -9/+5
2023-01-04 | Merge branch 'master' into inpaint_textual_inversion | AUTOMATIC1111 | -285/+439
2023-01-04 | Merge pull request #6253 from Shondoit/ti-optim | AUTOMATIC1111 | -8/+32
2023-01-03 | add job info to modules | Vladimir Mandic | -0/+2
2023-01-03 | Save Optimizer next to TI embedding | Shondoit | -8/+32
2023-01-02 | feat(api): return more data for embeddings | Philpax | -4/+4
2023-01-02 | fix the issue with training on SD2.0 | AUTOMATIC | -2/+1
2022-12-31 | changed embedding accepted shape detection to use existing code and support t... | AUTOMATIC | -24/+6
2022-12-31 | validate textual inversion embeddings | Vladimir Mandic | -5/+38
2022-12-24 | fix F541 f-string without any placeholders | Yuval Aboulafia | -1/+1
2022-12-14 | Fix various typos | Jim Hays | -13/+13
2022-12-03 | Merge branch 'master' into racecond_fix | AUTOMATIC1111 | -274/+381
2022-12-03 | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes | AUTOMATIC1111 | -3/+3
2022-12-02 | Fix divide by 0 error | PhytoEpidemic | -3/+3
2022-11-30 | Use devices.autocast instead of torch.autocast | brkirch | -3/+3
2022-11-27 | Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords | AUTOMATIC1111 | -1/+1
2022-11-27 | Merge remote-tracking branch 'flamelaw/master' | AUTOMATIC | -189/+274
2022-11-27 | set TI AdamW default weight decay to 0 | flamelaw | -1/+1
2022-11-26 | Add support Stable Diffusion 2.0 | AUTOMATIC | -4/+3
2022-11-23 | small fixes | flamelaw | -1/+1
2022-11-21 | fix pin_memory with different latent sampling method | flamelaw | -10/+20
2022-11-20 | moved deepdanbooru to pure pytorch implementation | AUTOMATIC | -8/+4
2022-11-20 | fix random sampling with pin_memory | flamelaw | -1/+1
2022-11-20 | remove unnecessary comment | flamelaw | -9/+0
2022-11-20 | Gradient accumulation, autocast fix, new latent sampling method, etc | flamelaw | -185/+269
2022-11-19 | Merge pull request #4812 from space-nuko/feature/interrupt-preprocessing | AUTOMATIC1111 | -1/+1
2022-11-19 | change StableDiffusionProcessing to internally use sampler name instead of sa... | AUTOMATIC | -2/+2
2022-11-17 | Add interrupt button to preprocessing | space-nuko | -1/+1
2022-11-13 | resolve [name] after resolving [filewords] in training | parasi | -1/+1
2022-11-11 | Merge pull request #4117 from TinkTheBoush/master | AUTOMATIC1111 | -1/+6
2022-11-11 | Update dataset.py | KyuSeok Jung | -1/+1
2022-11-11 | Update dataset.py | KyuSeok Jung | -1/+1
2022-11-11 | adding tag drop out option | KyuSeok Jung | -4/+4
2022-11-09 | Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | -69/+93
2022-11-08 | move functions out of main body for image preprocessing for easier hijacking | AUTOMATIC | -69/+93
2022-11-05 | Simplify grad clip | Muhammad Rizqi Nur | -9/+7
2022-11-04 | change option position to Training setting | TinkTheBoush | -5/+4
2022-11-04 | Fixes race condition in training when VAE is unloaded | Fampai | -0/+5
2022-11-02 | Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | -2/+15
2022-11-02 | Merge branch 'master' into master | KyuSeok Jung | -2/+15