path: root/modules/textual_inversion
Commit message  Author  Age  Files  Lines
...
* Merge branch 'master' into tensorboard  AUTOMATIC1111  2023-01-13  8  -338/+1128
|\
| * Merge pull request #6689 from Poktay/add_gradient_settings_to_logging_file  AUTOMATIC1111  2023-01-13  1  -1/+1
| |\
| | |   add gradient settings to training settings log files
| | * add gradient settings to training settings log files  Josh R  2023-01-13  1  -1/+1
| | |
| * | print bucket sizes for training without resizing images #6620  AUTOMATIC  2023-01-13  3  -3/+19
| | |
| | |   fix an error when generating a picture with embedding in it
| * | Merge pull request #6620 from guaneec/varsize_batch  AUTOMATIC1111  2023-01-13  1  -4/+32
| |\ \
| | |/
| |/|   Enable batch_size>1 for mixed-sized training
| | * Enable batch_size>1 for mixed-sized training  dan  2023-01-10  1  -4/+32
| | |
| * | Allow creation of zero vectors for TI  Shondoit  2023-01-12  1  -3/+6
| | |
| * | set descriptions  Vladimir Mandic  2023-01-11  2  -2/+9
| | |
| * | Support loading textual inversion embeddings from safetensors files  Lee Bousfield  2023-01-11  1  -0/+3
| |/
| * make a dropdown for prompt template selection  AUTOMATIC  2023-01-09  1  -8/+27
| |
| * remove/simplify some changes from #6481  AUTOMATIC  2023-01-09  2  -11/+7
| |
| * Merge branch 'master' into varsize  AUTOMATIC1111  2023-01-09  1  -62/+103
| |\
| | * make it possible for extensions/scripts to add their own embedding directories  AUTOMATIC  2023-01-08  1  -66/+104
| | |
| | * skip images in embeddings dir if they have a second .preview extension  AUTOMATIC  2023-01-08  1  -0/+4
| | |
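The "second .preview extension" skip in the entry above can be sketched with the standard library; the helper name is hypothetical and the exact check in the webui may differ:

```python
from pathlib import Path

def is_preview_image(filename: str) -> bool:
    # Files like "my-style.preview.png" sit next to an embedding as its
    # thumbnail; they carry a second ".preview" extension and should be
    # skipped when scanning the embeddings directory for embeddings.
    suffixes = Path(filename).suffixes
    return len(suffixes) >= 2 and suffixes[-2] == ".preview"
```

For example, `is_preview_image("style.preview.png")` is true while `is_preview_image("style.png")` is not.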
| * | Move batchsize check  dan  2023-01-07  1  -2/+2
| | |
| * | Add checkbox for variable training dims  dan  2023-01-07  2  -4/+4
| | |
| * | Allow variable img size  dan  2023-01-07  2  -9/+13
| |/
| * CLIP hijack rework  AUTOMATIC  2023-01-06  1  -1/+0
| |
| * rework saving training params to file #6372  AUTOMATIC  2023-01-06  2  -20/+27
| |
| * Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-revised  AUTOMATIC1111  2023-01-06  1  -1/+25
| |\
| | |   Save hypernet and textual inversion settings to text file, revised.
| | * Include model in log file. Exclude directory.  timntorres  2023-01-05  1  -13/+9
| | |
| | * Clean up ti, add same behavior to hypernetwork.  timntorres  2023-01-05  1  -5/+9
| | |
| | * Add option to save ti settings to file.  timntorres  2023-01-05  1  -3/+27
| | |
| * | allow loading embeddings from subdirectories  Faber  2023-01-05  1  -11/+12
| | |
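Loading embeddings from subdirectories, as in the entry above, amounts to a recursive directory walk with an extension filter. A stdlib sketch; the function name and the accepted extensions are assumptions for illustration, not the webui's actual loader:

```python
import os

# Assumed candidate extensions; the real list lives in the webui's loader.
EMBEDDING_EXTENSIONS = (".pt", ".bin", ".safetensors")

def list_embedding_files(root: str) -> list[str]:
    # Walk root and every subdirectory, collecting candidate embedding files.
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in EMBEDDING_EXTENSIONS:
                found.append(os.path.join(dirpath, name))
    return sorted(found)
```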
| * | typo in TI  Kuma  2023-01-05  1  -1/+1
| |/
| * Merge branch 'master' into gradient-clipping  AUTOMATIC1111  2023-01-04  5  -219/+354
| |\
| | * use shared function from processing for creating dummy mask when training inpainting model  AUTOMATIC  2023-01-04  1  -24/+9
| | |
| | * fix the merge  AUTOMATIC  2023-01-04  1  -9/+5
| | |
| | * Merge branch 'master' into inpaint_textual_inversion  AUTOMATIC1111  2023-01-04  5  -285/+439
| | |\
| | | * Merge pull request #6253 from Shondoit/ti-optim  AUTOMATIC1111  2023-01-04  1  -8/+32
| | | |\
| | | | |   Save Optimizer next to TI embedding
| | | | * Save Optimizer next to TI embedding  Shondoit  2023-01-03  1  -8/+32
| | | | |   Also add check to load only .PT and .BIN files as embeddings (since we add .optim files in the same directory).
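The parenthetical above explains why the loader must filter by extension: optimizer state is saved as a `.optim` file next to the embedding, so a loader that picked up every file in the directory would try to parse it as an embedding. A minimal sketch of that whitelist check; the names are illustrative, not the webui's code:

```python
import os

# .optim files (saved optimizer state) live alongside the embeddings, so
# only known embedding extensions are accepted for loading.
ALLOWED_EXTENSIONS = {".pt", ".bin"}

def is_loadable_embedding(filename: str) -> bool:
    return os.path.splitext(filename)[1].lower() in ALLOWED_EXTENSIONS
```

The lowercase comparison keeps the check case-insensitive, matching the ".PT and .BIN" wording of the commit.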
| | | * | add job info to modules  Vladimir Mandic  2023-01-03  2  -0/+2
| | | |/
| | | * feat(api): return more data for embeddings  Philpax  2023-01-02  1  -4/+4
| | | |
| | | * fix the issue with training on SD2.0  AUTOMATIC  2023-01-01  1  -2/+1
| | | |
| | | * changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149  AUTOMATIC  2022-12-31  1  -24/+6
| | | |
| | | * validate textual inversion embeddings  Vladimir Mandic  2022-12-31  1  -5/+38
| | | |
| | | * fix F541 f-string without any placeholders  Yuval Aboulafia  2022-12-24  1  -1/+1
| | | |
| | | * Fix various typos  Jim Hays  2022-12-15  2  -13/+13
| | | |
| | | * Merge branch 'master' into racecond_fix  AUTOMATIC1111  2022-12-03  5  -274/+381
| | | |\
| | | | * Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes  AUTOMATIC1111  2022-12-03  2  -3/+3
| | | | |\
| | | | | |   Use devices.autocast() and fix MPS randn issues
| | | | | * Use devices.autocast instead of torch.autocast  brkirch  2022-11-30  2  -3/+3
| | | | | |
| | | | * | Fix divide by 0 error  PhytoEpidemic  2022-12-02  1  -3/+3
| | | | |/
| | | | |   Fixes the edge case where a 0 weight occasionally pops up in some specific situations and was crashing the script.
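The zero-weight crash described above is the classic guard-the-denominator case. A hedged sketch of the pattern; the weighting scheme and function name are illustrative only, not the code touched by the commit:

```python
def weighted_mean(values, weights):
    # Guard the denominator: with all-zero weights, fall back to a plain
    # mean instead of dividing by zero (the crash described in the commit).
    total = sum(weights)
    if total == 0:
        return sum(values) / len(values) if values else 0.0
    return sum(v * w for v, w in zip(values, weights)) / total
```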
| | | | * Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords  AUTOMATIC1111  2022-11-27  1  -1/+1
| | | | |\
| | | | | |   resolve [name] after resolving [filewords] in training
| | | | | * resolve [name] after resolving [filewords] in training  parasi  2022-11-13  1  -1/+1
| | | | | |
| | | | * | Merge remote-tracking branch 'flamelaw/master'  AUTOMATIC  2022-11-27  2  -189/+274
| | | | |\ \
| | | | | * | set TI AdamW default weight decay to 0  flamelaw  2022-11-26  1  -1/+1
| | | | | | |
| | | | | * | small fixes  flamelaw  2022-11-22  1  -1/+1
| | | | | | |
| | | | | * | fix pin_memory with different latent sampling method  flamelaw  2022-11-21  2  -10/+20
| | | | | | |
| | | | | * | fix random sampling with pin_memory  flamelaw  2022-11-20  1  -1/+1
| | | | | | |
| | | | | * | remove unnecessary comment  flamelaw  2022-11-20  1  -9/+0
| | | | | | |