path: root/modules/textual_inversion/dataset.py
Commit message (Author, Date, Files changed, Lines -/+)
* Simplify a bunch of `len(x) > 0`/`len(x) == 0` style expressions (Aarni Koskela, 2023-06-02, 1 file, -1/+1)
|
* Autofix Ruff W (not W605) (mostly whitespace) (Aarni Koskela, 2023-05-11, 1 file, -2/+2)
|
* Fix up string formatting/concatenation to f-strings where feasible (Aarni Koskela, 2023-05-09, 1 file, -1/+1)
|
* fix for #6700 (AUTOMATIC, 2023-02-19, 1 file, -1/+1)
|
* Add ability to choose using weighted loss or not (Shondoit, 2023-02-15, 1 file, -5/+10)
|
* Add PNG alpha channel as weight maps to data entries (Shondoit, 2023-02-15, 1 file, -13/+38)
|
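The two Shondoit commits above introduce per-pixel loss weighting, with the weight map taken from a PNG's alpha channel. The actual webui code is not reproduced here; the following is a minimal sketch of the idea, and all function names are illustrative:

```python
import numpy as np

def split_weight_map(rgba):
    """Split an (H, W, 4) uint8 RGBA array into its RGB channels and a
    per-pixel weight map derived from alpha, rescaled to [0, 1]
    (0 = ignore this pixel in the loss, 1 = full weight).
    In practice the RGBA array would come from loading the PNG."""
    rgb = rgba[..., :3]
    weight = rgba[..., 3].astype(np.float32) / 255.0
    return rgb, weight

def weighted_mse(pred, target, weight):
    """Mean squared error with each pixel's squared error scaled by its
    weight, normalized by the total weight so the loss magnitude stays
    comparable to the unweighted case."""
    se = (pred - target) ** 2
    return float((se * weight).sum() / max(weight.sum(), 1e-8))
```

Fully transparent pixels thus contribute nothing to the loss, which is what makes the alpha channel usable as a training mask.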
* print bucket sizes for training without resizing images #6620 (AUTOMATIC, 2023-01-13, 1 file, -0/+16)
|       fix an error when generating a picture with embedding in it
* Enable batch_size>1 for mixed-sized training (dan, 2023-01-10, 1 file, -4/+32)
|
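The mixed-size training entries above (variable image sizes, batch_size > 1, printed bucket sizes) depend on grouping images into resolution buckets so that every batch contains only one size. A hedged sketch of that bookkeeping, assuming entries are dicts with width/height keys (names are illustrative, not the webui's actual API):

```python
from collections import defaultdict

def bucket_by_size(entries):
    """Group dataset entries by (width, height) so a loader can draw
    each batch from a single bucket and still use batch_size > 1,
    since tensors in one batch must share a shape."""
    buckets = defaultdict(list)
    for entry in entries:
        buckets[(entry["width"], entry["height"])].append(entry)
    return dict(buckets)

def format_bucket_sizes(buckets):
    """Analogue of the 'print bucket sizes' commit: summarize how many
    images landed in each resolution bucket."""
    return [f"bucket {w}x{h}: {len(items)} images"
            for (w, h), items in sorted(buckets.items())]
```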
* remove/simplify some changes from #6481 (AUTOMATIC, 2023-01-09, 1 file, -9/+5)
|
* Move batchsize check (dan, 2023-01-07, 1 file, -2/+2)
|
* Add checkbox for variable training dims (dan, 2023-01-07, 1 file, -2/+2)
|
* Allow variable img size (dan, 2023-01-07, 1 file, -7/+11)
|
* Fix various typos (Jim Hays, 2022-12-15, 1 file, -5/+5)
|
* Use devices.autocast instead of torch.autocast (brkirch, 2022-11-30, 1 file, -2/+2)
|
* Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords (AUTOMATIC1111, 2022-11-27, 1 file, -1/+1)
|\      resolve [name] after resolving [filewords] in training
| * resolve [name] after resolving [filewords] in training (parasi, 2022-11-13, 1 file, -1/+1)
| |
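The merge above changes the substitution order in prompt templates: [filewords] is expanded first, then [name], so that a literal `[name]` token contained in the filewords text is also resolved to the embedding name. An illustrative sketch of the ordering, not the actual webui code:

```python
def apply_template(template, filewords, name):
    """Fill a prompt template by expanding [filewords] first and [name]
    second; with this order a '[name]' occurring inside the filewords
    text is resolved as well. Sketch only; the real dataset code
    handles more than these two placeholders."""
    text = template.replace("[filewords]", filewords)
    return text.replace("[name]", name)
```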
* | fix pin_memory with different latent sampling method (flamelaw, 2022-11-21, 1 file, -4/+19)
| |
* | fix random sampling with pin_memory (flamelaw, 2022-11-20, 1 file, -1/+1)
| |
* | remove unnecessary comment (flamelaw, 2022-11-20, 1 file, -9/+0)
| |
* | Gradient accumulation, autocast fix, new latent sampling method, etc. (flamelaw, 2022-11-20, 1 file, -48/+86)
|/
* Update dataset.py (KyuSeok Jung, 2022-11-11, 1 file, -1/+1)
|
* Update dataset.py (KyuSeok Jung, 2022-11-11, 1 file, -1/+1)
|
* adding tag drop out option (KyuSeok Jung, 2022-11-11, 1 file, -4/+4)
|
* change option position to Training setting (TinkTheBoush, 2022-11-04, 1 file, -3/+2)
|
* append_tag_shuffle (TinkTheBoush, 2022-11-01, 1 file, -2/+8)
|
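The tag drop-out and append_tag_shuffle commits above treat the caption as a comma-separated tag list and randomize it during training. A minimal sketch of both options, assuming comma-separated captions (function and parameter names are illustrative):

```python
import random

def process_tags(caption, shuffle_tags=False, tag_drop_out=0.0):
    """Split the caption into comma-separated tags, optionally drop
    each tag with probability tag_drop_out, then optionally shuffle
    the remainder. Both randomizations discourage the embedding from
    overfitting to tag order or to any single tag."""
    tags = [t.strip() for t in caption.split(",") if t.strip()]
    if tag_drop_out > 0:
        tags = [t for t in tags if random.random() > tag_drop_out]
    if shuffle_tags:
        random.shuffle(tags)
    return ", ".join(tags)
```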
* Additional assert on dataset (Muhammad Rizqi Nur, 2022-10-29, 1 file, -0/+2)
|
* Fix random dataset shuffle on TI (FlameLaw, 2022-10-27, 1 file, -2/+2)
|
* Allow datasets with only 1 image in TI (guaneec, 2022-10-21, 1 file, -2/+2)
|
* Merge branch 'master' into master (AUTOMATIC1111, 2022-10-15, 1 file, -13/+20)
|\
| * add option to use batch size for training (AUTOMATIC, 2022-10-15, 1 file, -12/+19)
| |
* | Raise an assertion error if no training images have been found. (Melan, 2022-10-14, 1 file, -1/+2)
|/
* train: change filename processing to be more simple and configurable (AUTOMATIC, 2022-10-12, 1 file, -13/+34)
|       train: make it possible to make text files with prompts
|       train: rework scheduler so that there's less repeating code in textual inversion and hypernets
|       train: move epochs setting to options
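The "text files with prompts" change above lets a caption live in a .txt file next to the image instead of being encoded in the filename. A sketch of that lookup with a filename fallback (illustrative, not the exact webui logic):

```python
import os

def read_caption(image_path):
    """If image_path has a sibling .txt file (photo.png -> photo.txt),
    use its contents as the caption; otherwise fall back to the bare
    filename, which the older filename-based captioning relied on."""
    stem, _ = os.path.splitext(image_path)
    txt_path = stem + ".txt"
    if os.path.exists(txt_path):
        with open(txt_path, encoding="utf8") as f:
            return f.read().strip()
    return os.path.basename(stem)
```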
* add an option to unload models during hypernetwork training to save VRAM (AUTOMATIC, 2022-10-11, 1 file, -9/+20)
|
* Switched to exception handling (alg-wiki, 2022-10-11, 1 file, -5/+5)
|
* Added .webp .bmp (alg-wiki, 2022-10-10, 1 file, -1/+1)
|
* Textual Inversion: Preprocess and Training will only pick-up image files (alg-wiki, 2022-10-10, 1 file, -1/+2)
|
* Custom Width and Height (alg-wiki, 2022-10-10, 1 file, -4/+3)
|
* Textual Inversion: Added custom training image size and number of repeats per input image in a single epoch (alg-wiki, 2022-10-10, 1 file, -3/+3)
|
* add support for gelbooru tags in filenames for textual inversion (AUTOMATIC, 2022-10-04, 1 file, -2/+5)
|
* keep textual inversion dataset latents in CPU memory to save a bit of VRAM (AUTOMATIC, 2022-10-02, 1 file, -0/+2)
|
* initial support for training textual inversion (AUTOMATIC, 2022-10-02, 1 file, -0/+76)