path: root/modules/hypernetworks
Age         Commit message (Author, lines -deleted/+added)
2023-09-09  fix whitespace for #13084 (AUTOMATIC1111, -1/+1)
2023-09-05  Fix #13080 - Hypernetwork/TI preview generation (AngelBottomless, -2/+2)
            Fixes the sampler name reference. The same patch will be done for TI.
2023-08-04  resolve some of the circular import issues for kohaku (AUTOMATIC1111, -3/+2)
2023-07-13  get attention optimizations to work (AUTOMATIC1111, -1/+1)
2023-07-10  Use closing() with processing classes everywhere (Aarni Koskela, -2/+4)
            Follows up on #11569.
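For context, contextlib.closing() from the standard library turns any object with a close() method into a context manager, so cleanup runs even if processing raises. A minimal self-contained sketch of the pattern (the Processing class here is a stand-in, not the webui's actual class):

    from contextlib import closing

    class Processing:
        """Stand-in for a webui processing class that owns resources."""
        def close(self):
            print("resources released")

    # closing() guarantees p.close() runs when the block exits,
    # even if an exception is raised mid-processing.
    with closing(Processing()) as p:
        pass  # run the actual image processing here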
2023-06-05  Remove a bunch of unused/vestigial code (Aarni Koskela, -24/+0)
            As found by Vulture and some manual review.
2023-05-31  rename print_error to report, use it together with package name (AUTOMATIC, -4/+3)
2023-05-29  Add & use modules.errors.print_error where currently printing exception info by hand (Aarni Koskela, -9/+5)
2023-05-11  Autofix Ruff W (not W605) (mostly whitespace) (Aarni Koskela, -6/+6)
2023-05-10  suggestions and fixes from the PR (AUTOMATIC, -2/+2)
2023-05-10  fixes for B007 (AUTOMATIC, -6/+6)
2023-05-10  ruff auto fixes (AUTOMATIC, -3/+3)
2023-05-10  imports cleanup for ruff (AUTOMATIC, -4/+1)
2023-03-28  sort hypernetworks and checkpoints by name (AUTOMATIC, -1/+1)
2023-02-19  Merge branch 'master' into weighted-learning (AUTOMATIC1111, -2/+2)
2023-02-15  Add ability to choose whether to use weighted loss (Shondoit, -4/+9)
2023-02-15  Call weighted_forward during training (Shondoit, -1/+2)
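A minimal sketch of the weighted-loss idea behind the two commits above, assuming an element-wise weight map broadcastable to the prediction's shape (the function name mirrors the commit message; the body is illustrative, not the webui's exact code):

    import torch
    import torch.nn.functional as F

    def weighted_forward(pred: torch.Tensor, target: torch.Tensor,
                         weight: torch.Tensor) -> torch.Tensor:
        # Compute the per-element loss, scale it by the weight map,
        # then reduce; unweighted training corresponds to weight == 1.
        loss = F.mse_loss(pred, target, reduction="none")
        return (loss * weight).mean()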
2023-02-06  Support for hypernetworks with --upcast-sampling (brkirch, -2/+2)
2023-02-04  add --no-hashing (AUTOMATIC, -1/+1)
2023-01-22  enable compact view for train tab (AUTOMATIC, -0/+2)
            Prevents previews from ruining hypernetwork training.
2023-01-21  extra networks UI (AUTOMATIC, -35/+77)
            Rework of hypernets: rather than via settings, hypernets are added directly to the prompt as <hypernet:name:weight>.
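For illustration, with that syntax a hypernetwork is activated inline in the prompt text, e.g. (the name "anime_style" is made up):

    a portrait photo, highly detailed <hypernet:anime_style:0.8>

where anime_style is the hypernetwork's filename and 0.8 is its strength multiplier.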
2023-01-18  add option to show/hide warnings (AUTOMATIC, -1/+6)
            Removed hiding of warnings from LDSR; fixed/reworked a few places that produced warnings.
2023-01-16  Fix tensorboard-related functions (aria1th, -7/+6)
2023-01-16  Fix loss_dict problem (aria1th, -1/+3)
2023-01-16  fix missing 'mean loss' for tensorboard integration (AngelBottomless, -1/+1)
2023-01-15  big rework of the progressbar/preview system to allow multiple users to submit prompts at the same time without getting previews of each other's jobs (AUTOMATIC, -3/+3)
2023-01-14  change hypernets to use sha256 hashes (AUTOMATIC, -17/+23)
2023-01-14  change hash to sha256 (AUTOMATIC, -2/+2)
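For reference, a SHA-256 file hash can be computed in chunks with Python's standard library; a small sketch (not the webui's actual hashing helper):

    import hashlib

    def sha256_of_file(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Read in fixed-size blocks so large checkpoint files
            # do not have to fit in memory all at once.
            for block in iter(lambda: f.read(1 << 20), b""):
                h.update(block)
        return h.hexdigest()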
2023-01-13  Merge branch 'master' into tensorboard (AUTOMATIC1111, -165/+491)
2023-01-11  set descriptions (Vladimir Mandic, -1/+3)
2023-01-10  Variable dropout rate (aria1th, -27/+78)
            Implements variable dropout rate from #4549. Fixes the hypernetwork multiplier being modifiable during training, and also guards against user error by setting the multiplier to lower values for training. Changes a function name to match the torch.nn.Module standard. Fixes the RNG reset issue when generating previews by restoring the RNG state.
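The RNG fix comes down to snapshotting torch's generator state before preview generation and restoring it afterwards, so training resumes the same random sequence; a minimal sketch of that pattern (the preview step itself is elided):

    import torch

    # Snapshot CPU and GPU generator states before the preview...
    cpu_state = torch.get_rng_state()
    cuda_states = torch.cuda.get_rng_state_all() if torch.cuda.is_available() else None

    # ...any sampling done for the preview advances the RNG here...

    # ...then restore the saved states so training is unaffected.
    torch.set_rng_state(cpu_state)
    if cuda_states is not None:
        torch.cuda.set_rng_state_all(cuda_states)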
2023-01-09  make a dropdown for prompt template selection (AUTOMATIC, -2/+5)
2023-01-08  Move batchsize check (dan, -1/+1)
2023-01-08  Add checkbox for variable training dims (dan, -1/+1)
2023-01-06  rework saving training params to file #6372 (AUTOMATIC, -21/+7)
2023-01-05  Include model in log file. Exclude directory. (timntorres, -18/+10)
2023-01-05  Clean up ti, add same behavior to hypernetwork. (timntorres, -1/+30)
2023-01-04  Merge branch 'master' into gradient-clipping (AUTOMATIC1111, -166/+206)
2023-01-03  add job info to modules (Vladimir Mandic, -0/+1)
2022-12-25  Merge pull request #5992 from yuvalabou/F541 (AUTOMATIC1111, -2/+2)
            Fix F541: f-string without any placeholders.
2022-12-24  implement train api (Vladimir Mandic, -27/+30)
2022-12-24  fix F541 f-string without any placeholders (Yuval Aboulafia, -2/+2)
2022-12-03  Merge branch 'master' into racecond_fix (AUTOMATIC1111, -131/+217)
2022-11-30  Use devices.autocast instead of torch.autocast (brkirch, -1/+1)
2022-11-23  last_layer_dropout default to False (flamelaw, -1/+1)
2022-11-23  fix dropout, implement train/eval mode (flamelaw, -6/+18)
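That fix relies on torch's standard module modes: dropout layers are active in train() mode and become no-ops in eval() mode, so previews should be generated in eval(). A self-contained illustration:

    import torch
    import torch.nn as nn

    net = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5))

    net.train()   # dropout active: use during hypernetwork training steps
    net.eval()    # dropout disabled: use when generating previews/inference
    with torch.no_grad():
        out = net(torch.ones(1, 4))  # deterministic in eval mode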
2022-11-23  small fixes (flamelaw, -3/+3)
2022-11-21  fix pin_memory with different latent sampling method (flamelaw, -1/+4)
2022-11-20  Gradient accumulation, autocast fix, new latent sampling method, etc. (flamelaw, -123/+146)
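Gradient accumulation simulates a larger batch by summing gradients over several small batches before each optimizer step; a generic, self-contained sketch of the loop (illustrative, not the commit's actual code):

    import torch
    import torch.nn as nn

    model = nn.Linear(4, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    data = [(torch.randn(2, 4), torch.randn(2, 1)) for _ in range(8)]
    accumulation_steps = 4  # effective batch size = 2 * 4

    optimizer.zero_grad()
    for i, (x, y) in enumerate(data):
        # Scale each micro-batch loss so the group averages correctly.
        loss = nn.functional.mse_loss(model(x), y) / accumulation_steps
        loss.backward()                      # gradients add up across micro-batches
        if (i + 1) % accumulation_steps == 0:
            optimizer.step()                 # one optimizer step per group
            optimizer.zero_grad()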
2022-11-19  change StableDiffusionProcessing to internally use sampler name instead of sampler index (AUTOMATIC, -2/+2)