path: root/modules/hypernetworks/hypernetwork.py
Date        Commit message  (Author, lines changed)
2023-03-28  sort hypernetworks and checkpoints by name  (AUTOMATIC, -1/+1)
2023-02-19  Merge branch 'master' into weighted-learning  (AUTOMATIC1111, -2/+2)
2023-02-15  Add ability to choose using weighted loss or not  (Shondoit, -4/+9)
2023-02-15  Call weighted_forward during training  (Shondoit, -1/+2)
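The two weighted-learning entries above add a loss that weights each element of the error by a mask before averaging. A minimal sketch of that idea, using plain Python lists in place of torch tensors (the function name and normalization are illustrative, not the webui's actual `weighted_forward` API):

```python
# Hedged sketch: elementwise squared error scaled by a per-element weight,
# normalized by the total weight. The real code operates on torch tensors;
# the name weighted_mse is hypothetical.
def weighted_mse(pred, target, weights):
    terms = [w * (p - t) ** 2 for p, t, w in zip(pred, target, weights)]
    return sum(terms) / sum(weights)
```

Setting a weight to zero excludes that element from the loss entirely, which is the point of making weighted loss optional per the commit above.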
2023-02-06  Support for hypernetworks with --upcast-sampling  (brkirch, -2/+2)
2023-02-04  add --no-hashing  (AUTOMATIC, -1/+1)
2023-01-22  enable compact view for train tab  (AUTOMATIC, -0/+2)
            prevent previews from ruining hypernetwork training
2023-01-21  extra networks UI  (AUTOMATIC, -32/+75)
            rework of hypernets: rather than via settings, hypernets are added directly to prompt as <hypernet:name:weight>
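The extra-networks rework above moves hypernet selection into the prompt itself as `<hypernet:name:weight>`. A minimal sketch of parsing that syntax (the webui has its own extra-networks parser; this regex and function are illustrative only):

```python
import re

# Hedged sketch: match <hypernet:name> or <hypernet:name:weight> tokens.
# The real parser is more general; this shows only the syntax shape.
HYPERNET_RE = re.compile(r"<hypernet:(?P<name>[^:>]+)(?::(?P<weight>[\d.]+))?>")

def parse_hypernets(prompt):
    """Return (name, weight) pairs found in the prompt; weight defaults to 1.0."""
    found = []
    for m in HYPERNET_RE.finditer(prompt):
        weight = float(m.group("weight")) if m.group("weight") else 1.0
        found.append((m.group("name"), weight))
    return found
```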
2023-01-18  add option to show/hide warnings  (AUTOMATIC, -1/+6)
            removed hiding warnings from LDSR; fixed/reworked a few places that produced warnings
2023-01-16  Fix tensorboard related functions  (aria1th, -7/+6)
2023-01-16  Fix loss_dict problem  (aria1th, -1/+3)
2023-01-16  fix missing 'mean loss' for tensorboard integration  (AngelBottomless, -1/+1)
2023-01-15  big rework of progressbar/preview system to allow multiple users to prompt at the same time without getting previews of each other  (AUTOMATIC, -3/+3)
2023-01-14  change hypernets to use sha256 hashes  (AUTOMATIC, -17/+23)
2023-01-14  change hash to sha256  (AUTOMATIC, -2/+2)
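The two hashing commits above switch hypernetwork identification to SHA-256. Computing such a hash without loading a multi-gigabyte checkpoint into memory is a standard streaming pattern (this sketch uses only the standard library; the chunk size and function name are illustrative):

```python
import hashlib

# Hedged sketch: stream the file through hashlib.sha256 in chunks so large
# model files never have to fit in RAM at once.
def sha256_of_file(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```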
2023-01-13  Merge branch 'master' into tensorboard  (AUTOMATIC1111, -143/+484)
2023-01-11  set descriptions  (Vladimir Mandic, -1/+3)
2023-01-10  Variable dropout rate  (aria1th, -25/+76)
            Implements variable dropout rate from #4549. Fixes the hypernetwork multiplier being modifiable during training, and also guards against user error by clamping the multiplier to lower values for training. Renames a function to match the torch.nn.Module convention. Fixes an RNG reset issue when generating previews by restoring the RNG state.
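The RNG fix above follows a snapshot-and-restore pattern: save the generator state before a preview runs, then restore it so training continues on an unperturbed random stream. The actual fix restores torch (and CUDA) RNG state; this standard-library analogue shows only the pattern:

```python
import random
from contextlib import contextmanager

# Hedged sketch: a context manager that snapshots Python's stdlib RNG state
# and restores it on exit, analogous to restoring torch RNG state after a
# preview image is generated mid-training.
@contextmanager
def preserved_rng_state():
    state = random.getstate()   # snapshot before the preview consumes randomness
    try:
        yield
    finally:
        random.setstate(state)  # training resumes with an untouched stream
```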
2023-01-09  make a dropdown for prompt template selection  (AUTOMATIC, -2/+5)
2023-01-08  Move batchsize check  (dan, -1/+1)
2023-01-08  Add checkbox for variable training dims  (dan, -1/+1)
2023-01-06  rework saving training params to file #6372  (AUTOMATIC, -21/+7)
2023-01-05  Include model in log file. Exclude directory.  (timntorres, -18/+10)
2023-01-05  Clean up ti, add same behavior to hypernetwork.  (timntorres, -1/+30)
2023-01-04  Merge branch 'master' into gradient-clipping  (AUTOMATIC1111, -139/+202)
2023-01-03  add job info to modules  (Vladimir Mandic, -0/+1)
2022-12-25  Merge pull request #5992 from yuvalabou/F541  (AUTOMATIC1111, -2/+2)
            Fix F541: f-string without any placeholders
2022-12-24  implement train api  (Vladimir Mandic, -0/+26)
2022-12-24  fix F541 f-string without any placeholders  (Yuval Aboulafia, -2/+2)
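The F541 commits above address a flake8 lint rule: an f-string with no `{}` placeholders is just a plain string carrying a useless prefix. A tiny illustration (the string literal is invented for the example):

```python
# Hedged illustration of F541: both lines produce the identical string,
# so the fix is simply to drop the f prefix from the first form.
flagged = f"Loading hypernetwork"  # flake8 reports F541 on this literal
fixed = "Loading hypernetwork"     # the corrected form
```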
2022-12-03  Merge branch 'master' into racecond_fix  (AUTOMATIC1111, -130/+216)
2022-11-30  Use devices.autocast instead of torch.autocast  (brkirch, -1/+1)
2022-11-23  last_layer_dropout default to False  (flamelaw, -1/+1)
2022-11-23  fix dropout, implement train/eval mode  (flamelaw, -6/+18)
2022-11-23  small fixes  (flamelaw, -3/+3)
2022-11-21  fix pin_memory with different latent sampling method  (flamelaw, -1/+4)
2022-11-20  Gradient accumulation, autocast fix, new latent sampling method, etc.  (flamelaw, -123/+146)
2022-11-19  change StableDiffusionProcessing to internally use sampler name instead of sampler index  (AUTOMATIC, -2/+2)
2022-11-07  Merge branch 'master' into gradient-clipping  (Muhammad Rizqi Nur, -6/+53)
2022-11-05  rework the code to not use the walrus operator because Colab's Python 3.7 does not support it  (AUTOMATIC, -2/+5)
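The walrus-operator rework above trades Python 3.8's assignment expression for a 3.7-compatible assign-then-test form. A sketch of the equivalent rewrite (the read-loop example is illustrative, not the webui's actual code):

```python
# Hedged sketch: the Python >= 3.8 walrus form would be
#     while (chunk := stream.read(size)):
#         chunks.append(chunk)
# On 3.7 the assignment must happen before the test, duplicated once
# before the loop and once at its end.
def read_all(stream, size=8192):
    chunks = []
    chunk = stream.read(size)
    while chunk:
        chunks.append(chunk)
        chunk = stream.read(size)
    return b"".join(chunks)
```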
2022-11-05  Merge pull request #4273 from Omegastick/ordered_hypernetworks  (AUTOMATIC1111, -1/+1)
            Sort hypernetworks list
2022-11-05  Simplify grad clip  (Muhammad Rizqi Nur, -9/+7)
2022-11-04  Sort straight out of the glob  (Isaac Poulton, -2/+2)
2022-11-04  Merge branch 'master' into gradient-clipping  (Muhammad Rizqi Nur, -12/+24)
2022-11-04  Sort hypernetworks  (Isaac Poulton, -1/+1)
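The sorting commits above order the hypernetworks list at discovery time, directly from the glob results rather than at display time. A minimal sketch of that pattern (the directory layout, `.pt` extension, and function name are illustrative):

```python
import glob
import os

# Hedged sketch: gather hypernetwork files recursively, then sort
# case-insensitively by filename so the UI dropdown is stable.
def list_hypernetworks(dirname):
    paths = glob.glob(os.path.join(dirname, "**", "*.pt"), recursive=True)
    return sorted(paths, key=lambda p: os.path.basename(p).lower())
```

Sorting on `os.path.basename(p).lower()` keeps "Monet.pt" and "monet2.pt" adjacent regardless of case, which plain `sorted(paths)` would not.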
2022-11-04  Fixes race condition in training when VAE is unloaded  (Fampai, -0/+4)
            set_current_image can attempt to use the VAE while it is unloaded to the CPU during training
2022-11-04  only save if option is enabled  (aria1th, -1/+1)
2022-11-04  split before declaring file name  (aria1th, -1/+1)
2022-11-04  apply  (aria1th, -5/+49)
2022-11-04  Merge branch 'master' into hn-activation  (AUTOMATIC1111, -33/+56)
2022-10-31  Fix merge conflicts  (Muhammad Rizqi Nur, -5/+0)