path: root/modules/hypernetworks/hypernetwork.py
Commit message (Author, Age, Files, Lines changed)
* Merge branch 'master' into racecond_fix (AUTOMATIC1111, 2022-12-03, 1 file, -130/+216)
|\
| * Use devices.autocast instead of torch.autocast (brkirch, 2022-11-30, 1 file, -1/+1)
| |
| * last_layer_dropout default to False (flamelaw, 2022-11-23, 1 file, -1/+1)
| |
| * fix dropout, implement train/eval mode (flamelaw, 2022-11-23, 1 file, -6/+18)
| |
| * small fixes (flamelaw, 2022-11-22, 1 file, -3/+3)
| |
| * fix pin_memory with different latent sampling method (flamelaw, 2022-11-21, 1 file, -1/+4)
| |
| * Gradient accumulation, autocast fix, new latent sampling method, etc (flamelaw, 2022-11-20, 1 file, -123/+146)
| |
| * change StableDiffusionProcessing to internally use sampler name instead of sampler index (AUTOMATIC, 2022-11-19, 1 file, -2/+2)
| |
| * rework the code to not use the walrus operator because colab's 3.7 does not support it (AUTOMATIC, 2022-11-05, 1 file, -2/+5)
| |
| * Merge pull request #4273 from Omegastick/ordered_hypernetworks (AUTOMATIC1111, 2022-11-05, 1 file, -1/+1)
| |\
| | |     Sort hypernetworks list
| | * Sort straight out of the glob (Isaac Poulton, 2022-11-04, 1 file, -2/+2)
| | |
| | * Sort hypernetworks (Isaac Poulton, 2022-11-04, 1 file, -1/+1)
| | |
| * | only save if option is enabled (aria1th, 2022-11-04, 1 file, -1/+1)
| | |
| * | split before declaring file name (aria1th, 2022-11-04, 1 file, -1/+1)
| | |
| * | apply (aria1th, 2022-11-04, 1 file, -5/+49)
| |/
* / Fixes race condition in training when VAE is unloaded (Fampai, 2022-11-04, 1 file, -0/+4)
|/
|     set_current_image can attempt to use the VAE when it is unloaded to the CPU while training
* Merge branch 'master' into hn-activation (AUTOMATIC1111, 2022-11-04, 1 file, -33/+56)
|\
| * Merge pull request #3928 from R-N/validate-before-load (AUTOMATIC1111, 2022-10-30, 1 file, -26/+43)
| |\
| | |     Optimize training a little
| | * Fix dataset still being loaded even when training will be skipped (Muhammad Rizqi Nur, 2022-10-29, 1 file, -1/+1)
| | |
| | * Add missing info on hypernetwork/embedding model log (Muhammad Rizqi Nur, 2022-10-29, 1 file, -10/+21)
| | |     Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513
| | |     Also group the saving into one
| | * Revert "Add cleanup after training"Muhammad Rizqi Nur2022-10-291-105/+96
| | | | | | | | | | | | This reverts commit 3ce2bfdf95bd5f26d0f6e250e67338ada91980d1.
| | * Add cleanup after training (Muhammad Rizqi Nur, 2022-10-29, 1 file, -96/+105)
| | |
| | * Add input validations before loading dataset for training (Muhammad Rizqi Nur, 2022-10-29, 1 file, -16/+22)
| | |
| * | Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info (timntorres, 2022-10-29, 1 file, -5/+8)
| |\|
| | * Merge pull request #3858 from R-N/log-csv (AUTOMATIC1111, 2022-10-29, 1 file, -5/+7)
| | |\
| | | |     Fix log off by 1 #3847
| | | * Fix log off by 1 (Muhammad Rizqi Nur, 2022-10-28, 1 file, -5/+7)
| | | |
| | * | Add missing support for linear activation in hypernetwork (benkyoujouzu, 2022-10-28, 1 file, -0/+1)
| | |/
| * / Always ignore "None.pt" in the hypernet directory. (timntorres, 2022-10-28, 1 file, -2/+5)
| |/
| * patch bug (SeverianVoid's comment on 5245c7a) (timntorres, 2022-10-26, 1 file, -1/+1)
| |
* | Revert unresolved changes in Bias initialization (AngelBottomless, 2022-10-27, 1 file, -1/+1)
| |     it should be zeros_ or parameterized in future properly.
* | Fix dropout logic (guaneec, 2022-10-27, 1 file, -2/+2)
| |
* | Squashed commit of fixing dropout silently (AngelBottomless, 2022-10-27, 1 file, -8/+17)
| |     fix dropouts for future hypernetworks
| |     add kwargs for Hypernetwork class
| |     hypernet UI for gradio input
| |     add recommended options
| |     remove as options
| |     revert adding options in ui
* | Fix merge (guaneec, 2022-10-26, 1 file, -2/+2)
| |
* | Merge fix (guaneec, 2022-10-26, 1 file, -1/+1)
| |
* | Merge branch 'master' into hn-activation (guaneec, 2022-10-26, 1 file, -10/+39)
|\|
| * remove duplicate keys and lowercase (AngelBottomless, 2022-10-26, 1 file, -1/+1)
| |
| * Weight initialization and More activation func (AngelBottomless, 2022-10-26, 1 file, -9/+38)
| |     add weight init
| |     add weight init option in create_hypernetwork
| |     fstringify hypernet info
| |     save weight initialization info for further debugging
| |     fill bias with zero for He/Xavier
| |     initialize LayerNorm with Normal
| |     fix loading weight_init
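As a rough illustration of the initialization scheme this commit describes (zero-filled bias for He/Xavier, a Normal distribution for LayerNorm), here is a minimal PyTorch sketch; the function name and option strings are illustrative assumptions, not the repository's actual code.

    import torch.nn as nn

    def init_hypernet_layer(layer, weight_init="Normal"):
        # Illustrative sketch only: apply the selected weight initialization to one layer.
        if isinstance(layer, nn.Linear):
            if weight_init == "Normal":
                nn.init.normal_(layer.weight, mean=0.0, std=0.01)
            elif weight_init == "XavierUniform":
                nn.init.xavier_uniform_(layer.weight)
            elif weight_init == "KaimingUniform":
                nn.init.kaiming_uniform_(layer.weight, nonlinearity="relu")
            # bias filled with zero for He/Xavier, as noted in the commit body
            nn.init.zeros_(layer.bias)
        elif isinstance(layer, nn.LayerNorm):
            # LayerNorm initialized with a Normal distribution, per the commit notes
            nn.init.normal_(layer.weight, mean=1.0, std=0.01)
            nn.init.zeros_(layer.bias)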
* | Back compatibility (guaneec, 2022-10-26, 1 file, -7/+10)
| |
* | Fix off-by-one (guaneec, 2022-10-26, 1 file, -2/+2)
| |
* | Remove activation from final layer of HNs (guaneec, 2022-10-26, 1 file, -3/+3)
|/
* check length for variance (AngelBottomless, 2022-10-24, 1 file, -2/+10)
|
* convert deque -> list (AngelBottomless, 2022-10-24, 1 file, -1/+1)
|     I don't feel this being efficient
* statistics for pbar (AngelBottomless, 2022-10-24, 1 file, -2/+10)
|
* cleanup some code (AngelBottomless, 2022-10-24, 1 file, -11/+3)
|
* Hypernetworks - fix KeyError in statistics caching (AngelBottomless, 2022-10-24, 1 file, -2/+2)
|     Statistics logging has changed to {filename : list[losses]}, so it has to use loss_info[key].pop()
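As a rough sketch of the cache shape this commit refers to, assuming a {filename: list[losses]} dictionary (the helper names below are hypothetical, not the repository's code):

    # Hypothetical sketch of the per-file loss cache
    loss_info = {}

    def record_loss(filename, loss):
        # append the latest loss observed for this dataset file
        loss_info.setdefault(filename, []).append(loss)

    def pop_latest_loss(filename):
        # the KeyError fix described above: pop from the list stored under the key,
        # i.e. loss_info[key].pop(), rather than popping from the dict itself
        return loss_info[filename].pop()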
* Update hypernetwork.py (DepFA, 2022-10-23, 1 file, -4/+7)
|
* Allow tracking real-time loss (AngelBottomless, 2022-10-22, 1 file, -1/+1)
|     Someone had 6000 images in their dataset, and it was shown as 0, which was confusing. This will allow tracking real time dataset-average loss for registered objects.
* Update hypernetwork.py (AngelBottomless, 2022-10-22, 1 file, -11/+44)
|
* small fix (discus0434, 2022-10-22, 1 file, -4/+3)
|
* Merge branch 'AUTOMATIC1111:master' into master (discus0434, 2022-10-22, 1 file, -0/+11)
|\