path: root/modules/hypernetworks
Age | Commit message | Author | Lines
2022-11-04 | Sort straight out of the glob | Isaac Poulton | -2/+2
2022-11-04 | Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | -12/+24
2022-11-04 | Sort hypernetworks | Isaac Poulton | -1/+1
2022-11-04 | Fixes race condition in training when VAE is unloaded | Fampai | -0/+4
    set_current_image can attempt to use the VAE when it is unloaded to the CPU while training
2022-11-04 | only save if option is enabled | aria1th | -1/+1
2022-11-04 | split before declaring file name | aria1th | -1/+1
2022-11-04 | apply | aria1th | -5/+49
2022-11-04 | Merge branch 'AUTOMATIC1111:master' into force-push-patch-13 | AngelBottomless | -12/+24
2022-11-04 | I blame code autocomplete | AngelBottomless | -49/+27
2022-11-04 | resolve conflict - first revert | aria1th | -71/+52
2022-11-04 | Merge branch 'master' into hn-activation | AUTOMATIC1111 | -34/+58
2022-11-03 | use hash to check valid optim | aria1th | -4/+9
2022-11-03 | Separate .optim file from model | aria1th | -4/+8
2022-10-31 | Fix merge conflicts | Muhammad Rizqi Nur | -5/+0
2022-10-31 | Fix merge conflicts | Muhammad Rizqi Nur | -11/+6
2022-10-30 | Merge master | Muhammad Rizqi Nur | -18/+47
2022-10-30 | resolve conflicts | aria1th | -6/+38
2022-10-30 | We have duplicate linear now | AngelBottomless | -1/+1
2022-10-30 | Merge pull request #3928 from R-N/validate-before-load | AUTOMATIC1111 | -26/+43
    Optimize training a little
2022-10-30 | Fix dataset still being loaded even when training will be skipped | Muhammad Rizqi Nur | -1/+1
2022-10-30 | Add missing info on hypernetwork/embedding model log | Muhammad Rizqi Nur | -10/+21
    Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513
    Also group the saving into one
2022-10-30 | Revert "Add cleanup after training" | Muhammad Rizqi Nur | -105/+96
    This reverts commit 3ce2bfdf95bd5f26d0f6e250e67338ada91980d1.
2022-10-29 | Add cleanup after training | Muhammad Rizqi Nur | -96/+105
2022-10-29 | Add input validations before loading dataset for training | Muhammad Rizqi Nur | -16/+22
2022-10-29 | Merge branch 'master' into gradient-clipping | Muhammad Rizqi Nur | -6/+10
2022-10-29 | Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info | timntorres | -6/+10
2022-10-29 | Merge pull request #3858 from R-N/log-csv | AUTOMATIC1111 | -5/+7
    Fix log off by 1 #3847
2022-10-29 | Merge pull request #3717 from benkyoujouzu/master | AUTOMATIC1111 | -0/+1
    Add missing support for linear activation in hypernetwork
2022-10-29 | Re enable linear | AngelBottomless | -1/+1
2022-10-28 | Fix log off by 1 | Muhammad Rizqi Nur | -5/+7
2022-10-28 | Learning rate sched syntax support for grad clipping | Muhammad Rizqi Nur | -3/+10
2022-10-28 | Always ignore "None.pt" in the hypernet directory. | timntorres | -2/+5
2022-10-28 | Add missing support for linear activation in hypernetwork | benkyoujouzu | -0/+1
2022-10-28 | Gradient clipping in train tab | Muhammad Rizqi Nur | -1/+9
2022-10-27 | Disable unavailable or duplicate options | AngelBottomless | -1/+2
2022-10-27 | Revert unresolved changes in Bias initialization | AngelBottomless | -1/+1
    it should be zeros_ or parameterized in future properly.
2022-10-27 | Fix dropout logic | guaneec | -2/+2
2022-10-27 | Squashed commit of fixing dropout silently | AngelBottomless | -8/+17
    fix dropouts for future hypernetworks
    add kwargs for Hypernetwork class
    hypernet UI for gradio input
    add recommended options
    remove as options
    revert adding options in ui
2022-10-26 | Fix merge | guaneec | -2/+2
2022-10-26 | patch bug (SeverianVoid's comment on 5245c7a) | timntorres | -1/+1
2022-10-26 | Merge fix | guaneec | -1/+1
2022-10-26 | Merge branch 'master' into hn-activation | guaneec | -11/+42
2022-10-26 | Back compatibility | guaneec | -7/+10
2022-10-26 | remove duplicate keys and lowercase | AngelBottomless | -1/+1
2022-10-26 | Weight initialization and More activation func | AngelBottomless | -10/+41
    add weight init
    add weight init option in create_hypernetwork
    fstringify hypernet info
    save weight initialization info for further debugging
    fill bias with zero for He/Xavier
    initialize LayerNorm with Normal
    fix loading weight_init
2022-10-26 | Fix off-by-one | guaneec | -2/+2
2022-10-26 | Remove activation from final layer of HNs | guaneec | -3/+3
2022-10-24 | Removed two unused imports | Melan | -1/+0
2022-10-24 | check length for variance | AngelBottomless | -2/+10
2022-10-24 | convert deque -> list | AngelBottomless | -1/+1
    I don't feel this being efficient