path: root/modules/hypernetworks/hypernetwork.py
Age         Commit message  (author, lines -removed/+added)
2022-10-22  Merge branch 'AUTOMATIC1111:master' into master  (discus0434, -0/+11)
2022-10-22  small fix  (discus0434, -7/+5)
2022-10-22  add an option to avoid dying relu  (discus0434, -6/+6)
2022-10-22  added a guard for hypernet training that will stop early if weights are getting no gradients  (AUTOMATIC, -0/+11)
2022-10-22  Merge branch 'master' of upstream  (discus0434, -8/+18)
2022-10-22  add dropout  (discus0434, -26/+42)
2022-10-21  Remove unused variable.  (timntorres, -1/+0)
2022-10-21  Match hypernet name with filename in all cases.  (timntorres, -1/+7)
2022-10-21  turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets  (AUTOMATIC, -2/+2)
2022-10-21  Merge branch 'master' into training-help-text  (AUTOMATIC1111, -40/+56)
2022-10-21  Revise comments.  (timntorres, -1/+1)
2022-10-21  Issue #2921-Give PNG info to Hypernet previews.  (timntorres, -2/+7)
2022-10-21  a more strict check for activation type and a more reasonable check for type of layer in hypernets  (AUTOMATIC, -3/+9)
2022-10-20  Fixed some typos in the code  (Melan, -5/+5)
2022-10-20  Some changes to the tensorboard code and hypernetwork support  (Melan, -1/+17)
2022-10-21  Revert "fix bugs and optimizations"  (aria1th, -59/+46)
            This reverts commit 108be15500aac590b4e00420635d7b61fccfa530.
2022-10-21  fix bugs and optimizations  (AngelBottomless, -46/+59)
2022-10-20  only linear  (AngelBottomless, -5/+5)
2022-10-20  generalized some functions and option for ignoring first layer  (AngelBottomless, -8/+15)
2022-10-20  update  (discus0434, -10/+19)
2022-10-20  change html output  (DepFA, -1/+1)
2022-10-19  fix for #3086 failing to load any previous hypernet  (discus0434, -32/+28)
2022-10-19  fix for #3086 failing to load any previous hypernet  (AUTOMATIC, -32/+28)
2022-10-19  layer options moves into create hnet ui  (discus0434, -32/+32)
2022-10-19  Merge branch 'master' into master  (discus0434, -3/+2)
2022-10-19  Use training width/height when training hypernetworks.  (Silent, -2/+2)
2022-10-19  update  (discus0434, -2/+2)
2022-10-19  update  (discus0434, -6/+8)
2022-10-19  add options to custom hypernetwork layer structure  (discus0434, -21/+67)
2022-10-15  check NaN for hypernetwork tuning  (AngelBottomless, -4/+6)
2022-10-15  add option to use batch size for training  (AUTOMATIC, -9/+24)
2022-10-14  remove duplicate code for log loss, add step, make it read from options rather than gradio input  (AUTOMATIC, -14/+6)
2022-10-14  Merge remote-tracking branch 'Melanpan/master'  (AUTOMATIC, -0/+15)
2022-10-14  add option to read generation params for learning previews from txt2img  (AUTOMATIC, -5/+16)
2022-10-13  add hypernetwork multipliers  (AUTOMATIC, -1/+7)
2022-10-13  Add learn_rate to csv and removed a left-over debug statement  (Melan, -3/+3)
2022-10-12  Save a csv containing the loss while training  (Melan, -1/+16)
2022-10-12  train: change filename processing to be more simple and configurable  (AUTOMATIC, -24/+16)
            train: make it possible to make text files with prompts
            train: rework scheduler so that there's less repeating code in textual inversion and hypernets
            train: move epochs setting to options
2022-10-12  change textual inversion tab to train  (AUTOMATIC, -1/+1)
            remake train interface to use tabs
2022-10-12  xy_grid: Find hypernetwork by closest name  (Milly, -0/+11)
2022-10-11  apply lr schedule to hypernets  (AUTOMATIC, -4/+15)
2022-10-11  prevent extra modules from being saved/loaded with hypernet  (AUTOMATIC, -1/+1)
2022-10-11  add an option to unload models during hypernetwork training to save VRAM  (AUTOMATIC, -7/+18)
2022-10-11  add option to select hypernetwork modules when creating  (AUTOMATIC, -2/+2)
2022-10-11  rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have  (AUTOMATIC, -0/+283)
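
Several of the training-related commits above are described only by their subject lines. The sketches below illustrate the general ideas behind a few of them; they use assumed names and are rough approximations, not the repository's actual code. First, the 2022-10-22 commit that adds a guard stopping hypernet training early when the weights receive no gradients:

# Illustrative sketch only (assumed helper name and structure), not the
# repository's implementation. After loss.backward(), confirm that the
# hypernetwork's trainable weights actually accumulated gradients and stop
# early if they did not, since the run would otherwise silently train nothing.
def assert_weights_receive_gradients(weights):
    total_grad = sum(
        w.grad.detach().abs().sum().item()
        for w in weights
        if w.grad is not None
    )
    if total_grad == 0:
        raise RuntimeError(
            "Hypernetwork weights are getting no gradients; stopping training early."
        )

# Hypothetical placement inside a training step:
#   loss.backward()
#   assert_weights_receive_gradients(hypernetwork_weights)
#   optimizer.step()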
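
Next, the 2022-10-19 "add options to custom hypernetwork layer structure" commit, together with the later additions of LayerNorm handling, activation checks, and dropout (2022-10-21 and 2022-10-22), amounts to building each hypernetwork module's MLP from a list of width multipliers. A sketch under assumed parameter names and defaults:

# Illustrative sketch only; function name, signature, and defaults are
# assumptions, not the actual API in hypernetwork.py.
import torch.nn as nn

def build_hypernetwork_layers(dim, layer_structure=(1, 2, 1),
                              activation=nn.ReLU, use_layer_norm=False,
                              dropout_p=0.0):
    # Each entry in layer_structure is a multiplier of the base embedding width.
    widths = [int(dim * m) for m in layer_structure]
    layers = []
    for i in range(len(widths) - 1):
        layers.append(nn.Linear(widths[i], widths[i + 1]))
        if i < len(widths) - 2:  # no activation/norm/dropout after the output layer
            layers.append(activation())
            if use_layer_norm:
                layers.append(nn.LayerNorm(widths[i + 1]))
            if dropout_p > 0:
                layers.append(nn.Dropout(dropout_p))
    return nn.Sequential(*layers)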
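
Finally, the 2022-10-15 "check NaN for hypernetwork tuning" commit suggests a loss-sanity check of roughly this shape (assumed names), aborting before a diverged run overwrites the saved hypernetwork:

# Illustrative sketch only, not the repository's code.
import torch

def check_loss_is_finite(loss, step):
    if not torch.isfinite(loss).all():
        raise RuntimeError(f"Loss is not finite at step {step}; aborting hypernetwork training.")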