path: root/modules/hypernetworks
Commit message  (Author, Date, Files, Lines -/+)
* Update hypernetwork.py  (DepFA, 2022-10-23, 1 file, -4/+7)
* Allow tracking real-time loss  (AngelBottomless, 2022-10-22, 1 file, -1/+1)
* Update hypernetwork.py  (AngelBottomless, 2022-10-22, 1 file, -11/+44)
* small fix  (discus0434, 2022-10-22, 1 file, -4/+3)
* Merge branch 'AUTOMATIC1111:master' into master  (discus0434, 2022-10-22, 1 file, -0/+11)
|\
| * added a guard for hypernet training that will stop early if weights are getti...  (AUTOMATIC, 2022-10-22, 1 file, -0/+11)
* | small fix  (discus0434, 2022-10-22, 1 file, -7/+5)
* | add an option to avoid dying relu  (discus0434, 2022-10-22, 1 file, -6/+6)
* | Merge branch 'master' of upstream  (discus0434, 2022-10-22, 2 files, -10/+24)
|\|
| * Remove unused variable.  (timntorres, 2022-10-21, 1 file, -1/+0)
| * Match hypernet name with filename in all cases.  (timntorres, 2022-10-21, 1 file, -1/+7)
| * Sanitize hypernet name input.  (timntorres, 2022-10-21, 1 file, -0/+3)
| * turns out LayerNorm also has weight and bias and needs to be pre-multiplied a...  (AUTOMATIC, 2022-10-21, 1 file, -2/+2)
| * Merge branch 'master' into training-help-text  (AUTOMATIC1111, 2022-10-21, 2 files, -42/+59)
| |\
| | * Revise comments.  (timntorres, 2022-10-21, 1 file, -1/+1)
| | * Issue #2921-Give PNG info to Hypernet previews.  (timntorres, 2022-10-21, 1 file, -2/+7)
| | * a more strict check for activation type and a more reasonable check for type ...  (AUTOMATIC, 2022-10-21, 1 file, -3/+9)
| * | allow overwrite old hn  (DepFA, 2022-10-19, 1 file, -2/+3)
| * | change html output  (DepFA, 2022-10-19, 1 file, -1/+1)
* | | add dropout  (discus0434, 2022-10-22, 2 files, -31/+47)
* | | Revert "fix bugs and optimizations"  (aria1th, 2022-10-20, 1 file, -59/+46)
* | | fix bugs and optimizations  (AngelBottomless, 2022-10-20, 1 file, -46/+59)
* | | only linear  (AngelBottomless, 2022-10-20, 1 file, -5/+5)
* | | generalized some functions and option for ignoring first layer  (AngelBottomless, 2022-10-20, 1 file, -8/+15)
| |/
|/|
* | Merge branch 'AUTOMATIC1111:master' into master  (discus0434, 2022-10-20, 1 file, -1/+1)
|\ \
| * | allow float sizes for hypernet's layer_structure  (AUTOMATIC, 2022-10-20, 1 file, -1/+1)
| * | fix for #3086 failing to load any previous hypernet  (AUTOMATIC, 2022-10-19, 1 file, -32/+28)
| |/
* | update  (discus0434, 2022-10-20, 2 files, -11/+21)
* | fix for #3086 failing to load any previous hypernet  (discus0434, 2022-10-19, 1 file, -32/+28)
|/
* enable to write layer structure of hn himself  (discus0434, 2022-10-19, 1 file, -0/+4)
* layer options moves into create hnet ui  (discus0434, 2022-10-19, 2 files, -34/+39)
* Merge branch 'master' into master  (discus0434, 2022-10-19, 1 file, -3/+2)
|\
| * Use training width/height when training hypernetworks.  (Silent, 2022-10-19, 1 file, -2/+2)
* | update  (discus0434, 2022-10-18, 1 file, -2/+2)
* | update  (discus0434, 2022-10-18, 1 file, -6/+8)
* | add options to custom hypernetwork layer structure  (discus0434, 2022-10-18, 1 file, -21/+67)
|/
* check NaN for hypernetwork tuning  (AngelBottomless, 2022-10-15, 1 file, -4/+6)
* add option to use batch size for training  (AUTOMATIC, 2022-10-15, 1 file, -9/+24)
* remove duplicate code for log loss, add step, make it read from options rathe...  (AUTOMATIC, 2022-10-14, 1 file, -14/+6)
* Merge remote-tracking branch 'Melanpan/master'  (AUTOMATIC, 2022-10-14, 1 file, -0/+15)
|\
| * Add learn_rate to csv and removed a left-over debug statement  (Melan, 2022-10-13, 1 file, -3/+3)
| * Save a csv containing the loss while training  (Melan, 2022-10-12, 1 file, -1/+16)
* | add option to read generation params for learning previews from txt2img  (AUTOMATIC, 2022-10-14, 1 file, -5/+16)
* | add hypernetwork multipliers  (AUTOMATIC, 2022-10-13, 1 file, -1/+7)
|/
* train: change filename processing to be more simple and configurable  (AUTOMATIC, 2022-10-12, 1 file, -24/+16)
* change textual inversion tab to train  (AUTOMATIC, 2022-10-12, 1 file, -1/+1)
* xy_grid: Find hypernetwork by closest name  (Milly, 2022-10-12, 1 file, -0/+11)
* reports that training with medvram is possible.  (AUTOMATIC, 2022-10-11, 1 file, -1/+1)
* apply lr schedule to hypernets  (AUTOMATIC, 2022-10-11, 1 file, -4/+15)
* prevent extra modules from being saved/loaded with hypernet  (AUTOMATIC, 2022-10-11, 1 file, -1/+1)