path: root/modules/hypernetworks/hypernetwork.py
Commit message    Author    Age    Files    Lines
...
| | * added a guard for hypernet training that will stop early if weights are getting no gradients    AUTOMATIC    2022-10-22    1    -0/+11
| | |
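The guard above addresses a failure mode where the hypernetwork sits outside the autograd graph, so training runs forever without learning anything. A minimal sketch of such a check (the function name and call site are assumptions, not the repository's exact code):

```python
import torch

def assert_hypernet_gets_gradients(weights: list):
    """Call once after the first loss.backward(): if no weight received a
    gradient, the hypernet is detached from the loss and every further
    step would be wasted, so fail early instead."""
    if all(w.grad is None or torch.all(w.grad == 0) for w in weights):
        raise RuntimeError(
            "Hypernetwork weights are getting no gradients; aborting training."
        )
```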
| * | small fix    discus0434    2022-10-22    1    -7/+5
| | |
| * | add an option to avoid dying relu    discus0434    2022-10-22    1    -6/+6
| | |
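The dying-ReLU problem: a unit whose pre-activation is negative for every input outputs zero and receives zero gradient, so it can never recover. The usual remedy, sketched below with a hypothetical option flag, is to substitute a leaky variant:

```python
import torch.nn as nn

def make_activation(avoid_dying_relu: bool) -> nn.Module:
    # LeakyReLU keeps a small slope on the negative side, so units pushed
    # into the negative regime still receive gradient and can come back.
    return nn.LeakyReLU(negative_slope=0.01) if avoid_dying_relu else nn.ReLU()
```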
| * | Merge branch 'master' of upstream    discus0434    2022-10-22    1    -8/+18
| |\|
| | * Remove unused variable.    timntorres    2022-10-21    1    -1/+0
| | |
| | * Match hypernet name with filename in all cases.    timntorres    2022-10-21    1    -1/+7
| | |
| | * turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets    AUTOMATIC    2022-10-21    1    -2/+2
| | |
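The point of that commit: nn.LayerNorm is not a stateless op; it owns a learnable elementwise weight and bias. If trainable parameters are collected only from the Linear layers, the LayerNorm affine terms silently stay frozen. A sketch of gathering parameters from every sub-module (the layer sizes are illustrative):

```python
import torch.nn as nn

layers = nn.Sequential(
    nn.Linear(768, 1536),
    nn.LayerNorm(1536),     # has its own learnable .weight and .bias
    nn.Linear(1536, 768),
)

# Collect trainables from all sub-modules, not just the Linear ones, so the
# optimizer also updates the LayerNorm weight and bias.
trainables = [p for layer in layers for p in layer.parameters()]
for p in trainables:
    p.requires_grad_(True)
```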
| | * Merge branch 'master' into training-help-text    AUTOMATIC1111    2022-10-21    1    -40/+56
| | |\
| | | * Revise comments.    timntorres    2022-10-21    1    -1/+1
| | | |
| | | * Issue #2921-Give PNG info to Hypernet previews.    timntorres    2022-10-21    1    -2/+7
| | | |
| | | * a more strict check for activation type and a more reasonable check for type of layer in hypernets    AUTOMATIC    2022-10-21    1    -3/+9
| | | |
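A stricter check means an unknown activation name or an unexpected layer type fails loudly at load time instead of being silently ignored. A sketch of the idea (the whitelists are illustrative, not the exact set the code accepts):

```python
import torch.nn as nn

activation_dict = {"relu": nn.ReLU, "leakyrelu": nn.LeakyReLU, "elu": nn.ELU}

def get_activation(name: str) -> nn.Module:
    # Strict: refuse to guess when a saved network names an unknown activation.
    if name not in activation_dict:
        raise KeyError(f"unknown activation function: {name!r}")
    return activation_dict[name]()

def check_layer_type(layer: nn.Module) -> None:
    # Reasonable type check: only layer kinds a hypernet can contain pass.
    if not isinstance(layer, (nn.Linear, nn.LayerNorm, nn.Dropout)):
        raise TypeError(f"unexpected layer in hypernetwork: {type(layer).__name__}")
```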
| | * | change html output    DepFA    2022-10-19    1    -1/+1
| | | |
| * | | add dropout    discus0434    2022-10-22    1    -26/+42
| | | |
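Dropout here acts as regularization against the small training sets hypernets are usually trained on. One conventional placement, sketched with assumed parameter names: a Dropout after each hidden activation, and none after the output layer so inference is undistorted. For example, build_layers([768, 1536, 768]) yields Linear -> ReLU -> Dropout -> Linear.

```python
import torch.nn as nn

def build_layers(dims, activation=nn.ReLU, p_dropout=0.3) -> nn.Sequential:
    modules = []
    for i in range(len(dims) - 1):
        modules.append(nn.Linear(dims[i], dims[i + 1]))
        if i < len(dims) - 2:          # hidden layers only
            modules.append(activation())
            modules.append(nn.Dropout(p=p_dropout))
    return nn.Sequential(*modules)
```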
| * | | Revert "fix bugs and optimizations"aria1th2022-10-201-59/+46
| | | | | | | | | | | | | | | | This reverts commit 108be15500aac590b4e00420635d7b61fccfa530.
| * | | fix bugs and optimizationsAngelBottomless2022-10-201-46/+59
| | | |
| * | | only linearAngelBottomless2022-10-201-5/+5
| | | |
| * | | generalized some functions and option for ignoring first layerAngelBottomless2022-10-201-8/+15
| | |/ | |/|
| * | updatediscus04342022-10-201-10/+19
| | |
| * | fix for #3086 failing to load any previous hypernetdiscus04342022-10-191-32/+28
| |/
* | Removed two unused imports    Melan    2022-10-24    1    -1/+0
| |
* | Fixed some typos in the code    Melan    2022-10-20    1    -5/+5
| |
* | Some changes to the tensorboard code and hypernetwork support    Melan    2022-10-20    1    -1/+17
| |
* | fix for #3086 failing to load any previous hypernet    AUTOMATIC    2022-10-19    1    -32/+28
|/
* layer options moves into create hnet ui    discus0434    2022-10-19    1    -32/+32
|
* Merge branch 'master' into master    discus0434    2022-10-19    1    -3/+2
|\
| * Use training width/height when training hypernetworks.    Silent    2022-10-19    1    -2/+2
| |
* | update    discus0434    2022-10-18    1    -2/+2
| |
* | update    discus0434    2022-10-18    1    -6/+8
| |
* | add options to custom hypernetwork layer structure    discus0434    2022-10-18    1    -21/+67
|/
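The layer-structure option describes the hypernet MLP's widths as multipliers of the context dimension, e.g. "1, 2, 1" for dim -> 2*dim -> dim. A sketch of turning such a list into modules (the details are assumed, not the exact implementation):

```python
import torch.nn as nn

def build_from_structure(dim: int, layer_structure=(1, 2, 1)) -> nn.Sequential:
    # First and last multipliers stay 1 so the module maps a context vector
    # back to a tensor of the same width and can be applied residually.
    assert layer_structure[0] == 1 and layer_structure[-1] == 1
    linears = [
        nn.Linear(int(dim * layer_structure[i]), int(dim * layer_structure[i + 1]))
        for i in range(len(layer_structure) - 1)
    ]
    return nn.Sequential(*linears)
```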
* check NaN for hypernetwork tuning    AngelBottomless    2022-10-15    1    -4/+6
|
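Once the loss turns NaN, the next optimizer step poisons every weight, so detecting it immediately is much cheaper than noticing a ruined checkpoint later. A minimal sketch of such a guard:

```python
import torch

def check_loss(loss: torch.Tensor, step: int) -> torch.Tensor:
    # Abort (or the caller may choose to skip the step) as soon as the
    # loss stops being a number.
    if torch.isnan(loss).any():
        raise RuntimeError(f"loss became NaN at step {step}; stopping tuning")
    return loss
```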
* add option to use batch size for training    AUTOMATIC    2022-10-15    1    -9/+24
|
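With a batch size option, each optimizer step averages the loss over several examples instead of one, smoothing the loss curve at the cost of VRAM. A sketch of grouping dataset entries into batches (the `.latent` attribute is an assumption about the dataset entries, not the repository's actual field):

```python
import torch

def batches(entries, batch_size: int):
    # Stack `batch_size` consecutive latents into one tensor so a single
    # forward/backward pass covers the whole group.
    for i in range(0, len(entries), batch_size):
        chunk = entries[i:i + batch_size]
        yield torch.stack([e.latent for e in chunk], dim=0)  # .latent is hypothetical
```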
* remove duplicate code for log loss, add step, make it read from options rather than gradio input    AUTOMATIC    2022-10-14    1    -14/+6
|
* Merge remote-tracking branch 'Melanpan/master'    AUTOMATIC    2022-10-14    1    -0/+15
|\
| * Add learn_rate to csv and removed a left-over debug statement    Melan    2022-10-13    1    -3/+3
| |
| * Save a csv containing the loss while training    Melan    2022-10-12    1    -1/+16
| |
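The two commits above log training loss to a csv so progress can be plotted outside the UI. A sketch of an append-style writer (the column set is assumed from the commit messages; the follow-up commit adds learn_rate):

```python
import csv
import os

def write_loss_row(path: str, step: int, epoch: int, loss: float, learn_rate: float):
    # Write the header only when the file is created, so interrupted and
    # resumed runs keep appending to one csv.
    is_new = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["step", "epoch", "loss", "learn_rate"])
        writer.writerow([step, epoch, f"{loss:.7f}", learn_rate])
```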
* | add option to read generation params for learning previews from txt2img    AUTOMATIC    2022-10-14    1    -5/+16
| |
* | add hypernetwork multipliers    AUTOMATIC    2022-10-13    1    -1/+7
|/
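A hypernetwork multiplier scales how strongly the learned correction is blended into the original attention context, letting users dial the effect down without retraining. A sketch of the residual form this typically takes (the class shape and attribute names are assumptions):

```python
import torch.nn as nn

class HypernetworkModule(nn.Module):
    multiplier = 1.0  # e.g. set from a user-facing strength option

    def __init__(self, dim: int = 768):
        super().__init__()
        self.linear = nn.Sequential(nn.Linear(dim, dim * 2), nn.Linear(dim * 2, dim))

    def forward(self, x):
        # multiplier = 0 reproduces the untouched model; 1 applies the
        # full learned correction.
        return x + self.linear(x) * self.multiplier
```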
* train: change filename processing to be more simple and configurable    AUTOMATIC    2022-10-12    1    -24/+16
|     train: make it possible to make text files with prompts
|     train: rework scheduler so that there's less repeating code in textual inversion and hypernets
|     train: move epochs setting to options
* change textual inversion tab to train    AUTOMATIC    2022-10-12    1    -1/+1
|     remake train interface to use tabs
* xy_grid: Find hypernetwork by closest name    Milly    2022-10-12    1    -0/+11
|
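Matching by closest name lets an X/Y grid axis specify a hypernetwork loosely (e.g. a prefix) and still resolve to a real entry. One plausible rule, sketched here rather than the commit's exact logic: case-insensitive substring match, preferring the shortest candidate:

```python
def find_closest_hypernetwork_name(search: str, names) -> str | None:
    # "anime" would match both "Anime" and "Anime_v2" and return the shorter.
    search = search.lower()
    candidates = [name for name in names if search in name.lower()]
    return min(candidates, key=len) if candidates else None
```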
* apply lr schedule to hypernets    AUTOMATIC    2022-10-11    1    -4/+15
|
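A learn-rate schedule lets the rate decay in steps over training instead of staying fixed. A sketch assuming a "5e-5:100, 5e-6:1500"-style spec (rate until step, comma-separated; the syntax is an assumption here):

```python
def parse_schedule(spec: str):
    # "5e-5:100, 5e-6:1500" -> [(5e-5, 100), (5e-6, 1500)]; an entry with
    # no ":step" bound applies for the rest of training.
    pairs = []
    for chunk in spec.split(","):
        rate, _, end = chunk.strip().partition(":")
        pairs.append((float(rate), int(end) if end else None))
    return pairs

def learn_rate_at(schedule, step: int) -> float:
    for rate, end_step in schedule:
        if end_step is None or step <= end_step:
            return rate
    return schedule[-1][0]  # past the last bound: keep the final rate
```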
* prevent extra modules from being saved/loaded with hypernet    AUTOMATIC    2022-10-11    1    -1/+1
|
* add an option to unload models during hypernetwork training to save VRAM    AUTOMATIC    2022-10-11    1    -7/+18
|
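During hypernetwork training only part of the checkpoint has to stay resident on the GPU, so parking components that are not needed on every step in system RAM frees VRAM. A generic sketch of the mechanism (which components are safe to move depends on the training loop):

```python
import torch

def unload_to_cpu(module: torch.nn.Module):
    # Moving the module's weights to system RAM and releasing cached
    # allocations gives the freed VRAM back to the training step.
    module.to("cpu")
    torch.cuda.empty_cache()
```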
* add option to select hypernetwork modules when creating    AUTOMATIC    2022-10-11    1    -2/+2
|
* rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have    AUTOMATIC    2022-10-11    1    -0/+283