path: root/modules
Age        | Commit message                                                                  | Author          | Lines
-----------|---------------------------------------------------------------------------------|-----------------|--------
2022-10-09 | Merge pull request #1752 from Greendayle/dev/deepdanbooru                       | AUTOMATIC1111   | -5/+98
2022-10-09 | Support `Download` for txt files.                                               | aoirusann       | -3/+41
2022-10-09 | Add `Download` & `Download as zip`                                              | aoirusann       | -5/+34
2022-10-09 | fixed incorrect message about loading config; thanks anon!                      | AUTOMATIC       | -1/+1
2022-10-09 | make main model loading and model merger use the same code                      | AUTOMATIC       | -8/+12
2022-10-08 | support loading .yaml config with same name as model                            | AUTOMATIC       | -8/+24
2022-10-08 | chore: Fix typos                                                                | Aidan Holland   | -15/+15
2022-10-08 | Break after finding the local directory of stable diffusion                     | Edouard Leurent | -0/+1
2022-10-08 | add 'Ignore last layers of CLIP model' option as a parameter to the infotext    | AUTOMATIC       | -1/+5
2022-10-08 | make --force-enable-xformers work without needing --xformers                    | AUTOMATIC       | -1/+1
2022-10-08 | Added ability to ignore last n layers in FrozenCLIPEmbedder                     | Fampai          | -2/+10
2022-10-08 | Update ui.py                                                                    | DepFA           | -1/+1
2022-10-08 | TI preprocess wording                                                           | DepFA           | -3/+3
2022-10-08 | Merge branch 'master' into dev/deepdanbooru                                     | Greendayle      | -4/+17
2022-10-08 | add --force-enable-xformers option and also add messages to console regarding...| AUTOMATIC       | -1/+6
2022-10-08 | add fallback for xformers_attnblock_forward                                     | AUTOMATIC       | -1/+4
2022-10-08 | made deepdanbooru optional, added to readme, automatic download of deepbooru ...| Greendayle      | -17/+23
2022-10-08 | alternate prompt                                                                | Artem Zagidulin | -2/+7
2022-10-08 | check for ampere without destroying the optimizations. again.                   | C43H66N12O12S2  | -4/+3
2022-10-08 | check for ampere                                                                | C43H66N12O12S2  | -3/+4
2022-10-08 | Merge branch 'master' into dev/deepdanbooru                                     | Greendayle      | -1/+1
2022-10-08 | why did you do this                                                             | AUTOMATIC       | -1/+1
2022-10-08 | fix conflicts                                                                   | Greendayle      | -46/+159
2022-10-08 | Fixed typo                                                                      | Milly           | -1/+1
2022-10-08 | restore old opt_split_attention/disable_opt_split_attention logic               | AUTOMATIC       | -1/+1
2022-10-08 | simplify xfrmers options: --xformers to enable and that's it                    | AUTOMATIC       | -9/+15
2022-10-08 | emergency fix for xformers (continue + shared)                                  | AUTOMATIC       | -8/+8
2022-10-08 | Merge pull request #1851 from C43H66N12O12S2/flash                              | AUTOMATIC1111   | -6/+45
2022-10-08 | Update sd_hijack.py                                                             | C43H66N12O12S2  | -1/+1
2022-10-08 | update sd_hijack_opt to respect new env variables                               | C43H66N12O12S2  | -3/+8
2022-10-08 | add xformers_available shared variable                                          | C43H66N12O12S2  | -1/+1
2022-10-08 | default to split attention if cuda is available and xformers is not             | C43H66N12O12S2  | -2/+2
2022-10-08 | fix bug where when using prompt composition, hijack_comments generated before...| MrCheeze        | -1/+5
2022-10-08 | fix glob path in hypernetwork.py                                                | ddPn08          | -1/+1
2022-10-08 | fix AND broken for long prompts                                                 | AUTOMATIC       | -0/+9
2022-10-08 | fix bugs related to variable prompt lengths                                     | AUTOMATIC       | -12/+37
2022-10-08 | do not let user choose his own prompt token count limit                         | AUTOMATIC       | -21/+12
2022-10-08 | check specifically for skipped                                                  | Trung Ngo       | -7/+3
2022-10-08 | Add button to skip the current iteration                                        | Trung Ngo       | -0/+21
2022-10-08 | Merge remote-tracking branch 'origin/master'                                    | AUTOMATIC       | -1/+5
2022-10-08 | let user choose his own prompt token count limit                                | AUTOMATIC       | -8/+16
2022-10-08 | fix: handles when state_dict does not exist                                     | leko            | -1/+5
2022-10-08 | use new attnblock for xformers path                                             | C43H66N12O12S2  | -1/+1
2022-10-08 | Update sd_hijack_optimizations.py                                               | C43H66N12O12S2  | -1/+1
2022-10-08 | add xformers attnblock and hypernetwork support                                 | C43H66N12O12S2  | -2/+18
2022-10-08 | Add hypernetwork support to split cross attention v1                            | brkirch         | -5/+15
2022-10-08 | delete broken and unnecessary aliases                                           | C43H66N12O12S2  | -6/+4
2022-10-08 | switch to the proper way of calling xformers                                    | C43H66N12O12S2  | -25/+3
2022-10-07 | linux test                                                                      | Greendayle      | -2/+3
2022-10-07 | even more powerfull fix                                                         | Greendayle      | -2/+7