Age | Commit message | Author | Lines
2022-10-08 | add 'Ignore last layers of CLIP model' option as a parameter to the infotext | AUTOMATIC | -1/+5
2022-10-08 | make --force-enable-xformers work without needing --xformers | AUTOMATIC | -1/+1
2022-10-08 | Added ability to ignore last n layers in FrozenCLIPEmbedder | Fampai | -2/+10
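Note: the two CLIP commits above correspond to the technique commonly called "clip skip": the text encoder's final hidden state is replaced with the output of an earlier layer. A minimal sketch of the idea using Hugging Face transformers; the function and parameter names are illustrative, not taken from the repository:

```python
# Sketch of "ignore last n layers of CLIP" (clip skip).
# Assumes the Hugging Face transformers CLIP text encoder; names are illustrative.
import torch
from transformers import CLIPTokenizer, CLIPTextModel

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_model = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

def encode(prompt: str, stop_at_last_layers: int = 1) -> torch.Tensor:
    tokens = tokenizer(prompt, return_tensors="pt", padding="max_length",
                       max_length=77, truncation=True)
    out = text_model(**tokens, output_hidden_states=True)
    if stop_at_last_layers > 1:
        # Take the hidden state n layers before the end, then re-apply the
        # final layer norm, instead of using the last layer's output.
        h = out.hidden_states[-stop_at_last_layers]
        return text_model.text_model.final_layer_norm(h)
    return out.last_hidden_state
```

With stop_at_last_layers=1 this is the ordinary encoder output; 2 skips the last transformer layer, and so on.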
2022-10-08 | Update ui.py | DepFA | -1/+1
2022-10-08 | TI preprocess wording | DepFA | -3/+3
    I had to check the code to work out what splitting was 🤷🏿
2022-10-08 | Merge branch 'master' into dev/deepdanbooru | Greendayle | -6/+23
2022-10-08 | add --force-enable-xformers option and also add messages to console regarding cross attention optimizations | AUTOMATIC | -1/+6
2022-10-08 | add fallback for xformers_attnblock_forward | AUTOMATIC | -1/+4
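Note: a fallback of this kind typically wraps the xformers kernel in a try/except and reverts to the plain attention math when the kernel rejects the input. A sketch under that assumption (xformers.ops.memory_efficient_attention is the real xformers entry point; standard_attention is an illustrative stand-in for the non-xformers path):

```python
# Sketch: fall back from xformers attention to plain attention.
import torch

def standard_attention(q, k, v):
    # Plain scaled dot-product attention as the fallback path.
    scale = q.shape[-1] ** -0.5
    attn = (q @ k.transpose(-2, -1) * scale).softmax(dim=-1)
    return attn @ v

def attention_with_fallback(q, k, v):
    try:
        import xformers.ops
        # May raise NotImplementedError for unsupported dtypes/shapes.
        return xformers.ops.memory_efficient_attention(q, k, v)
    except (ImportError, NotImplementedError):
        return standard_attention(q, k, v)
```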
2022-10-08 | made deepdanbooru optional, added to readme, automatic download of deepbooru model | Greendayle | -23/+29
2022-10-08 | alternate prompt | Artem Zagidulin | -2/+7
2022-10-08 | Add GZipMiddleware to root demo | DepFA | -1/+5
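Note: GZipMiddleware here is the standard Starlette middleware re-exported by FastAPI; wiring it up looks roughly like this (the app and size threshold are illustrative):

```python
# Minimal sketch: compress responses above a size threshold with gzip.
from fastapi import FastAPI
from fastapi.middleware.gzip import GZipMiddleware

app = FastAPI()
# Only responses larger than minimum_size bytes are compressed.
app.add_middleware(GZipMiddleware, minimum_size=1000)
```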
2022-10-08 | check for ampere without destroying the optimizations. again. | C43H66N12O12S2 | -4/+3
2022-10-08 | check for ampere | C43H66N12O12S2 | -3/+4
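Note: Ampere detection is most naturally done via the CUDA compute capability, which is 8.x for Ampere GPUs; a sketch of that check (the repository's exact condition may differ):

```python
# Sketch: detect an Ampere-or-newer GPU via CUDA compute capability.
import torch

def is_ampere_or_newer() -> bool:
    if not torch.cuda.is_available():
        return False
    major, _minor = torch.cuda.get_device_capability(0)
    return major >= 8  # Ampere is compute capability 8.x
```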
2022-10-08 | check for 3.10 | C43H66N12O12S2 | -1/+1
2022-10-08 | Merge branch 'master' into dev/deepdanbooru | Greendayle | -1/+1
2022-10-08 | why did you do this | AUTOMATIC | -1/+1
2022-10-08 | fix conflicts | Greendayle | -57/+403
2022-10-08 | Fixed typo | Milly | -1/+1
2022-10-08 | restore old opt_split_attention/disable_opt_split_attention logic | AUTOMATIC | -1/+1
2022-10-08 | simplify xformers options: --xformers to enable and that's it | AUTOMATIC | -10/+16
2022-10-08 | emergency fix for xformers (continue + shared) | AUTOMATIC | -8/+8
2022-10-08 | Merge pull request #1851 from C43H66N12O12S2/flash | AUTOMATIC1111 | -6/+55
    xformers attention
2022-10-08 | Update sd_hijack.py | C43H66N12O12S2 | -1/+1
2022-10-08 | Update requirements_versions.txt | C43H66N12O12S2 | -0/+1
2022-10-08 | Update launch.py | C43H66N12O12S2 | -1/+1
2022-10-08 | update sd_hijack_opt to respect new env variables | C43H66N12O12S2 | -3/+8
2022-10-08 | add xformers_available shared variable | C43H66N12O12S2 | -1/+1
2022-10-08 | default to split attention if cuda is available and xformers is not | C43H66N12O12S2 | -2/+2
2022-10-08 | check for OS and env variable | C43H66N12O12S2 | -2/+7
2022-10-08 | fix bug where, when using prompt composition, hijack_comments generated before the final AND would be dropped | MrCheeze | -1/+5
2022-10-08 | Remove duplicate event listeners | guaneec | -0/+3
2022-10-08 | fix glob path in hypernetwork.py | ddPn08 | -1/+1
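Note: glob-path bugs of this kind are usually about hard-coded separators; the portable pattern builds the path with os.path.join so it works on both Windows and POSIX (the directory layout and extension here are illustrative):

```python
# Sketch: discover hypernetwork files portably.
import glob
import os

def list_hypernetworks(dirname: str) -> list[str]:
    # os.path.join picks the right separator; ** recurses into subfolders.
    return glob.glob(os.path.join(dirname, "**", "*.pt"), recursive=True)
```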
2022-10-08 | fix AND broken for long prompts | AUTOMATIC | -0/+9
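Note: AND here is the composable-diffusion prompt separator; each AND-delimited subprompt is denoised separately and the results are combined. A sketch of the parsing step only (the weight syntax shown is illustrative, not the repository's exact grammar):

```python
# Sketch: split a composable-diffusion prompt on AND, with optional weights.
import re

def split_composable_prompt(prompt: str) -> list[tuple[str, float]]:
    # "a cat AND a dog:0.5" -> [("a cat", 1.0), ("a dog", 0.5)]
    parts = []
    for chunk in prompt.split(" AND "):
        match = re.search(r"^(.*?)(?::([\d.]+))?$", chunk.strip())
        text, weight = match.group(1), match.group(2)
        parts.append((text, float(weight) if weight else 1.0))
    return parts
```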
2022-10-08 | fix bugs related to variable prompt lengths | AUTOMATIC | -12/+37
2022-10-08 | Update requirements.txt | C43H66N12O12S2 | -1/+0
2022-10-08 | install xformers | C43H66N12O12S2 | -0/+3
2022-10-08 | do not let user choose his own prompt token count limit | AUTOMATIC | -21/+13
2022-10-08 | check specifically for skipped | Trung Ngo | -7/+3
2022-10-08 | Add button to skip the current iteration | Trung Ngo | -8/+49
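Note: a skip button of this kind usually sets a shared flag that the generation loop polls between iterations; a minimal sketch of the pattern, with all names illustrative:

```python
# Sketch: cooperative skipping of the current job between iterations.
class State:
    skipped = False

    def skip(self):
        # Called from the UI thread when the skip button is pressed.
        self.skipped = True

state = State()

def run_jobs(jobs):
    results = []
    for job in jobs:
        if state.skipped:
            state.skipped = False  # consume the flag and move on
            continue
        results.append(job())
    return results
```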
2022-10-08 | Merge remote-tracking branch 'origin/master' | AUTOMATIC | -1/+5
2022-10-08 | let user choose his own prompt token count limit | AUTOMATIC | -8/+16
2022-10-08 | fix: handles when state_dict does not exist | leko | -1/+5
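Note: checkpoints saved by PyTorch Lightning nest the weights under a "state_dict" key, while bare torch.save files do not; handling both looks like this (the file path is illustrative):

```python
# Sketch: accept both Lightning-style and bare checkpoints.
import torch

def load_model_weights(path: str) -> dict:
    checkpoint = torch.load(path, map_location="cpu")
    # Fall back to the checkpoint itself when "state_dict" is absent.
    return checkpoint.get("state_dict", checkpoint)
```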
2022-10-08 | use new attnblock for xformers path | C43H66N12O12S2 | -1/+1
2022-10-08 | Update sd_hijack_optimizations.py | C43H66N12O12S2 | -1/+1
2022-10-08 | add xformers attnblock and hypernetwork support | C43H66N12O12S2 | -2/+18
2022-10-08 | add info about cross attention javascript shortcut code | AUTOMATIC | -1/+1
2022-10-08 | implement removal | DepFA | -3/+10
2022-10-08 | context menu styling | DepFA | -1/+28
2022-10-08 | Context Menus | DepFA | -0/+165
2022-10-08 | Add hypernetwork support to split cross attention v1 | brkirch | -5/+15
    * Add hypernetwork support to split_cross_attention_forward_v1
    * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
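Note: in cross attention, a hypernetwork (when loaded) transforms the context tensor before the key and value projections; a minimal sketch of that idea, with illustrative names:

```python
# Sketch: apply optional hypernetwork layers to the cross-attention
# context before projecting keys and values.
import torch

def apply_hypernetwork(context: torch.Tensor, hypernetwork_layers):
    if hypernetwork_layers is None:
        # No hypernetwork loaded: keys and values see the raw context.
        return context, context
    # One module transforms the key context, the other the value context.
    context_k = hypernetwork_layers[0](context)
    context_v = hypernetwork_layers[1](context)
    return context_k, context_v
```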