path: root/modules
Commit message (Author, Date, Files changed, Lines -/+)
* make main model loading and model merger use the same code (AUTOMATIC, 2022-10-09, 2 files, -8/+12)
|
* support loading .yaml config with same name as model (AUTOMATIC, 2022-10-08, 2 files, -8/+24)
|   support EMA weights in processing (????)
* chore: Fix typos (Aidan Holland, 2022-10-08, 8 files, -15/+15)
|
* Break after finding the local directory of stable diffusion (Edouard Leurent, 2022-10-08, 1 file, -0/+1)
|   Otherwise, we may override it with one of the next two paths (. or ..) if it is present there, and then the local paths of other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../. Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
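The fix described in this entry amounts to stopping the directory search at the first match, so that later candidates such as "." or ".." cannot override the result. A minimal Python sketch of that pattern, illustrative only: the candidate list, the find_sd_path name, and the "ldm" marker folder are assumptions for the example, not the repository's actual code.

```python
import os

# Illustrative sketch of the "break after finding" fix described above; the
# candidate paths and the "ldm" marker folder are assumptions for the example.
def find_sd_path(candidates=("repositories/stable-diffusion", ".", "..")):
    sd_path = None
    for candidate in candidates:
        if os.path.exists(os.path.join(candidate, "ldm")):
            sd_path = candidate
            break  # stop at the first hit so "." or ".." cannot override it later
    return sd_path
```

Without the break, a later candidate that also happens to contain the marker would silently replace the correct path, which is the failure mode the linked issue reports.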
* add 'Ignore last layers of CLIP model' option as a parameter to the infotext (AUTOMATIC, 2022-10-08, 1 file, -1/+5)
|
* make --force-enable-xformers work without needing --xformers (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
|
* Added ability to ignore last n layers in FrozenCLIPEmbedder (Fampai, 2022-10-08, 2 files, -2/+10)
|
* Update ui.py (DepFA, 2022-10-08, 1 file, -1/+1)
|
* TI preprocess wording (DepFA, 2022-10-08, 1 file, -3/+3)
|   I had to check the code to work out what splitting was 🤷🏿
* add --force-enable-xformers option and also add messages to console regarding cross attention optimizations (AUTOMATIC, 2022-10-08, 2 files, -1/+6)
|
* add fallback for xformers_attnblock_forward (AUTOMATIC, 2022-10-08, 1 file, -1/+4)
|
* alternate prompt (Artem Zagidulin, 2022-10-08, 1 file, -2/+7)
|
* check for ampere without destroying the optimizations. again. (C43H66N12O12S2, 2022-10-08, 1 file, -4/+3)
|
* check for ampere (C43H66N12O12S2, 2022-10-08, 1 file, -3/+4)
|
* why did you do this (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
|
* Fixed typo (Milly, 2022-10-08, 1 file, -1/+1)
|
* restore old opt_split_attention/disable_opt_split_attention logic (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
|
* simplify xformers options: --xformers to enable and that's it (AUTOMATIC, 2022-10-08, 3 files, -9/+15)
|
* emergency fix for xformers (continue + shared) (AUTOMATIC, 2022-10-08, 1 file, -8/+8)
|
* Merge pull request #1851 from C43H66N12O12S2/flash (AUTOMATIC1111, 2022-10-08, 3 files, -6/+45)
|\
| |   xformers attention
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| |
| * update sd_hijack_opt to respect new env variables (C43H66N12O12S2, 2022-10-08, 1 file, -3/+8)
| |
| * add xformers_available shared variable (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| |
| * default to split attention if cuda is available and xformers is not (C43H66N12O12S2, 2022-10-08, 1 file, -2/+2)
| |
| * use new attnblock for xformers path (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| |
| * Update sd_hijack_optimizations.py (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| |
| * add xformers attnblock and hypernetwork support (C43H66N12O12S2, 2022-10-08, 1 file, -2/+18)
| |
| * delete broken and unnecessary aliases (C43H66N12O12S2, 2022-10-08, 1 file, -6/+4)
| |
| * switch to the proper way of calling xformers (C43H66N12O12S2, 2022-10-08, 1 file, -25/+3)
| |
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -1/+1)
| |
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -2/+2)
| |
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -2/+1)
| |
| * Update shared.py (C43H66N12O12S2, 2022-10-07, 1 file, -0/+1)
| |
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -4/+9)
| |
| * add xformers attention (C43H66N12O12S2, 2022-10-07, 1 file, -1/+38)
| |
* | fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped (MrCheeze, 2022-10-08, 2 files, -1/+5)
| |
* | fix glob path in hypernetwork.py (ddPn08, 2022-10-08, 1 file, -1/+1)
| |
* | fix AND broken for long prompts (AUTOMATIC, 2022-10-08, 1 file, -0/+9)
| |
* | fix bugs related to variable prompt lengths (AUTOMATIC, 2022-10-08, 2 files, -12/+37)
| |
* | do not let user choose his own prompt token count limit (AUTOMATIC, 2022-10-08, 3 files, -21/+12)
| |
* | check specifically for skipped (Trung Ngo, 2022-10-08, 4 files, -7/+3)
| |
* | Add button to skip the current iteration (Trung Ngo, 2022-10-08, 4 files, -0/+21)
| |
* | Merge remote-tracking branch 'origin/master' (AUTOMATIC, 2022-10-08, 1 file, -1/+5)
|\ \
| * | fix: handles when state_dict does not exist (leko, 2022-10-08, 1 file, -1/+5)
| | |
* | | let user choose his own prompt token count limit (AUTOMATIC, 2022-10-08, 3 files, -8/+16)
|/ /
* | Add hypernetwork support to split cross attention v1 (brkirch, 2022-10-08, 2 files, -5/+15)
| |   * Add hypernetwork support to split_cross_attention_forward_v1
| |   * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
* | make it possible to use hypernetworks without opt split attention (AUTOMATIC, 2022-10-07, 2 files, -10/+38)
| |
* | do not stop working on failed hypernetwork load (AUTOMATIC, 2022-10-07, 1 file, -2/+9)
| |
* | support loading VAE (AUTOMATIC, 2022-10-07, 1 file, -0/+8)
| |
* | added support for hypernetworks (???) (AUTOMATIC, 2022-10-07, 3 files, -3/+78)
|/