Commit message | Author | Age | Files | Lines | |
---|---|---|---|---|---|
* | add vae path args | ssysm | 2022-10-10 | 2 | -2/+2 |
| | |||||
* | support loading .yaml config with same name as model | AUTOMATIC | 2022-10-08 | 2 | -8/+24 |
| | | | | support EMA weights in processing (????) | ||||
* | chore: Fix typos | Aidan Holland | 2022-10-08 | 8 | -15/+15 |
| | |||||
* | Break after finding the local directory of stable diffusion | Edouard Leurent | 2022-10-08 | 1 | -0/+1 |
| | | | | Otherwise, we may override it with one of the next two paths (. or ..) if it is present there, and then the local paths of other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../. Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085 | ||||
* | add 'Ignore last layers of CLIP model' option as a parameter to the infotext | AUTOMATIC | 2022-10-08 | 1 | -1/+5 |
| | |||||
* | make --force-enable-xformers work without needing --xformers | AUTOMATIC | 2022-10-08 | 1 | -1/+1 |
| | |||||
* | Added ability to ignore last n layers in FrozenCLIPEmbedder | Fampai | 2022-10-08 | 2 | -2/+10 |
| | |||||
* | Update ui.py | DepFA | 2022-10-08 | 1 | -1/+1 |
| | |||||
* | TI preprocess wording | DepFA | 2022-10-08 | 1 | -3/+3 |
| | | | I had to check the code to work out what splitting was 🤷🏿 | ||||
* | add --force-enable-xformers option and also add messages to console ↵ | AUTOMATIC | 2022-10-08 | 2 | -1/+6 |
| | | | | regarding cross attention optimizations | ||||
* | add fallback for xformers_attnblock_forward | AUTOMATIC | 2022-10-08 | 1 | -1/+4 |
| | |||||
* | alternate prompt | Artem Zagidulin | 2022-10-08 | 1 | -2/+7 |
| | |||||
* | check for ampere without destroying the optimizations. again. | C43H66N12O12S2 | 2022-10-08 | 1 | -4/+3 |
| | |||||
* | check for ampere | C43H66N12O12S2 | 2022-10-08 | 1 | -3/+4 |
| | |||||
* | why did you do this | AUTOMATIC | 2022-10-08 | 1 | -1/+1 |
| | |||||
* | Fixed typo | Milly | 2022-10-08 | 1 | -1/+1 |
| | |||||
* | restore old opt_split_attention/disable_opt_split_attention logic | AUTOMATIC | 2022-10-08 | 1 | -1/+1 |
| | |||||
* | simplify xformers options: --xformers to enable and that's it | AUTOMATIC | 2022-10-08 | 3 | -9/+15 |
| | |||||
* | emergency fix for xformers (continue + shared) | AUTOMATIC | 2022-10-08 | 1 | -8/+8 |
| | |||||
* | Merge pull request #1851 from C43H66N12O12S2/flash | AUTOMATIC1111 | 2022-10-08 | 3 | -6/+45 |
|\ | | | | | xformers attention | ||||
| * | Update sd_hijack.py | C43H66N12O12S2 | 2022-10-08 | 1 | -1/+1 |
| | | |||||
| * | update sd_hijack_opt to respect new env variables | C43H66N12O12S2 | 2022-10-08 | 1 | -3/+8 |
| | | |||||
| * | add xformers_available shared variable | C43H66N12O12S2 | 2022-10-08 | 1 | -1/+1 |
| | | |||||
| * | default to split attention if cuda is available and xformers is not | C43H66N12O12S2 | 2022-10-08 | 1 | -2/+2 |
| | | |||||
| * | use new attnblock for xformers path | C43H66N12O12S2 | 2022-10-08 | 1 | -1/+1 |
| | | |||||
| * | Update sd_hijack_optimizations.py | C43H66N12O12S2 | 2022-10-08 | 1 | -1/+1 |
| | | |||||
| * | add xformers attnblock and hypernetwork support | C43H66N12O12S2 | 2022-10-08 | 1 | -2/+18 |
| | | |||||
| * | delete broken and unnecessary aliases | C43H66N12O12S2 | 2022-10-08 | 1 | -6/+4 |
| | | |||||
| * | switch to the proper way of calling xformers | C43H66N12O12S2 | 2022-10-08 | 1 | -25/+3 |
| | | |||||
| * | Update sd_hijack.py | C43H66N12O12S2 | 2022-10-07 | 1 | -1/+1 |
| | | |||||
| * | Update sd_hijack.py | C43H66N12O12S2 | 2022-10-07 | 1 | -2/+2 |
| | | |||||
| * | Update sd_hijack.py | C43H66N12O12S2 | 2022-10-07 | 1 | -2/+1 |
| | | |||||
| * | Update shared.py | C43H66N12O12S2 | 2022-10-07 | 1 | -0/+1 |
| | | |||||
| * | Update sd_hijack.py | C43H66N12O12S2 | 2022-10-07 | 1 | -4/+9 |
| | | |||||
| * | add xformers attention | C43H66N12O12S2 | 2022-10-07 | 1 | -1/+38 |
| | | |||||
* | | fix bug where, when using prompt composition, hijack_comments generated ↵ | MrCheeze | 2022-10-08 | 2 | -1/+5 |
| | | | | | | | | before the final AND will be dropped | ||||
* | | fix glob path in hypernetwork.py | ddPn08 | 2022-10-08 | 1 | -1/+1 |
| | | |||||
* | | fix AND broken for long prompts | AUTOMATIC | 2022-10-08 | 1 | -0/+9 |
| | | |||||
* | | fix bugs related to variable prompt lengths | AUTOMATIC | 2022-10-08 | 2 | -12/+37 |
| | | |||||
* | | do not let user choose his own prompt token count limit | AUTOMATIC | 2022-10-08 | 3 | -21/+12 |
| | | |||||
* | | check specifically for skipped | Trung Ngo | 2022-10-08 | 4 | -7/+3 |
| | | |||||
* | | Add button to skip the current iteration | Trung Ngo | 2022-10-08 | 4 | -0/+21 |
| | | |||||
* | | Merge remote-tracking branch 'origin/master' | AUTOMATIC | 2022-10-08 | 1 | -1/+5 |
|\ \ | |||||
| * | | fix: handles when state_dict does not exist | leko | 2022-10-08 | 1 | -1/+5 |
| | | | |||||
* | | | let user choose his own prompt token count limit | AUTOMATIC | 2022-10-08 | 3 | -8/+16 |
|/ / | |||||
* | | Add hypernetwork support to split cross attention v1 | brkirch | 2022-10-08 | 2 | -5/+15 |
| | | | | | | | | | | * Add hypernetwork support to split_cross_attention_forward_v1 * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device | ||||
* | | make it possible to use hypernetworks without opt split attention | AUTOMATIC | 2022-10-07 | 2 | -10/+38 |
| | | |||||
* | | do not stop working on failed hypernetwork load | AUTOMATIC | 2022-10-07 | 1 | -2/+9 |
| | | |||||
* | | support loading VAE | AUTOMATIC | 2022-10-07 | 1 | -0/+8 |
| | | |||||
* | | added support for hypernetworks (???) | AUTOMATIC | 2022-10-07 | 3 | -3/+78 |
|/ |