| Commit message | Author | Date | Files | Lines |
|---|---|---|---|---|
| alternate implementation for unet forward replacement that does not depend on hijack being applied | AUTOMATIC1111 | 2023-12-02 | 1 | -2/+5 |
| potential fix for #14172 | AUTOMATIC1111 | 2023-12-02 | 1 | -8/+4 |
| Fix bug where is_using_v_parameterization_for_sd2 fails because the sd_hijack is only partially undone | MrCheeze | 2023-12-02 | 1 | -3/+6 |
| fix exception related to the pix2pix | AUTOMATIC1111 | 2023-11-06 | 1 | -0/+4 |
| more changes for #13865: fix formatting, rename the function, add comment and add a readme entry | AUTOMATIC1111 | 2023-11-05 | 1 | -11/+13 |
| linter | AUTOMATIC1111 | 2023-11-05 | 1 | -1/+1 |
| Merge branch 'dev' into master | AUTOMATIC1111 | 2023-11-05 | 1 | -7/+14 |
| Merge pull request #13364 from superhero-7/master (add altdiffusion-m18 support) | AUTOMATIC1111 | 2023-10-14 | 1 | -2/+2 |
| fix linter issues | superhero-7 | 2023-10-01 | 1 | -2/+2 |
| support altdiffusion-m18 | superhero-7 | 2023-09-23 | 1 | -0/+2 |
| support m18 | superhero-7 | 2023-09-23 | 1 | -4/+2 |
| initial work on sd_unet for SDXL | AUTOMATIC1111 | 2023-09-11 | 1 | -5/+12 |
| Use devices.torch_gc() instead of empty_cache() | Ritesh Gangnani | 2023-11-05 | 1 | -4/+1 |
| Added memory clearance after deletion | Ritesh Gangnani | 2023-11-05 | 1 | -1/+5 |
| Add SSD-1B as a supported model | Ritesh Gangnani | 2023-11-05 | 1 | -0/+11 |
| implement undo hijack for SDXL | AUTOMATIC1111 | 2023-08-19 | 1 | -1/+15 |
| REMOVE | AUTOMATIC1111 | 2023-08-08 | 1 | -3/+1 |
| Merge branch 'dev' into multiple_loaded_models | AUTOMATIC1111 | 2023-08-05 | 1 | -3/+3 |
| resolve some of circular import issues for kohaku | AUTOMATIC1111 | 2023-08-04 | 1 | -3/+3 |
| repair PLMS | AUTOMATIC1111 | 2023-07-31 | 1 | -1/+3 |
| option to keep multiple models in memory | AUTOMATIC1111 | 2023-07-31 | 1 | -2/+4 |
| textual inversion support for SDXL | AUTOMATIC1111 | 2023-07-29 | 1 | -3/+5 |
| Merge pull request #11878 from Bourne-M/patch-1 ([bug] reload altclip model error) | AUTOMATIC1111 | 2023-07-19 | 1 | -1/+1 |
| [bug] reload altclip model error: when using BertSeriesModelWithTransformation as the cond_stage_model, undo_hijack should be performed using the FrozenXLMREmbedderWithCustomWords type; otherwise the model reload fails | yfzhou | 2023-07-19 | 1 | -1/+1 |
| Merge pull request #11757 from AUTOMATIC1111/sdxl (SD XL support) | AUTOMATIC1111 | 2023-07-16 | 1 | -0/+38 |
| initial SDXL refiner support | AUTOMATIC1111 | 2023-07-14 | 1 | -5/+13 |
| fix CLIP doing the unneeded normalization; revert SD2.1 back to use the original repo; add SDXL's force_zero_embeddings to negative prompt | AUTOMATIC1111 | 2023-07-13 | 1 | -1/+1 |
| SDXL support | AUTOMATIC1111 | 2023-07-12 | 1 | -1/+22 |
| getting SD2.1 to run on SDXL repo | AUTOMATIC1111 | 2023-07-11 | 1 | -0/+9 |
| add textual inversion hashes to infotext | AUTOMATIC1111 | 2023-07-15 | 1 | -1/+4 |
| revert default cross attention optimization to Doggettx; make --disable-opt-split-attention command line option work again | AUTOMATIC | 2023-06-01 | 1 | -0/+2 |
| custom unet support | AUTOMATIC | 2023-05-27 | 1 | -6/+14 |
| possible fix for empty list of optimizations #10605 | AUTOMATIC | 2023-05-23 | 1 | -6/+15 |
| make it actually work after suggestions | AUTOMATIC | 2023-05-19 | 1 | -1/+1 |
| fix linter issues | AUTOMATIC | 2023-05-18 | 1 | -1/+0 |
| make it possible for scripts to add cross attention optimizations; add UI selection for cross attention optimization | AUTOMATIC | 2023-05-18 | 1 | -41/+49 |
| fix model loading twice in some situations | AUTOMATIC | 2023-05-14 | 1 | -0/+3 |
| Autofix Ruff W (not W605) (mostly whitespace) | Aarni Koskela | 2023-05-11 | 1 | -6/+6 |
| ruff auto fixes | AUTOMATIC | 2023-05-10 | 1 | -1/+1 |
| imports cleanup for ruff | AUTOMATIC | 2023-05-10 | 1 | -1/+1 |
| autofixes from ruff | AUTOMATIC | 2023-05-10 | 1 | -2/+2 |
| sdp_attnblock_forward hijack | Pam | 2023-03-10 | 1 | -0/+2 |
| sdp refactoring | Pam | 2023-03-10 | 1 | -9/+10 |
| argument to disable memory efficient for sdp | Pam | 2023-03-10 | 1 | -3/+8 |
| scaled dot product attention | Pam | 2023-03-06 | 1 | -0/+4 |
| Merge branch 'master' into weighted-learning | AUTOMATIC1111 | 2023-02-19 | 1 | -0/+2 |
| Apply hijacks in ddpm_edit for upcast sampling: to avoid import errors, ddpm_edit hijacks are done after an instruct pix2pix model is loaded | brkirch | 2023-02-08 | 1 | -0/+3 |
| Hijack to add weighted_forward to model: return loss * weight map | Shondoit | 2023-02-15 | 1 | -0/+52 |
| Merge pull request #7309 from brkirch/fix-embeddings (Fix embeddings, upscalers, and refactor `--upcast-sampling`) | AUTOMATIC1111 | 2023-01-28 | 1 | -1/+1 |
| Refactor conditional casting, fix upscalers | brkirch | 2023-01-28 | 1 | -1/+1 |
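A few of the entries above describe their mechanism concretely enough to sketch. The "Use devices.torch_gc() instead of empty_cache()" commit, for example, swaps a bare CUDA cache flush for the webui's combined garbage-collection helper. Below is a minimal sketch of what such a helper typically does, assuming a CUDA device; the body is illustrative, not the webui's exact implementation.

```python
import gc

import torch


def torch_gc():
    """Free Python-side references first, then release cached GPU memory.

    Illustrative sketch of a devices.torch_gc()-style helper; the real webui
    helper also covers other backends (e.g. MPS), which is omitted here.
    """
    gc.collect()  # drop unreachable Python objects that still hold tensors

    if torch.cuda.is_available():
        torch.cuda.empty_cache()  # return cached allocator blocks to the driver
        torch.cuda.ipc_collect()  # release CUDA IPC handles from finished work
```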
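The "[bug] reload altclip model error" fix concerns which wrapper type undo_hijack tests for before unwrapping the text encoder. A minimal sketch of that dispatch, using the FrozenXLMREmbedderWithCustomWords and FrozenCLIPEmbedderWithCustomWords wrappers named in the commit; the import paths and surrounding structure here are assumptions, not the webui's full undo_hijack.

```python
from modules import sd_hijack_clip, sd_hijack_xlmr  # webui modules; import paths assumed


def undo_hijack(self, m):
    """Restore the original cond_stage_model that the hijack wrapped (sketch only)."""
    cond = m.cond_stage_model

    # AltDiffusion wraps its BertSeriesModelWithTransformation text encoder in
    # FrozenXLMREmbedderWithCustomWords, so that wrapper type is what must be
    # tested for here; checking for the inner Bert model instead means the
    # unwrap never runs and a later checkpoint reload fails.
    if isinstance(cond, (sd_hijack_xlmr.FrozenXLMREmbedderWithCustomWords,
                         sd_hijack_clip.FrozenCLIPEmbedderWithCustomWords)):
        m.cond_stage_model = cond.wrapped
```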
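The "make it possible for scripts to add cross attention optimizations" entry implies a registry that extensions can append to and that the UI selection reads from. The sketch below shows that general pattern under assumed names; it is not the webui's actual sd_hijack_optimizations API.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class CrossAttentionOptimization:
    """One selectable attention implementation (names are illustrative)."""
    name: str
    apply: Callable[[], None]                        # monkey-patches the attention forward
    is_available: Callable[[], bool] = lambda: True  # e.g. check torch version / platform
    priority: int = 0


optimizations: list[CrossAttentionOptimization] = []


def register(opt: CrossAttentionOptimization) -> None:
    """Built-in code and extension scripts both add their options here."""
    optimizations.append(opt)


def apply_optimization(selected_name: str | None = None) -> None:
    """Honor the UI selection if usable, else fall back to the best available option."""
    usable = [o for o in optimizations if o.is_available()]
    if not usable:
        return  # nothing registered or nothing available (cf. #10605 above)

    chosen = [o for o in usable if o.name == selected_name] or usable
    max(chosen, key=lambda o: o.priority).apply()
```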
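Pam's "scaled dot product attention" commits replace the manual attention math in the hijacked forward with PyTorch 2.0's fused torch.nn.functional.scaled_dot_product_attention. A minimal sketch of the core computation, assuming the (batch, tokens, heads * dim_head) layout used by CrossAttention; it is not the webui's exact hijack, which also handles dtype casting and the memory-efficient toggle mentioned above.

```python
import torch
import torch.nn.functional as F


def sdp_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor, heads: int) -> torch.Tensor:
    """Attention via the fused PyTorch 2.0 kernel (illustrative sketch)."""
    b, n_q, inner_dim = q.shape
    dim_head = inner_dim // heads

    # (batch, tokens, heads * dim_head) -> (batch, heads, tokens, dim_head)
    q, k, v = (t.view(b, -1, heads, dim_head).transpose(1, 2) for t in (q, k, v))

    # fused, memory-efficient replacement for softmax(q @ k^T / sqrt(d)) @ v
    out = F.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)

    # back to (batch, tokens, heads * dim_head)
    return out.transpose(1, 2).reshape(b, n_q, inner_dim)
```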
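Shondoit's "Hijack to add weighted_forward to model" entry attaches an extra method to the loaded model so training can scale the loss by a per-pixel weight map before reduction. A generic sketch of that idea, assuming an MSE-style objective; the real hijack wraps the diffusion model's own loss rather than computing one from scratch.

```python
import types

import torch
import torch.nn.functional as F


def _weighted_forward(self, x, target, weight_map):
    """Return loss * weight map, reduced to a scalar (illustrative sketch)."""
    pred = self(x)
    loss = F.mse_loss(pred, target, reduction="none")  # keep per-element losses
    return (loss * weight_map).mean()                  # weight, then reduce


def apply_weighted_forward(model: torch.nn.Module) -> None:
    """Monkey-patch the helper onto an existing instance as model.weighted_forward."""
    model.weighted_forward = types.MethodType(_weighted_forward, model)
```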