path: root/modules/sd_hijack.py
Commit message  (Author, Date, Files, Lines)
* REMOVE  (AUTOMATIC1111, 2023-08-08, 1 file, -3/+1)
|
* Merge branch 'dev' into multiple_loaded_models  (AUTOMATIC1111, 2023-08-05, 1 file, -3/+3)
|\
| * resolve some of circular import issues for kohaku  (AUTOMATIC1111, 2023-08-04, 1 file, -3/+3)
| |
* | repair PLMS  (AUTOMATIC1111, 2023-07-31, 1 file, -1/+3)
| |
* | option to keep multiple models in memory  (AUTOMATIC1111, 2023-07-31, 1 file, -2/+4)
|/
* textual inversion support for SDXL  (AUTOMATIC1111, 2023-07-29, 1 file, -3/+5)
|
* Merge pull request #11878 from Bourne-M/patch-1  (AUTOMATIC1111, 2023-07-19, 1 file, -1/+1)
|\      [bug] reload altclip model error
| * [bug] reload altclip model error  (yfzhou, 2023-07-19, 1 file, -1/+1)
| |     When using BertSeriesModelWithTransformation as the cond_stage_model, the undo_hijack should be performed using the FrozenXLMREmbedderWithCustomWords type; otherwise, it will result in a failed model reload.
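The fix described above amounts to a one-line type check. A minimal before/after sketch of the relevant branch in undo_hijack, with the module names (xlmr, sd_hijack_xlmr) assumed from sd_hijack.py at the time:

    # Before: never matches, because hijack() had replaced cond_stage_model
    # with a FrozenXLMREmbedderWithCustomWords wrapper
    if type(m.cond_stage_model) == xlmr.BertSeriesModelWithTransformation:
        m.cond_stage_model = m.cond_stage_model.wrapped

    # After: matches the wrapper type that hijack() actually installed
    if type(m.cond_stage_model) == sd_hijack_xlmr.FrozenXLMREmbedderWithCustomWords:
        m.cond_stage_model = m.cond_stage_model.wrapped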
* | Merge pull request #11757 from AUTOMATIC1111/sdxl  (AUTOMATIC1111, 2023-07-16, 1 file, -0/+38)
|\ \      SD XL support
| * | initial SDXL refiner support  (AUTOMATIC1111, 2023-07-14, 1 file, -5/+13)
| | |
| * | fix CLIP doing the unneeded normalization  (AUTOMATIC1111, 2023-07-13, 1 file, -1/+1)
| | |     revert SD2.1 back to use the original repo
| | |     add SDXL's force_zero_embeddings to negative prompt
| * | SDXL support  (AUTOMATIC1111, 2023-07-12, 1 file, -1/+22)
| | |
| * | getting SD2.1 to run on SDXL repo  (AUTOMATIC1111, 2023-07-11, 1 file, -0/+9)
| |/
* / add textual inversion hashes to infotext  (AUTOMATIC1111, 2023-07-15, 1 file, -1/+4)
|/
* revert default cross attention optimization to Doggettx  (AUTOMATIC, 2023-06-01, 1 file, -0/+2)
|     make --disable-opt-split-attention command line option work again
* custom unet support  (AUTOMATIC, 2023-05-27, 1 file, -6/+14)
|
* possible fix for empty list of optimizations #10605  (AUTOMATIC, 2023-05-23, 1 file, -6/+15)
|
* make it actually work after suggestions  (AUTOMATIC, 2023-05-19, 1 file, -1/+1)
|
* fix linter issues  (AUTOMATIC, 2023-05-18, 1 file, -1/+0)
|
* make it possible for scripts to add cross attention optimizations  (AUTOMATIC, 2023-05-18, 1 file, -41/+49)
|     add UI selection for cross attention optimization
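For illustration, an extension script would register its own optimization roughly as below; the callback and base-class names follow modules/script_callbacks.py and modules/sd_hijack_optimizations.py as of this change, but treat the exact signatures as an assumption rather than a pinned API:

    from modules import script_callbacks
    from modules.sd_hijack_optimizations import SdOptimization

    class SdOptimizationExample(SdOptimization):
        name = "example"
        priority = 10  # consulted when the UI selection is left on "Automatic"

        def is_available(self):
            return True  # e.g. check for a required package or GPU capability

        def apply(self):
            # install the optimized attention forwards here, e.g. by assigning
            # to ldm.modules.attention.CrossAttention.forward
            pass

    def on_list(optimizers):
        optimizers.append(SdOptimizationExample())

    script_callbacks.on_list_optimizers(on_list)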
* fix model loading twice in some situations  (AUTOMATIC, 2023-05-14, 1 file, -0/+3)
|
* Autofix Ruff W (not W605) (mostly whitespace)  (Aarni Koskela, 2023-05-11, 1 file, -6/+6)
|
* ruff auto fixes  (AUTOMATIC, 2023-05-10, 1 file, -1/+1)
|
* imports cleanup for ruff  (AUTOMATIC, 2023-05-10, 1 file, -1/+1)
|
* autofixes from ruff  (AUTOMATIC, 2023-05-10, 1 file, -2/+2)
|
* sdp_attnblock_forward hijack  (Pam, 2023-03-10, 1 file, -0/+2)
|
* sdp refactoring  (Pam, 2023-03-10, 1 file, -9/+10)
|
* argument to disable memory efficient for sdp  (Pam, 2023-03-10, 1 file, -3/+8)
|
* scaled dot product attention  (Pam, 2023-03-06, 1 file, -0/+4)
|
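Pam's sdp commits above route attention through PyTorch 2.0's fused kernel. A self-contained sketch of the core call (torch.nn.functional.scaled_dot_product_attention is the real PyTorch API; the shapes and the backend toggle shown are illustrative):

    import torch
    import torch.nn.functional as F

    # q, k, v: (batch, heads, tokens, dim_per_head); the fused kernel replaces
    # the explicit softmax(q @ k^T / sqrt(d)) @ v matmuls that split-attention
    # style optimizations chunk by hand
    q = torch.randn(1, 8, 4096, 40)
    k = torch.randn(1, 8, 4096, 40)
    v = torch.randn(1, 8, 4096, 40)
    out = F.scaled_dot_product_attention(q, k, v, attn_mask=None, dropout_p=0.0)

    # "argument to disable memory efficient for sdp" corresponds to constraining
    # the backend selection, e.g. via the PyTorch 2.0-era context manager:
    with torch.backends.cuda.sdp_kernel(enable_mem_efficient=False):
        out = F.scaled_dot_product_attention(q, k, v)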
* Merge branch 'master' into weighted-learning  (AUTOMATIC1111, 2023-02-19, 1 file, -0/+2)
|\
| * Apply hijacks in ddpm_edit for upcast sampling  (brkirch, 2023-02-08, 1 file, -0/+3)
| |     To avoid import errors, ddpm_edit hijacks are done after an instruct pix2pix model is loaded.
* | Hijack to add weighted_forward to model: return loss * weight map  (Shondoit, 2023-02-15, 1 file, -0/+52)
|/
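Shondoit's weighted_forward hijack above boils down to temporarily swapping the model's loss function so the unreduced loss is multiplied by a per-pixel weight map before reduction. A simplified sketch of that idea (the actual commit attaches this to the LatentDiffusion instance; the wiring here is illustrative):

    def weighted_forward(model, x, c, w):
        original_get_loss = model.get_loss

        def weighted_get_loss(pred, target, mean=True):
            loss = original_get_loss(pred, target, mean=False) * w
            return loss.mean() if mean else loss

        try:
            model.get_loss = weighted_get_loss  # hijack the loss for this pass
            return model(x, c)                  # normal forward, weighted loss
        finally:
            model.get_loss = original_get_loss  # always restore the original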
* Merge pull request #7309 from brkirch/fix-embeddings  (AUTOMATIC1111, 2023-01-28, 1 file, -1/+1)
|\      Fix embeddings, upscalers, and refactor `--upcast-sampling`
| * Refactor conditional casting, fix upscalers  (brkirch, 2023-01-28, 1 file, -1/+1)
| |
| * Fix embeddings dtype mismatch  (brkirch, 2023-01-26, 1 file, -1/+1)
| |
* | automatically detect v-parameterization for SD2 checkpoints  (AUTOMATIC, 2023-01-28, 1 file, -0/+2)
|/
* write a comment for fix_checkpoint function  (AUTOMATIC, 2023-01-19, 1 file, -0/+7)
|
* add option to show/hide warnings  (AUTOMATIC, 2023-01-18, 1 file, -8/+0)
|     removed hiding warnings from LDSR
|     fixed/reworked few places that produced warnings
* make it possible for extensions/scripts to add their own embedding directories  (AUTOMATIC, 2023-01-08, 1 file, -3/+4)
|
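The hook above lets an extension point the textual inversion database at its own folder. A minimal usage sketch, with the call path (model_hijack.embedding_db.add_embedding_dir) assumed from the modules this commit touches:

    from modules import sd_hijack

    # register an additional directory to be scanned for embeddings
    sd_hijack.model_hijack.embedding_db.add_embedding_dir("/path/to/my/embeddings")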
* Merge pull request #6055 from brkirch/sub-quad_attn_opt  (AUTOMATIC1111, 2023-01-07, 1 file, -12/+9)
|\      Add Birch-san's sub-quadratic attention implementation
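Sub-quadratic attention avoids materializing the full (tokens x tokens) score matrix by processing attention in chunks. A simplified, query-chunked illustration of the idea (Birch-san's actual implementation also chunks keys/values and streams the softmax statistics; this sketch omits that):

    import torch

    def chunked_attention(q, k, v, chunk_size=1024):
        # q, k, v: (batch, heads, tokens, dim); only a (chunk_size x tokens)
        # score matrix exists at any one time
        scale = q.shape[-1] ** -0.5
        out = torch.empty_like(q)
        for i in range(0, q.shape[-2], chunk_size):
            scores = q[..., i:i + chunk_size, :] @ k.transpose(-2, -1) * scale
            out[..., i:i + chunk_size, :] = scores.softmax(dim=-1) @ v
        return out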
| * Allow Doggettx's cross attention opt without CUDA  (brkirch, 2023-01-06, 1 file, -1/+1)
| |
| * Merge remote-tracking branch 'upstream/master' into sub-quad_attn_opt  (brkirch, 2023-01-06, 1 file, -1/+11)
| |\
| * \ Merge branch 'AUTOMATIC1111:master' into sub-quad_attn_opt  (brkirch, 2023-01-06, 1 file, -6/+19)
| |\ \
| * | | Add Birch-san's sub-quadratic attention implementation  (brkirch, 2023-01-06, 1 file, -9/+6)
| | | |
* | | | CLIP hijack rework  (AUTOMATIC, 2023-01-06, 1 file, -3/+3)
| |_|/
|/| |
* | | add cross-attention info  (Vladimir Mandic, 2023-01-04, 1 file, -1/+11)
| |/
|/|
* | alt-diffusion integration  (AUTOMATIC, 2022-12-31, 1 file, -8/+10)
| |
* | Merge remote-tracking branch 'baai-open-internal/master' into alt-diffusion  (AUTOMATIC, 2022-12-31, 1 file, -6/+17)
|\ \
| |/
|/|
| * add hash and fix undo hijack bug  (zhaohu xing, 2022-12-06, 1 file, -1/+5)
| |     Signed-off-by: zhaohu xing <920232796@qq.com>
| * Merge pull request #3 from 920232796/master  (Zac Liu, 2022-12-06, 1 file, -1/+1)
| |\      fix device support for mps
| | |     update the support for SD2.0