path: root/modules/sd_hijack.py
Commit message (author, date, files changed, lines -/+)
* alternate implementation for unet forward replacement that does not depend on hijack being applied (AUTOMATIC1111, 2023-12-02, 1 file, -2/+5)
|
* potential fix for #14172 (AUTOMATIC1111, 2023-12-02, 1 file, -8/+4)
|
* Fix bug where is_using_v_parameterization_for_sd2 fails because the sd_hijack is only partially undone (MrCheeze, 2023-12-02, 1 file, -3/+6)
|
* fix exception related to the pix2pix (AUTOMATIC1111, 2023-11-06, 1 file, -0/+4)
|
* more changes for #13865: fix formatting, rename the function, add comment and add a readme entry (AUTOMATIC1111, 2023-11-05, 1 file, -11/+13)
|
* linter (AUTOMATIC1111, 2023-11-05, 1 file, -1/+1)
|
* Merge branch 'dev' into master (AUTOMATIC1111, 2023-11-05, 1 file, -7/+14)
|\
| * Merge pull request #13364 from superhero-7/master (AUTOMATIC1111, 2023-10-14, 1 file, -2/+2)
| |\      Add altdiffusion-m18 support
| | * fix linter issues (superhero-7, 2023-10-01, 1 file, -2/+2)
| | |
| | * support altdiffusion-m18 (superhero-7, 2023-09-23, 1 file, -0/+2)
| | |
| | * support m18 (superhero-7, 2023-09-23, 1 file, -4/+2)
| | |
| * | initial work on sd_unet for SDXL (AUTOMATIC1111, 2023-09-11, 1 file, -5/+12)
| |/
* | Use devices.torch_gc() instead of empty_cache() (Ritesh Gangnani, 2023-11-05, 1 file, -4/+1)
| |
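The commit above swaps direct `empty_cache()` calls for a shared `devices.torch_gc()` helper. A minimal sketch of what such a helper typically does (the body here is illustrative, not the repository's exact code, and `torch` is treated as an optional dependency so the snippet runs even without a GPU):

```python
import gc

try:
    import torch
except ImportError:  # torch is optional in this sketch
    torch = None


def torch_gc():
    """Run Python garbage collection, then free cached GPU memory if available."""
    gc.collect()  # drop unreachable Python objects first so their tensors can be released
    if torch is not None and torch.cuda.is_available():
        torch.cuda.empty_cache()  # return cached allocator blocks to the driver
        torch.cuda.ipc_collect()  # clean up CUDA IPC memory handles
```

Centralizing cleanup in one helper means every caller also gets the `gc.collect()` pass, which a bare `torch.cuda.empty_cache()` call does not perform.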
* | Added memory clearance after deletion (Ritesh Gangnani, 2023-11-05, 1 file, -1/+5)
| |
* | Add SSD-1B as a supported model (Ritesh Gangnani, 2023-11-05, 1 file, -0/+11)
|/
* implement undo hijack for SDXL (AUTOMATIC1111, 2023-08-19, 1 file, -1/+15)
|
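The hijack/undo pattern that recurs through this log amounts to saving a reference to the original method before replacing it, then restoring it on undo. A hypothetical minimal sketch (class and method names are illustrative, not the repository's actual API):

```python
class ModelHijack:
    """Swap a model's forward method for a wrapped one, reversibly."""

    def __init__(self):
        self._original_forward = None

    def hijack(self, model):
        self._original_forward = model.forward  # save so undo can restore it
        original = model.forward

        def wrapped_forward(*args, **kwargs):
            # place for extra behavior (dtype casting, embeddings, logging, ...)
            return original(*args, **kwargs)

        model.forward = wrapped_forward

    def undo(self, model):
        if self._original_forward is not None:
            model.forward = self._original_forward  # restore the saved original
            self._original_forward = None


class DummyModel:
    def forward(self, x):
        return x * 2
```

If undo is skipped or only partially applied, later reloads see the wrapped method instead of the original, which is exactly the failure mode several fixes in this log address.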
* REMOVE (AUTOMATIC1111, 2023-08-08, 1 file, -3/+1)
|
* Merge branch 'dev' into multiple_loaded_models (AUTOMATIC1111, 2023-08-05, 1 file, -3/+3)
|\
| * resolve some of circular import issues for kohaku (AUTOMATIC1111, 2023-08-04, 1 file, -3/+3)
| |
* | repair PLMS (AUTOMATIC1111, 2023-07-31, 1 file, -1/+3)
| |
* | option to keep multiple models in memory (AUTOMATIC1111, 2023-07-31, 1 file, -2/+4)
|/
* textual inversion support for SDXL (AUTOMATIC1111, 2023-07-29, 1 file, -3/+5)
|
* Merge pull request #11878 from Bourne-M/patch-1 (AUTOMATIC1111, 2023-07-19, 1 file, -1/+1)
|\      [bug] reload altclip model error
| * [bug] reload altclip model error (yfzhou, 2023-07-19, 1 file, -1/+1)
| |     When using BertSeriesModelWithTransformation as the cond_stage_model, the undo_hijack should be performed using the FrozenXLMREmbedderWithCustomWords type; otherwise, it will result in a failed model reload.
* | Merge pull request #11757 from AUTOMATIC1111/sdxl (AUTOMATIC1111, 2023-07-16, 1 file, -0/+38)
|\ \    SD XL support
| * | initial SDXL refiner support (AUTOMATIC1111, 2023-07-14, 1 file, -5/+13)
| | |
| * | fix CLIP doing the unneeded normalization (AUTOMATIC1111, 2023-07-13, 1 file, -1/+1)
| | |   revert SD2.1 back to use the original repo; add SDXL's force_zero_embeddings to negative prompt
| * | SDXL support (AUTOMATIC1111, 2023-07-12, 1 file, -1/+22)
| | |
| * | getting SD2.1 to run on SDXL repo (AUTOMATIC1111, 2023-07-11, 1 file, -0/+9)
| |/
* / add textual inversion hashes to infotext (AUTOMATIC1111, 2023-07-15, 1 file, -1/+4)
|/
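Recording textual inversion hashes in infotext implies computing a short, stable identifier per embedding. One common approach is a truncated SHA-256 over the embedding's bytes; this sketch is an assumption for illustration (the function name and the 10-character length are not taken from the repository):

```python
import hashlib


def embedding_shorthash(data: bytes, length: int = 10) -> str:
    """Return a short hex digest identifying an embedding's file contents.

    Truncating a cryptographic hash keeps the identifier compact for infotext
    while still making accidental collisions between embeddings very unlikely.
    """
    return hashlib.sha256(data).hexdigest()[:length]
```

Such a short hash lets generated images record exactly which version of an embedding was used, so results remain reproducible even after an embedding file is retrained.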
* revert default cross attention optimization to Doggettx (AUTOMATIC, 2023-06-01, 1 file, -0/+2)
|       make --disable-opt-split-attention command line option work again
* custom unet support (AUTOMATIC, 2023-05-27, 1 file, -6/+14)
|
* possible fix for empty list of optimizations #10605 (AUTOMATIC, 2023-05-23, 1 file, -6/+15)
|
* make it actually work after suggestions (AUTOMATIC, 2023-05-19, 1 file, -1/+1)
|
* fix linter issues (AUTOMATIC, 2023-05-18, 1 file, -1/+0)
|
* make it possible for scripts to add cross attention optimizations (AUTOMATIC, 2023-05-18, 1 file, -41/+49)
|       add UI selection for cross attention optimization
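Letting scripts contribute cross attention optimizations suggests a registry of candidates plus a selection step (by explicit user choice, falling back to priority). A hypothetical sketch of that pattern; the class and field names here are illustrative, not the repository's actual API:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class Optimization:
    name: str
    priority: int = 0
    is_available: Callable[[], bool] = lambda: True  # e.g. check for xformers, SDP support


optimizations: List[Optimization] = []


def register(opt: Optimization) -> None:
    """Scripts call this to contribute an optimization to the shared registry."""
    optimizations.append(opt)


def select(preferred: str = "Automatic") -> Optional[Optimization]:
    """Pick the user's choice if it is registered and available, else the
    highest-priority available optimization."""
    candidates = [o for o in optimizations if o.is_available()]
    if not candidates:
        return None
    if preferred != "Automatic":
        for o in candidates:
            if o.name == preferred:
                return o
    return max(candidates, key=lambda o: o.priority)
```

A registry like this decouples the UI dropdown from the built-in list: any script that registers an entry automatically appears as a selectable option.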
* fix model loading twice in some situations (AUTOMATIC, 2023-05-14, 1 file, -0/+3)
|
* Autofix Ruff W (not W605) (mostly whitespace) (Aarni Koskela, 2023-05-11, 1 file, -6/+6)
|
* ruff auto fixes (AUTOMATIC, 2023-05-10, 1 file, -1/+1)
|
* imports cleanup for ruff (AUTOMATIC, 2023-05-10, 1 file, -1/+1)
|
* autofixes from ruff (AUTOMATIC, 2023-05-10, 1 file, -2/+2)
|
* sdp_attnblock_forward hijack (Pam, 2023-03-10, 1 file, -0/+2)
|
* sdp refactoring (Pam, 2023-03-10, 1 file, -9/+10)
|
* argument to disable memory efficient for sdp (Pam, 2023-03-10, 1 file, -3/+8)
|
* scaled dot product attention (Pam, 2023-03-06, 1 file, -0/+4)
|
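Scaled dot product attention computes softmax(QK^T / sqrt(d)) V. A dependency-free sketch of the math for single-head 2-D inputs, just to show what the optimization computes; the real code would dispatch to an optimized kernel such as PyTorch's `torch.nn.functional.scaled_dot_product_attention` rather than loop in Python:

```python
import math


def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]  # subtract the max for numerical stability
    s = sum(exps)
    return [e / s for e in exps]


def sdp_attention(q, k, v):
    """q, k, v: lists of row vectors. Returns softmax(q @ k.T / sqrt(d)) @ v."""
    d = len(q[0])
    out = []
    for qi in q:
        # similarity of this query to every key, scaled by sqrt(head dimension)
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        weights = softmax(scores)
        # weighted average of the value rows
        out.append([sum(w * vj[c] for w, vj in zip(weights, v)) for c in range(len(v[0]))])
    return out
```

Memory-efficient SDP kernels (the thing the disable flag two entries up controls) compute the same result without materializing the full attention weight matrix at once.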
* Merge branch 'master' into weighted-learning (AUTOMATIC1111, 2023-02-19, 1 file, -0/+2)
|\
| * Apply hijacks in ddpm_edit for upcast sampling (brkirch, 2023-02-08, 1 file, -0/+3)
| |     To avoid import errors, ddpm_edit hijacks are done after an instruct pix2pix model is loaded.
* | Hijack to add weighted_forward to model: return loss * weight map (Shondoit, 2023-02-15, 1 file, -0/+52)
|/
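The weighted-learning hijack above attaches a `weighted_forward` that scales the per-element loss by a weight map before reduction. A toy sketch of the idea in plain Python; the actual commit patches the Stable Diffusion model's loss, and every name here is illustrative:

```python
class ToyModel:
    def forward(self, pred, target):
        # per-element squared error, before any reduction
        return [(p - t) ** 2 for p, t in zip(pred, target)]


def apply_weighted_forward(model):
    """Hijack: attach a weighted_forward that multiplies loss by a weight map."""

    def weighted_forward(pred, target, weights):
        loss = model.forward(pred, target)
        weighted = [l * w for l, w in zip(loss, weights)]  # loss * weight map
        return sum(weighted) / len(weighted)  # reduce to a scalar afterwards

    model.weighted_forward = weighted_forward
```

Scaling before reduction is the point: regions with weight 0 contribute nothing to the training signal, while high-weight regions dominate it, which is what lets weighted learning emphasize parts of an image.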
* Merge pull request #7309 from brkirch/fix-embeddings (AUTOMATIC1111, 2023-01-28, 1 file, -1/+1)
|\      Fix embeddings, upscalers, and refactor `--upcast-sampling`
| * Refactor conditional casting, fix upscalers (brkirch, 2023-01-28, 1 file, -1/+1)
| |