path: root/modules/sd_hijack.py
Commit log (each entry: message, author, date, files changed, lines -/+)
* alternate implementation for unet forward replacement that does not depend on... (AUTOMATIC1111, 2023-12-02, 1 file, -2/+5)
* potential fix for #14172 (AUTOMATIC1111, 2023-12-02, 1 file, -8/+4)
* Fix bug where is_using_v_parameterization_for_sd2 fails because the sd_hijack... (MrCheeze, 2023-12-02, 1 file, -3/+6)
* fix exception related to the pix2pix (AUTOMATIC1111, 2023-11-06, 1 file, -0/+4)
* more changes for #13865: fix formatting, rename the function, add comment and... (AUTOMATIC1111, 2023-11-05, 1 file, -11/+13)
* linter (AUTOMATIC1111, 2023-11-05, 1 file, -1/+1)
* Merge branch 'dev' into master (AUTOMATIC1111, 2023-11-05, 1 file, -7/+14)
|\
| * Merge pull request #13364 from superhero-7/master (AUTOMATIC1111, 2023-10-14, 1 file, -2/+2)
| |\
| | * fix linter issues (superhero-7, 2023-10-01, 1 file, -2/+2)
| | * support altdiffusion-m18 (superhero-7, 2023-09-23, 1 file, -0/+2)
| | * support m18 (superhero-7, 2023-09-23, 1 file, -4/+2)
| * | initial work on sd_unet for SDXL (AUTOMATIC1111, 2023-09-11, 1 file, -5/+12)
| |/
* | Use devices.torch_gc() instead of empty_cache() (Ritesh Gangnani, 2023-11-05, 1 file, -4/+1)
* | Added memory clearance after deletion (Ritesh Gangnani, 2023-11-05, 1 file, -1/+5)
* | Add SSD-1B as a supported model (Ritesh Gangnani, 2023-11-05, 1 file, -0/+11)
|/
* implement undo hijack for SDXL (AUTOMATIC1111, 2023-08-19, 1 file, -1/+15)
* REMOVE (AUTOMATIC1111, 2023-08-08, 1 file, -3/+1)
* Merge branch 'dev' into multiple_loaded_models (AUTOMATIC1111, 2023-08-05, 1 file, -3/+3)
|\
| * resolve some of circular import issues for kohaku (AUTOMATIC1111, 2023-08-04, 1 file, -3/+3)
* | repair PLMS (AUTOMATIC1111, 2023-07-31, 1 file, -1/+3)
* | option to keep multiple models in memory (AUTOMATIC1111, 2023-07-31, 1 file, -2/+4)
|/
* textual inversion support for SDXL (AUTOMATIC1111, 2023-07-29, 1 file, -3/+5)
* Merge pull request #11878 from Bourne-M/patch-1 (AUTOMATIC1111, 2023-07-19, 1 file, -1/+1)
|\
| * [bug] reload altclip model error (yfzhou, 2023-07-19, 1 file, -1/+1)
* | Merge pull request #11757 from AUTOMATIC1111/sdxl (AUTOMATIC1111, 2023-07-16, 1 file, -0/+38)
|\ \
| * | initial SDXL refiner support (AUTOMATIC1111, 2023-07-14, 1 file, -5/+13)
| * | fix CLIP doing the unneeded normalization (AUTOMATIC1111, 2023-07-13, 1 file, -1/+1)
| * | SDXL support (AUTOMATIC1111, 2023-07-12, 1 file, -1/+22)
| * | getting SD2.1 to run on SDXL repo (AUTOMATIC1111, 2023-07-11, 1 file, -0/+9)
| |/
* / add textual inversion hashes to infotext (AUTOMATIC1111, 2023-07-15, 1 file, -1/+4)
|/
* revert default cross attention optimization to Doggettx (AUTOMATIC, 2023-06-01, 1 file, -0/+2)
* custom unet support (AUTOMATIC, 2023-05-27, 1 file, -6/+14)
* possible fix for empty list of optimizations #10605 (AUTOMATIC, 2023-05-23, 1 file, -6/+15)
* make it actually work after suggestions (AUTOMATIC, 2023-05-19, 1 file, -1/+1)
* fix linter issues (AUTOMATIC, 2023-05-18, 1 file, -1/+0)
* make it possible for scripts to add cross attention optimizations (AUTOMATIC, 2023-05-18, 1 file, -41/+49)
* fix model loading twice in some situations (AUTOMATIC, 2023-05-14, 1 file, -0/+3)
* Autofix Ruff W (not W605) (mostly whitespace) (Aarni Koskela, 2023-05-11, 1 file, -6/+6)
* ruff auto fixes (AUTOMATIC, 2023-05-10, 1 file, -1/+1)
* imports cleanup for ruff (AUTOMATIC, 2023-05-10, 1 file, -1/+1)
* autofixes from ruff (AUTOMATIC, 2023-05-10, 1 file, -2/+2)
* sdp_attnblock_forward hijack (Pam, 2023-03-10, 1 file, -0/+2)
* sdp refactoring (Pam, 2023-03-10, 1 file, -9/+10)
* argument to disable memory efficient for sdp (Pam, 2023-03-10, 1 file, -3/+8)
* scaled dot product attention (Pam, 2023-03-06, 1 file, -0/+4)
* Merge branch 'master' into weighted-learning (AUTOMATIC1111, 2023-02-19, 1 file, -0/+2)
|\
| * Apply hijacks in ddpm_edit for upcast sampling (brkirch, 2023-02-08, 1 file, -0/+3)
* | Hijack to add weighted_forward to model: return loss * weight map (Shondoit, 2023-02-15, 1 file, -0/+52)
|/
* Merge pull request #7309 from brkirch/fix-embeddings (AUTOMATIC1111, 2023-01-28, 1 file, -1/+1)
|\
| * Refactor conditional casting, fix upscalers (brkirch, 2023-01-28, 1 file, -1/+1)