path: root/modules/sd_hijack.py
Commit message | Author | Age | Files | Lines
* fix for incorrect model weight loading for #814 | AUTOMATIC | 2022-09-29 | 1 | -0/+9
* new implementation for attention/emphasis | AUTOMATIC | 2022-09-29 | 1 | -4/+98
* switched the token counter to use hidden buttons instead of api call | Liam | 2022-09-27 | 1 | -2/+1
* added token counter next to txt2img and img2img prompts | Liam | 2022-09-27 | 1 | -8/+22
* potential fix for embeddings no loading on AMD cards | AUTOMATIC | 2022-09-25 | 1 | -2/+2
* Fix token max length | guaneec | 2022-09-25 | 1 | -1/+1
* --opt-split-attention now on by default for torch.cuda, off for others (cpu a... | AUTOMATIC | 2022-09-21 | 1 | -1/+1
* fix for too large embeddings causing an error | AUTOMATIC | 2022-09-20 | 1 | -1/+1
* fix a off by one error with embedding at the start of the sentence | AUTOMATIC | 2022-09-20 | 1 | -1/+1
* add the part that was missing for word textual inversion checksums | AUTOMATIC | 2022-09-20 | 1 | -1/+1
* Making opt split attention the default. Are you upset about this? Sorry. | AUTOMATIC | 2022-09-18 | 1 | -3/+3
* ..... | C43H66N12O12S2 | 2022-09-17 | 1 | -2/+2
* Move scale multiplication to the front | C43H66N12O12S2 | 2022-09-17 | 1 | -2/+2
* fix typo | C43H66N12O12S2 | 2022-09-15 | 1 | -1/+1
* pass dtype to torch.zeros as well | C43H66N12O12S2 | 2022-09-15 | 1 | -1/+1
* Complete cross attention update | C43H66N12O12S2 | 2022-09-13 | 1 | -1/+73
* Update cross attention to the newest version | C43H66N12O12S2 | 2022-09-12 | 1 | -3/+4
* added --opt-split-attention-v1 | AUTOMATIC | 2022-09-10 | 1 | -0/+33
* Update to cross attention from https://github.com/Doggettx/stable-diffusion #219 | AUTOMATIC | 2022-09-10 | 1 | -10/+37
* support for sd-concepts as alternatives for textual inversion #151 | AUTOMATIC | 2022-09-08 | 1 | -5/+15
* directly convert list to tensor | xeonvs | 2022-09-07 | 1 | -4/+1
* Added support for launching on Apple Silicon | xeonvs | 2022-09-07 | 1 | -1/+4
* re-integrated tiling option as a UI element | AUTOMATIC | 2022-09-05 | 1 | -0/+20
* add an option to enable tiling image generation | AUTOMATIC | 2022-09-04 | 1 | -0/+5
* add split attention layer optimization from https://github.com/basujindal/sta... | AUTOMATIC | 2022-09-04 | 1 | -1/+43
* split codebase into multiple files; to anyone this affects negatively: sorry | AUTOMATIC | 2022-09-03 | 1 | -0/+208