path: root/modules/sd_hijack_optimizations.py
Age        | Commit message | Author | Lines (-removed/+added)
2023-05-11 | Autofix Ruff W (not W605) (mostly whitespace) | Aarni Koskela | -16/+16
2023-05-10 | ruff auto fixes | AUTOMATIC | -7/+7
2023-05-10 | autofixes from ruff | AUTOMATIC | -1/+0
2023-05-08 | Fix for Unet NaNs | brkirch | -0/+3
2023-03-24 | Update sd_hijack_optimizations.py | FNSpd | -1/+1
2023-03-21 | Update sd_hijack_optimizations.py | FNSpd | -1/+1
2023-03-10 | sdp_attnblock_forward hijack | Pam | -0/+24
2023-03-10 | argument to disable memory efficient for sdp | Pam | -0/+4
2023-03-07 | scaled dot product attention | Pam | -0/+42
2023-01-25 | Add UI setting for upcasting attention to float32 | brkirch | -60/+99
2023-01-23 | better support for xformers flash attention on older versions of torch | AUTOMATIC | -24/+18
2023-01-21 | add --xformers-flash-attention option & impl | Takuma Mori | -2/+24
2023-01-21 | extra networks UI | AUTOMATIC | -5/+5
2023-01-06 | Added license | brkirch | -0/+1
2023-01-06 | Change sub-quad chunk threshold to use percentage | brkirch | -9/+9
2023-01-06 | Add Birch-san's sub-quadratic attention implementation | brkirch | -25/+99
2022-12-20 | Use other MPS optimization for large q.shape[0] * q.shape[1] | brkirch | -4/+6
2022-12-10 | cleanup some unneeded imports for hijack files | AUTOMATIC | -3/+0
2022-12-10 | do not replace entire unet for the resolution hack | AUTOMATIC | -28/+0
2022-11-23 | Patch UNet Forward to support resolutions that are not multiples of 64 | Billy Cao | -0/+31
2022-10-19 | Remove wrong self reference in CUDA support for invokeai | Cheka | -1/+1
2022-10-18 | Update sd_hijack_optimizations.py | C43H66N12O12S2 | -0/+3
2022-10-18 | readd xformers attnblock | C43H66N12O12S2 | -0/+15
2022-10-18 | delete xformers attnblock | C43H66N12O12S2 | -12/+0
2022-10-11 | Use apply_hypernetwork function | brkirch | -10/+4
2022-10-11 | Add InvokeAI and lstein to credits, add back CUDA support | brkirch | -0/+13
2022-10-11 | Add check for psutil | brkirch | -4/+15
2022-10-11 | Add cross-attention optimization from InvokeAI | brkirch | -0/+79
2022-10-11 | rename hypernetwork dir to hypernetworks to prevent clash with an old filenam... | AUTOMATIC | -1/+1
2022-10-11 | fixes related to merge | AUTOMATIC | -1/+2
2022-10-11 | replace duplicate code with a function | AUTOMATIC | -29/+15
2022-10-10 | remove functorch | C43H66N12O12S2 | -2/+0
2022-10-09 | Fix VRAM Issue by only loading in hypernetwork when selected in settings | Fampai | -3/+3
2022-10-08 | make --force-enable-xformers work without needing --xformers | AUTOMATIC | -1/+1
2022-10-08 | add fallback for xformers_attnblock_forward | AUTOMATIC | -1/+4
2022-10-08 | simplify xfrmers options: --xformers to enable and that's it | AUTOMATIC | -7/+13
2022-10-08 | emergency fix for xformers (continue + shared) | AUTOMATIC | -8/+8
2022-10-08 | Merge pull request #1851 from C43H66N12O12S2/flash | AUTOMATIC1111 | -1/+37
2022-10-08 | update sd_hijack_opt to respect new env variables | C43H66N12O12S2 | -3/+8
2022-10-08 | Update sd_hijack_optimizations.py | C43H66N12O12S2 | -1/+1
2022-10-08 | add xformers attnblock and hypernetwork support | C43H66N12O12S2 | -2/+18
2022-10-08 | Add hypernetwork support to split cross attention v1 | brkirch | -4/+14
2022-10-08 | switch to the proper way of calling xformers | C43H66N12O12S2 | -25/+3
2022-10-07 | added support for hypernetworks (???) | AUTOMATIC | -2/+15
2022-10-07 | add xformers attention | C43H66N12O12S2 | -1/+38
2022-10-02 | Merge branch 'master' into stable | Jairo Correa | -0/+156
2022-10-02 | initial support for training textual inversion | AUTOMATIC | -0/+164