path: root/modules
Commit message (author, date, files changed, lines removed/added):

* Added ability to ignore last n layers in FrozenCLIPEmbedder (Fampai, 2022-10-08, 2 files, -2/+10)
* Update ui.py (DepFA, 2022-10-08, 1 file, -1/+1)
* TI preprocess wording (DepFA, 2022-10-08, 1 file, -3/+3)
* add --force-enable-xformers option and also add messages to console regarding... (AUTOMATIC, 2022-10-08, 2 files, -1/+6)
* add fallback for xformers_attnblock_forward (AUTOMATIC, 2022-10-08, 1 file, -1/+4)
* alternate prompt (Artem Zagidulin, 2022-10-08, 1 file, -2/+7)
* check for ampere without destroying the optimizations. again. (C43H66N12O12S2, 2022-10-08, 1 file, -4/+3)
* check for ampere (C43H66N12O12S2, 2022-10-08, 1 file, -3/+4)
* why did you do this (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
* Fixed typo (Milly, 2022-10-08, 1 file, -1/+1)
* restore old opt_split_attention/disable_opt_split_attention logic (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
* simplify xfrmers options: --xformers to enable and that's it (AUTOMATIC, 2022-10-08, 3 files, -9/+15)
* emergency fix for xformers (continue + shared) (AUTOMATIC, 2022-10-08, 1 file, -8/+8)
* Merge pull request #1851 from C43H66N12O12S2/flash (AUTOMATIC1111, 2022-10-08, 3 files, -6/+45)
|\
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| * update sd_hijack_opt to respect new env variables (C43H66N12O12S2, 2022-10-08, 1 file, -3/+8)
| * add xformers_available shared variable (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| * default to split attention if cuda is available and xformers is not (C43H66N12O12S2, 2022-10-08, 1 file, -2/+2)
| * use new attnblock for xformers path (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| * Update sd_hijack_optimizations.py (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| * add xformers attnblock and hypernetwork support (C43H66N12O12S2, 2022-10-08, 1 file, -2/+18)
| * delete broken and unnecessary aliases (C43H66N12O12S2, 2022-10-08, 1 file, -6/+4)
| * switch to the proper way of calling xformers (C43H66N12O12S2, 2022-10-08, 1 file, -25/+3)
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -1/+1)
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -2/+2)
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -2/+1)
| * Update shared.py (C43H66N12O12S2, 2022-10-07, 1 file, -0/+1)
| * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -4/+9)
| * add xformers attention (C43H66N12O12S2, 2022-10-07, 1 file, -1/+38)
* | fix bug where when using prompt composition, hijack_comments generated before... (MrCheeze, 2022-10-08, 2 files, -1/+5)
* | fix glob path in hypernetwork.py (ddPn08, 2022-10-08, 1 file, -1/+1)
* | fix AND broken for long prompts (AUTOMATIC, 2022-10-08, 1 file, -0/+9)
* | fix bugs related to variable prompt lengths (AUTOMATIC, 2022-10-08, 2 files, -12/+37)
* | do not let user choose his own prompt token count limit (AUTOMATIC, 2022-10-08, 3 files, -21/+12)
* | check specifically for skipped (Trung Ngo, 2022-10-08, 4 files, -7/+3)
* | Add button to skip the current iteration (Trung Ngo, 2022-10-08, 4 files, -0/+21)
* | Merge remote-tracking branch 'origin/master' (AUTOMATIC, 2022-10-08, 1 file, -1/+5)
|\ \
| * | fix: handles when state_dict does not exist (leko, 2022-10-08, 1 file, -1/+5)
* | | let user choose his own prompt token count limit (AUTOMATIC, 2022-10-08, 3 files, -8/+16)
|/ /
* | Add hypernetwork support to split cross attention v1 (brkirch, 2022-10-08, 2 files, -5/+15)
* | make it possible to use hypernetworks without opt split attention (AUTOMATIC, 2022-10-07, 2 files, -10/+38)
* | do not stop working on failed hypernetwork load (AUTOMATIC, 2022-10-07, 1 file, -2/+9)
* | support loading VAE (AUTOMATIC, 2022-10-07, 1 file, -0/+8)
* | added support for hypernetworks (???) (AUTOMATIC, 2022-10-07, 3 files, -3/+78)
|/
* karras samplers for img2img? (AUTOMATIC, 2022-10-06, 1 file, -2/+4)
* Prefer using `Processed.sd_model_hash` attribute when filename pattern (Milly, 2022-10-06, 1 file, -1/+1)
* Added job_timestamp to Processed (Milly, 2022-10-06, 2 files, -1/+3)
* Added styles to Processed (Milly, 2022-10-06, 2 files, -6/+3)
* Removed duplicate defined models_path (Milly, 2022-10-06, 1 file, -10/+9)
* add generation parameters to images shown in web ui (AUTOMATIC, 2022-10-06, 1 file, -2/+6)