path: root/modules
Commit message    Author    Date    Files    Lines
...
* | | | | | | source checkpoint hash from current checkpoint    DepFA    2022-10-09    1    -4/+2
* | | | | | | correct case on embeddingFromB64    DepFA    2022-10-09    1    -1/+1
* | | | | | | change json tensor key name    DepFA    2022-10-09    1    -3/+3
* | | | | | | add encoder and decoder classes    DepFA    2022-10-09    1    -0/+21
* | | | | | | add alternate checkpoint hash source    DepFA    2022-10-09    1    -2/+5
* | | | | | | add embedding load and save from b64 json (see the first sketch after the log)    DepFA    2022-10-09    1    -9/+21
* | | | | | | Add pretty image captioning functions    DepFA    2022-10-09    1    -0/+31
* | | | | | | add embed embedding to ui    DepFA    2022-10-09    1    -1/+3
* | | | | | | Update textual_inversion.py    DepFA    2022-10-09    1    -3/+22
|/ / / / / /
* | | | | | support loading .yaml config with same name as model    AUTOMATIC    2022-10-08    2    -8/+24
* | | | | | chore: Fix typos    Aidan Holland    2022-10-08    8    -15/+15
* | | | | | Break after finding the local directory of stable diffusion    Edouard Leurent    2022-10-08    1    -0/+1
* | | | | | add 'Ignore last layers of CLIP model' option as a parameter to the infotext    AUTOMATIC    2022-10-08    1    -1/+5
* | | | | | make --force-enable-xformers work without needing --xformers    AUTOMATIC    2022-10-08    1    -1/+1
* | | | | | Added ability to ignore last n layers in FrozenCLIPEmbedder (see the second sketch after the log)    Fampai    2022-10-08    2    -2/+10
* | | | | | Update ui.py    DepFA    2022-10-08    1    -1/+1
* | | | | | TI preprocess wording    DepFA    2022-10-08    1    -3/+3
| |_|_|_|/
|/| | | |
* | | | | add --force-enable-xformers option and also add messages to console regarding...    AUTOMATIC    2022-10-08    2    -1/+6
* | | | | add fallback for xformers_attnblock_forward    AUTOMATIC    2022-10-08    1    -1/+4
* | | | | alternate prompt    Artem Zagidulin    2022-10-08    1    -2/+7
* | | | | check for ampere without destroying the optimizations. again.    C43H66N12O12S2    2022-10-08    1    -4/+3
* | | | | check for ampere    C43H66N12O12S2    2022-10-08    1    -3/+4
| |_|_|/
|/| | |
* | | | why did you do this    AUTOMATIC    2022-10-08    1    -1/+1
| |_|/
|/| |
* | | Fixed typo    Milly    2022-10-08    1    -1/+1
* | | restore old opt_split_attention/disable_opt_split_attention logic    AUTOMATIC    2022-10-08    1    -1/+1
* | | simplify xformers options: --xformers to enable and that's it    AUTOMATIC    2022-10-08    3    -9/+15
* | | emergency fix for xformers (continue + shared)    AUTOMATIC    2022-10-08    1    -8/+8
* | | Merge pull request #1851 from C43H66N12O12S2/flash    AUTOMATIC1111    2022-10-08    3    -6/+45
|\ \ \
| * | | Update sd_hijack.py    C43H66N12O12S2    2022-10-08    1    -1/+1
| * | | update sd_hijack_opt to respect new env variables    C43H66N12O12S2    2022-10-08    1    -3/+8
| * | | add xformers_available shared variable    C43H66N12O12S2    2022-10-08    1    -1/+1
| * | | default to split attention if cuda is available and xformers is not    C43H66N12O12S2    2022-10-08    1    -2/+2
| * | | use new attnblock for xformers path    C43H66N12O12S2    2022-10-08    1    -1/+1
| * | | Update sd_hijack_optimizations.py    C43H66N12O12S2    2022-10-08    1    -1/+1
| * | | add xformers attnblock and hypernetwork support    C43H66N12O12S2    2022-10-08    1    -2/+18
| * | | delete broken and unnecessary aliases    C43H66N12O12S2    2022-10-08    1    -6/+4
| * | | switch to the proper way of calling xformers    C43H66N12O12S2    2022-10-08    1    -25/+3
| * | | Update sd_hijack.py    C43H66N12O12S2    2022-10-07    1    -1/+1
| * | | Update sd_hijack.py    C43H66N12O12S2    2022-10-07    1    -2/+2
| * | | Update sd_hijack.py    C43H66N12O12S2    2022-10-07    1    -2/+1
| * | | Update shared.py    C43H66N12O12S2    2022-10-07    1    -0/+1
| * | | Update sd_hijack.py    C43H66N12O12S2    2022-10-07    1    -4/+9
| * | | add xformers attention (see the third sketch after the log)    C43H66N12O12S2    2022-10-07    1    -1/+38
* | | | fix bug where when using prompt composition, hijack_comments generated before...    MrCheeze    2022-10-08    2    -1/+5
* | | | fix glob path in hypernetwork.py    ddPn08    2022-10-08    1    -1/+1
* | | | fix AND broken for long prompts    AUTOMATIC    2022-10-08    1    -0/+9
* | | | fix bugs related to variable prompt lengths    AUTOMATIC    2022-10-08    2    -12/+37
* | | | do not let user choose his own prompt token count limit    AUTOMATIC    2022-10-08    3    -21/+12
* | | | check specifically for skipped    Trung Ngo    2022-10-08    4    -7/+3
* | | | Add button to skip the current iteration    Trung Ngo    2022-10-08    4    -0/+21
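
The run of DepFA commits at the top of the log ("add encoder and decoder classes", "change json tensor key name", "add embedding load and save from b64 json", "correct case on embeddingFromB64") all serve one feature: round-tripping a textual-inversion embedding through base64-encoded JSON. A minimal sketch of that idea, assuming a "TORCHTENSOR" payload key and using numpy arrays in place of the repository's torch tensors; the helper names echo the commit subjects, but the bodies are illustrative, not the actual textual_inversion.py code:

```python
import base64
import json

import numpy as np


class EmbeddingEncoder(json.JSONEncoder):
    """Serialize numpy arrays as base64 payloads under a dedicated key (assumed name)."""

    def default(self, obj):
        if isinstance(obj, np.ndarray):
            return {
                "TORCHTENSOR": base64.b64encode(obj.tobytes()).decode("ascii"),
                "dtype": str(obj.dtype),
                "shape": list(obj.shape),
            }
        return super().default(obj)


def embedding_to_b64(data) -> str:
    """JSON-encode `data`, then base64 the whole document for safe embedding."""
    return base64.b64encode(json.dumps(data, cls=EmbeddingEncoder).encode()).decode("ascii")


def embedding_from_b64(blob: str):
    """Invert embedding_to_b64, rebuilding arrays from their payload dicts."""

    def hook(d):
        if "TORCHTENSOR" in d:
            raw = base64.b64decode(d["TORCHTENSOR"])
            return np.frombuffer(raw, dtype=d["dtype"]).reshape(d["shape"])
        return d

    return json.loads(base64.b64decode(blob), object_hook=hook)


# Round-trip check with an illustrative embedding-shaped array.
vec = np.random.rand(2, 768).astype(np.float32)
blob = embedding_to_b64({"string_to_param": {"*": vec}})
assert np.allclose(vec, embedding_from_b64(blob)["string_to_param"]["*"])
```

Storing dtype and shape beside the raw bytes is what keeps the decoder a pure inverse of the encoder.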
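
Fampai's "Added ability to ignore last n layers in FrozenCLIPEmbedder" (and the companion infotext option two rows above it) is the feature commonly called CLIP skip: take the prompt's hidden state from an earlier transformer layer instead of the last one. A sketch under stated assumptions: it uses the Hugging Face transformers CLIP text encoder rather than the ldm FrozenCLIPEmbedder class the commit actually patches, and `encode`/`stop_at_last_layers` are names chosen here for illustration:

```python
import torch
from transformers import CLIPTextModel, CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
model = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")


def encode(prompt: str, stop_at_last_layers: int = 1) -> torch.Tensor:
    """Encode `prompt`, optionally stopping short of the last CLIP layers."""
    tokens = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        out = model(**tokens, output_hidden_states=True)
    # hidden_states[-1] is the final layer; -2 skips one layer, and so on.
    h = out.hidden_states[-stop_at_last_layers]
    # Re-apply the final layer norm so earlier layers match the usual output scale.
    return model.text_model.final_layer_norm(h)
```

With stop_at_last_layers=1 this reproduces the model's normal output; larger values trade away the last layers' specialization, which some checkpoints were trained to expect.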
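
The PR #1851 chain ("add xformers attention", "switch to the proper way of calling xformers", "add xformers attnblock and hypernetwork support") swaps the hand-rolled attention in sd_hijack_optimizations.py for xformers' fused kernel. A minimal sketch of the shape plumbing involved, assuming xformers is installed with CUDA support; the function name and the (batch, tokens, channels) projection layout are assumptions for illustration, not the module's exact code:

```python
import torch
import xformers.ops


def xformers_attention_forward(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                               heads: int) -> torch.Tensor:
    """Multi-head attention routed through xformers' memory-efficient kernel."""
    b, n, _ = q.shape
    # xformers expects (batch, tokens, heads, head_dim), so unflatten the channels.
    q, k, v = (t.reshape(b, t.shape[1], heads, t.shape[2] // heads) for t in (q, k, v))
    out = xformers.ops.memory_efficient_attention(q, k, v, attn_bias=None)
    # Flatten the heads back into a single channel dimension.
    return out.reshape(b, n, -1)
```

The -25/+3 diffstat on "switch to the proper way of calling xformers" matches this shape of change: one library call, which computes softmax(QK^T / sqrt(d))V without materializing the full attention matrix, replaces the manual implementation.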