path: root/modules/sd_hijack.py
Commit message  Author  Age  Files  Lines
* removed aesthetic gradients as built-in  AUTOMATIC  2022-10-22  1  -1/+0
* make aesthetic embedding compatible with prompts longer than 75 tokens  AUTOMATIC  2022-10-21  1  -1/+1
* Merge branch 'ae'  AUTOMATIC  2022-10-21  1  -15/+15
|\
| * ui fix, reorganization of the code  MalumaDev  2022-10-16  1  -97/+5
| * ui fix  MalumaDev  2022-10-16  1  -1/+1
| * ui fix  MalumaDev  2022-10-16  1  -2/+1
| * Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolv...  MalumaDev  2022-10-15  1  -2/+2
| |\
| | * Merge branch 'master' into test_resolve_conflicts  MalumaDev  2022-10-15  1  -2/+2
| | |\
| * | | fixed dropbox update  MalumaDev  2022-10-15  1  -2/+2
| |/ /
| * | fix to token length, added embs generator, add new features to edit the emb...  MalumaDev  2022-10-15  1  -38/+73
| * | init  MalumaDev  2022-10-14  1  -2/+78
* | | Update sd_hijack.py  C43H66N12O12S2  2022-10-18  1  -1/+1
* | | use legacy attnblock  C43H66N12O12S2  2022-10-18  1  -1/+1
| |/ |/|
* | Update sd_hijack.py  C43H66N12O12S2  2022-10-15  1  -1/+1
|/
* fix iterator bug for #2295  AUTOMATIC  2022-10-12  1  -4/+4
* Account when lines are mismatched  hentailord85ez  2022-10-12  1  -1/+11
* Add check for psutil  brkirch  2022-10-11  1  -2/+8
* Add cross-attention optimization from InvokeAI  brkirch  2022-10-11  1  -1/+4
* rename hypernetwork dir to hypernetworks to prevent clash with an old filenam...  AUTOMATIC  2022-10-11  1  -1/+1
* Merge branch 'master' into hypernetwork-training  AUTOMATIC  2022-10-11  1  -30/+93
|\
| * Comma backtrack padding (#2192)  hentailord85ez  2022-10-11  1  -1/+18
| * allow pascal onwards  C43H66N12O12S2  2022-10-10  1  -1/+1
| * Add back in output hidden states parameter  hentailord85ez  2022-10-10  1  -1/+1
| * Pad beginning of textual inversion embedding  hentailord85ez  2022-10-10  1  -0/+5
| * Unlimited Token Works  hentailord85ez  2022-10-10  1  -23/+46
| * Removed unnecessary tmp variable  Fampai  2022-10-09  1  -4/+3
| * Updated code for legibility  Fampai  2022-10-09  1  -2/+5
| * Optimized code for Ignoring last CLIP layers  Fampai  2022-10-09  1  -8/+4
| * Added ability to ignore last n layers in FrozenCLIPEmbedder  Fampai  2022-10-08  1  -2/+9
| * add --force-enable-xformers option and also add messages to console regarding...  AUTOMATIC  2022-10-08  1  -1/+5
| * check for ampere without destroying the optimizations. again.  C43H66N12O12S2  2022-10-08  1  -4/+3
| * check for ampere  C43H66N12O12S2  2022-10-08  1  -3/+4
| * why did you do this  AUTOMATIC  2022-10-08  1  -1/+1
| * restore old opt_split_attention/disable_opt_split_attention logic  AUTOMATIC  2022-10-08  1  -1/+1
| * simplify xformers options: --xformers to enable and that's it  AUTOMATIC  2022-10-08  1  -1/+1
| * Merge pull request #1851 from C43H66N12O12S2/flash  AUTOMATIC1111  2022-10-08  1  -4/+6
| |\
| | * Update sd_hijack.py  C43H66N12O12S2  2022-10-08  1  -1/+1
| | * default to split attention if cuda is available and xformers is not  C43H66N12O12S2  2022-10-08  1  -2/+2
| | * use new attnblock for xformers path  C43H66N12O12S2  2022-10-08  1  -1/+1
| | * delete broken and unnecessary aliases  C43H66N12O12S2  2022-10-08  1  -6/+4
| | * Update sd_hijack.py  C43H66N12O12S2  2022-10-07  1  -1/+1
| | * Update sd_hijack.py  C43H66N12O12S2  2022-10-07  1  -2/+2
| | * Update sd_hijack.py  C43H66N12O12S2  2022-10-07  1  -2/+1
| | * Update sd_hijack.py  C43H66N12O12S2  2022-10-07  1  -4/+9
| * | fix bug where when using prompt composition, hijack_comments generated before...  MrCheeze  2022-10-08  1  -1/+4
| * | fix bugs related to variable prompt lengths  AUTOMATIC  2022-10-08  1  -5/+9
| * | do not let user choose his own prompt token count limit  AUTOMATIC  2022-10-08  1  -13/+12
| * | let user choose his own prompt token count limit  AUTOMATIC  2022-10-08  1  -6/+7
* | | hypernetwork training mk1  AUTOMATIC  2022-10-07  1  -1/+3
|/ /
* / make it possible to use hypernetworks without opt split attention  AUTOMATIC  2022-10-07  1  -2/+4
|/