path: root/modules/sd_hijack.py
Commit log (most recent first); each entry lists the commit message, author, date, files changed, and lines -removed/+added.
* cleanup some unneeded imports for hijack files (AUTOMATIC, 2022-12-10, 1 file, -8/+2)
* do not replace entire unet for the resolution hack (AUTOMATIC, 2022-12-10, 1 file, -2/+3)
* Merge pull request #4978 from aliencaocao/support_any_resolution (AUTOMATIC1111, 2022-12-10, 1 file, -0/+1)
    Patch UNet Forward to support resolutions that are not multiples of 64
  * Merge branch 'master' into support_any_resolution (Billy Cao, 2022-11-27, 1 file, -269/+29)
  * Patch UNet Forward to support resolutions that are not multiples of 64 (Billy Cao, 2022-11-23, 1 file, -0/+2)
      Also modified the UI to no longer step in 64
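The resolution patch above hinges on the UNet's downsampling stack, which needs pixel dimensions divisible by 64. A minimal sketch of the rounding idea (hypothetical helper name; the actual PR pads the latent inside the UNet forward pass and crops the result back):

```python
import math

def round_up_to_multiple(height, width, base=64):
    """Round image dimensions up to the nearest multiple of `base`.

    Stable Diffusion's UNet halves the spatial resolution several times,
    so arbitrary sizes must be padded up to a multiple of 64 pixels
    (8 latent units) before the forward pass.
    """
    return (math.ceil(height / base) * base,
            math.ceil(width / base) * base)
```

For example, a requested 500x700 image would be processed at 512x704 and cropped afterward, which is why the UI no longer has to step sizes in increments of 64.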
* move #5216 to the extension (AUTOMATIC, 2022-12-03, 1 file, -1/+1)
* Merge remote-tracking branch 'wywywywy/autoencoder-hijack' (AUTOMATIC, 2022-12-03, 1 file, -1/+1)
  * Add autoencoder to sd_hijack (wywywywy, 2022-11-29, 1 file, -1/+1)
* Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes (AUTOMATIC1111, 2022-12-03, 1 file, -5/+1)
    Use devices.autocast() and fix MPS randn issues
  * Refactor and instead check if mps is being used, not availability (brkirch, 2022-11-29, 1 file, -5/+1)
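The MPS randn fix referenced above follows a common workaround: sample noise on the CPU and move it to the device, which sidesteps MPS random-number issues and keeps seeded generation reproducible across devices. A sketch under assumed names (the repo's actual helpers live in `devices.py`, not here):

```python
import torch

def randn_mps_safe(shape, device, dtype=torch.float32, generator=None):
    """Sample Gaussian noise on the CPU, then move it to the target device.

    On the MPS backend this avoids torch.randn problems and makes a given
    seed produce the same noise as on CPU/CUDA. Hypothetical helper name.
    """
    device_type = getattr(device, "type", str(device))
    if device_type == "mps":
        noise = torch.randn(shape, generator=generator, dtype=dtype, device="cpu")
        return noise.to(device)
    return torch.randn(shape, generator=generator, dtype=dtype, device=device)
```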
* Fixed AttributeError where openaimodel is not found (SmirkingFace, 2022-12-02, 1 file, -0/+1)
* Merge remote-tracking branch 'flamelaw/master' (AUTOMATIC, 2022-11-27, 1 file, -2/+7)
  * Gradient accumulation, autocast fix, new latent sampling method, etc (flamelaw, 2022-11-20, 1 file, -2/+7)
* restore hypernetworks to seemingly working state (AUTOMATIC, 2022-11-26, 1 file, -1/+2)
* Add support for Stable Diffusion 2.0 (AUTOMATIC, 2022-11-26, 1 file, -269/+28)
* cleanly undo circular hijack #4818 (killfrenzy96, 2022-11-18, 1 file, -1/+1)
* use the new devices.has_mps() function in register_buffer for DDIM/PLMS fix for OSX (AUTOMATIC, 2022-11-12, 1 file, -2/+1)
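The DDIM/PLMS fix for macOS mentioned above addresses the fact that the MPS backend does not support float64 tensors, which the samplers' buffers are created as. A sketch of the down-casting idea (hypothetical function name; the actual hijack replaces the samplers' `register_buffer` method):

```python
import torch

def register_buffer_fix(sampler, name, attr):
    """Store a sampler buffer, down-casting float64 tensors to float32.

    MPS on macOS cannot hold double-precision tensors, so buffers like the
    DDIM/PLMS beta schedules must be cast before being moved to the device.
    Sketch of the idea only; names differ from the repo's actual code.
    """
    if isinstance(attr, torch.Tensor) and attr.dtype == torch.float64:
        attr = attr.to(torch.float32)
    setattr(sampler, name, attr)
```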
* move DDIM/PLMS fix for OSX out of the file with inpainting code. (AUTOMATIC, 2022-11-11, 1 file, -0/+23)
* Unload sd_model before loading the other (Jairo Correa, 2022-11-01, 1 file, -0/+4)
* removed aesthetic gradients as built-in (AUTOMATIC, 2022-10-22, 1 file, -1/+0)
    added support for extensions
* make aesthetic embedding compatible with prompts longer than 75 tokens (AUTOMATIC, 2022-10-21, 1 file, -1/+1)
* Merge branch 'ae' (AUTOMATIC, 2022-10-21, 1 file, -15/+15)
  * ui fix, reorganization of the code (MalumaDev, 2022-10-16, 1 file, -97/+5)
  * ui fix (MalumaDev, 2022-10-16, 1 file, -1/+1)
  * ui fix (MalumaDev, 2022-10-16, 1 file, -2/+1)
  * Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolve_conflicts (MalumaDev, 2022-10-15, 1 file, -2/+2)
    * Merge branch 'master' into test_resolve_conflicts (MalumaDev, 2022-10-15, 1 file, -2/+2)
  * fixed dropbox update (MalumaDev, 2022-10-15, 1 file, -2/+2)
  * fix to token length, added embedding generator, add new features to edit the embedding before generation using text (MalumaDev, 2022-10-15, 1 file, -38/+73)
  * init (MalumaDev, 2022-10-14, 1 file, -2/+78)
* Update sd_hijack.py (C43H66N12O12S2, 2022-10-18, 1 file, -1/+1)
* use legacy attnblock (C43H66N12O12S2, 2022-10-18, 1 file, -1/+1)
* Update sd_hijack.py (C43H66N12O12S2, 2022-10-15, 1 file, -1/+1)
* fix iterator bug for #2295 (AUTOMATIC, 2022-10-12, 1 file, -4/+4)
* Account when lines are mismatched (hentailord85ez, 2022-10-12, 1 file, -1/+11)
* Add check for psutil (brkirch, 2022-10-11, 1 file, -2/+8)
* Add cross-attention optimization from InvokeAI (brkirch, 2022-10-11, 1 file, -1/+4)
    * Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS)
    * Add command line option for it
    * Make it default when CUDA is unavailable
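The cross-attention optimization above is a memory-aware variant of sliced attention: instead of materializing the full attention matrix at once, it processes query rows in slices and writes each partial result into the output. A toy sketch of the slicing principle (illustrative only; the real InvokeAI-derived code also picks the slice size from available memory):

```python
import torch

def sliced_attention(q, k, v, slice_size=2):
    """Scaled dot-product attention computed in query-row slices.

    Equivalent to softmax(q @ k^T * scale) @ v, but the (seq x seq)
    attention matrix is only ever materialized `slice_size` rows at a
    time, cutting peak memory use. Sketch of the idea, not the repo's code.
    """
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for i in range(0, q.shape[1], slice_size):
        attn = torch.softmax(
            q[:, i:i + slice_size] @ k.transpose(-1, -2) * scale, dim=-1)
        out[:, i:i + slice_size] = attn @ v
    return out
```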
* rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have (AUTOMATIC, 2022-10-11, 1 file, -1/+1)
* Merge branch 'master' into hypernetwork-training (AUTOMATIC, 2022-10-11, 1 file, -30/+93)
  * Comma backtrack padding (#2192) (hentailord85ez, 2022-10-11, 1 file, -1/+18)
  * allow pascal onwards (C43H66N12O12S2, 2022-10-10, 1 file, -1/+1)
  * Add back in output hidden states parameter (hentailord85ez, 2022-10-10, 1 file, -1/+1)
  * Pad beginning of textual inversion embedding (hentailord85ez, 2022-10-10, 1 file, -0/+5)
  * Unlimited Token Works (hentailord85ez, 2022-10-10, 1 file, -23/+46)
      Unlimited tokens actually work now. Works with textual inversion too. Replaces the previous not-so-much-working implementation.
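The "Unlimited Token Works" change lets prompts exceed CLIP's 77-token context window by splitting the token sequence into 75-token chunks (leaving room for the start and end tokens), embedding each chunk separately, and concatenating the results. A toy sketch of the chunking step (hypothetical function name, not the repo's actual code):

```python
def chunk_tokens(tokens, chunk_size=75):
    """Split a token-id list into chunks of at most `chunk_size` entries.

    CLIP's text encoder accepts 77 positions; 75 remain after the BOS and
    EOS tokens are added per chunk, so a prompt of any length can be
    embedded chunk by chunk and the embeddings concatenated.
    """
    return [tokens[i:i + chunk_size]
            for i in range(0, len(tokens), chunk_size)]
```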
  * Removed unnecessary tmp variable (Fampai, 2022-10-09, 1 file, -4/+3)
  * Updated code for legibility (Fampai, 2022-10-09, 1 file, -2/+5)
  * Optimized code for Ignoring last CLIP layers (Fampai, 2022-10-09, 1 file, -8/+4)
  * Added ability to ignore last n layers in FrozenCLIPEmbedder (Fampai, 2022-10-08, 1 file, -2/+9)
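Ignoring the last n CLIP layers (commonly called "CLIP skip") means conditioning the diffusion model on an earlier hidden state of the text encoder instead of its final output. Assuming the encoder is run with `output_hidden_states=True`, the selection reduces to indexing from the end of the hidden-state list (a sketch; the repo's actual code also re-applies the final layer norm):

```python
def clip_skip_hidden_state(hidden_states, stop_at_layers=1):
    """Select an earlier CLIP text-encoder hidden state.

    stop_at_layers=1 returns the normal final layer's output,
    stop_at_layers=2 ignores the last layer, and so on.
    Hypothetical helper name illustrating the indexing only.
    """
    if stop_at_layers < 1:
        raise ValueError("stop_at_layers must be >= 1")
    return hidden_states[-stop_at_layers]
```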
  * add --force-enable-xformers option and also add messages to console regarding cross attention optimizations (AUTOMATIC, 2022-10-08, 1 file, -1/+5)
  * check for ampere without destroying the optimizations. again. (C43H66N12O12S2, 2022-10-08, 1 file, -4/+3)