Commit message | Author | Age | Files | Lines
---|---|---|---|---
cleanup some unneeded imports for hijack files | AUTOMATIC | 2022-12-10 | 1 | -8/+2
do not replace entire unet for the resolution hack | AUTOMATIC | 2022-12-10 | 1 | -2/+3
Merge pull request #4978 from aliencaocao/support_any_resolution (Patch UNet Forward to support resolutions that are not multiples of 64) | AUTOMATIC1111 | 2022-12-10 | 1 | -0/+1
Merge branch 'master' into support_any_resolution | Billy Cao | 2022-11-27 | 1 | -269/+29
Patch UNet Forward to support resolutions that are not multiples of 64; also modified the UI to no longer step in 64 | Billy Cao | 2022-11-23 | 1 | -0/+2
move #5216 to the extension | AUTOMATIC | 2022-12-03 | 1 | -1/+1
Merge remote-tracking branch 'wywywywy/autoencoder-hijack' | AUTOMATIC | 2022-12-03 | 1 | -1/+1
Add autoencoder to sd_hijack | wywywywy | 2022-11-29 | 1 | -1/+1
Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes (Use devices.autocast() and fix MPS randn issues) | AUTOMATIC1111 | 2022-12-03 | 1 | -5/+1
Refactor and instead check if mps is being used, not availability | brkirch | 2022-11-29 | 1 | -5/+1
Fixed AttributeError where openaimodel is not found | SmirkingFace | 2022-12-02 | 1 | -0/+1
Merge remote-tracking branch 'flamelaw/master' | AUTOMATIC | 2022-11-27 | 1 | -2/+7
Gradient accumulation, autocast fix, new latent sampling method, etc | flamelaw | 2022-11-20 | 1 | -2/+7
restore hypernetworks to seemingly working state | AUTOMATIC | 2022-11-26 | 1 | -1/+2
Add support for Stable Diffusion 2.0 | AUTOMATIC | 2022-11-26 | 1 | -269/+28
cleanly undo circular hijack #4818 | killfrenzy96 | 2022-11-18 | 1 | -1/+1
use the new devices.has_mps() function in register_buffer for DDIM/PLMS fix for OSX | AUTOMATIC | 2022-11-12 | 1 | -2/+1
move DDIM/PLMS fix for OSX out of the file with inpainting code. | AUTOMATIC | 2022-11-11 | 1 | -0/+23
Unload sd_model before loading the other | Jairo Correa | 2022-11-01 | 1 | -0/+4
removed aesthetic gradients as built-in; added support for extensions | AUTOMATIC | 2022-10-22 | 1 | -1/+0
make aesthetic embedding compatible with prompts longer than 75 tokens | AUTOMATIC | 2022-10-21 | 1 | -1/+1
Merge branch 'ae' | AUTOMATIC | 2022-10-21 | 1 | -15/+15
ui fix, reorganization of the code | MalumaDev | 2022-10-16 | 1 | -97/+5
ui fix | MalumaDev | 2022-10-16 | 1 | -1/+1
ui fix | MalumaDev | 2022-10-16 | 1 | -2/+1
Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolve_conflicts | MalumaDev | 2022-10-15 | 1 | -2/+2
Merge branch 'master' into test_resolve_conflicts | MalumaDev | 2022-10-15 | 1 | -2/+2
fixed dropbox update | MalumaDev | 2022-10-15 | 1 | -2/+2
fix to tokens length, added embs generator, add new features to edit the embedding before the generation using text | MalumaDev | 2022-10-15 | 1 | -38/+73
init | MalumaDev | 2022-10-14 | 1 | -2/+78
Update sd_hijack.py | C43H66N12O12S2 | 2022-10-18 | 1 | -1/+1
use legacy attnblock | C43H66N12O12S2 | 2022-10-18 | 1 | -1/+1
Update sd_hijack.py | C43H66N12O12S2 | 2022-10-15 | 1 | -1/+1
fix iterator bug for #2295 | AUTOMATIC | 2022-10-12 | 1 | -4/+4
Account when lines are mismatched | hentailord85ez | 2022-10-12 | 1 | -1/+11
Add check for psutil | brkirch | 2022-10-11 | 1 | -2/+8
Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS); add command line option for it; make it default when CUDA is unavailable | brkirch | 2022-10-11 | 1 | -1/+4
rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have | AUTOMATIC | 2022-10-11 | 1 | -1/+1
Merge branch 'master' into hypernetwork-training | AUTOMATIC | 2022-10-11 | 1 | -30/+93
Comma backtrack padding (#2192) | hentailord85ez | 2022-10-11 | 1 | -1/+18
allow pascal onwards | C43H66N12O12S2 | 2022-10-10 | 1 | -1/+1
Add back in output hidden states parameter | hentailord85ez | 2022-10-10 | 1 | -1/+1
Pad beginning of textual inversion embedding | hentailord85ez | 2022-10-10 | 1 | -0/+5
Unlimited Token Works: unlimited tokens actually work now, works with textual inversion too; replaces the previous not-so-much-working implementation | hentailord85ez | 2022-10-10 | 1 | -23/+46
Removed unnecessary tmp variable | Fampai | 2022-10-09 | 1 | -4/+3
Updated code for legibility | Fampai | 2022-10-09 | 1 | -2/+5
Optimized code for Ignoring last CLIP layers | Fampai | 2022-10-09 | 1 | -8/+4
Added ability to ignore last n layers in FrozenCLIPEmbedder | Fampai | 2022-10-08 | 1 | -2/+9
add --force-enable-xformers option and also add messages to console regarding cross attention optimizations | AUTOMATIC | 2022-10-08 | 1 | -1/+5
check for ampere without destroying the optimizations. again. | C43H66N12O12S2 | 2022-10-08 | 1 | -4/+3