Commit message | Author | Date | Files | Lines (-/+)
---|---|---|---|---
add comment for #4407 and remove seemingly unnecessary cudnn.enabled | AUTOMATIC | 2022-12-03 | 1 | -1/+3
fix #4407 breaking UI entirely for card other than ones related to the PR | AUTOMATIC | 2022-12-03 | 1 | -4/+2
Merge pull request #4407 from yoinked-h/patch-1: Fix issue with 16xx cards | AUTOMATIC1111 | 2022-12-03 | 1 | -0/+7
actual better fix (thanks C43H66N12O12S2) | pepe10-gpu | 2022-11-08 | 1 | -5/+2
terrible hack | pepe10-gpu | 2022-11-08 | 1 | -2/+9
16xx card fix (cudnn) | pepe10-gpu | 2022-11-07 | 1 | -0/+3
Merge pull request #5251 from adieyal/bug/negative-prompt-infotext: Fixed incorrect negative prompt text in infotext | AUTOMATIC1111 | 2022-12-03 | 1 | -1/+1
Fixed incorrect negative prompt text in infotext. Previously only the first negative prompt in all_negative_prompts was being used for infotext; this fixes that by selecting the index-th negative prompt (see sketch below the table) | Adi Eyal | 2022-11-30 | 1 | -1/+1
Merge branch 'master' into racecond_fix | AUTOMATIC1111 | 2022-12-03 | 50 | -1375/+3161
Merge pull request #4459 from kavorite/color-sketch-inpainting: add `--gradio-inpaint-tool` and option to specify `color-sketch` | AUTOMATIC1111 | 2022-12-03 | 3 | -10/+36
blur mask with color-sketch + add paint transparency slider | kavorite | 2022-11-09 | 2 | -8/+16
add new color-sketch state to img2img invocation | kavorite | 2022-11-08 | 1 | -0/+1
add gradio-inpaint-tool; color-sketch | kavorite | 2022-11-08 | 3 | -7/+24
Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes: Use devices.autocast() and fix MPS randn issues | AUTOMATIC1111 | 2022-12-03 | 8 | -31/+29
Rework MPS randn fix, add randn_like fix. torch.manual_seed() already sets a CPU generator, so there is no reason to create a CPU generator manually; torch.randn_like also needs a MPS fix for k-diffusion, but a torch hijack with randn_like already exists so it can also be used for that (see sketch below the table) | brkirch | 2022-11-30 | 2 | -15/+8
Use devices.autocast instead of torch.autocast | brkirch | 2022-11-30 | 5 | -11/+6
Add workaround for using MPS with torchsde | brkirch | 2022-11-30 | 1 | -0/+14
Refactor and instead check if mps is being used, not availability | brkirch | 2022-11-29 | 1 | -5/+1
more simple config option name plus mouseover hint for clip skip | AUTOMATIC | 2022-12-03 | 1 | -1/+1
Merge pull request #5304 from space-nuko/fix/clip-skip-application: Fix clip skip of 1 not being restored from prompts | AUTOMATIC1111 | 2022-12-03 | 2 | -1/+5
Fix clip skip of 1 not being restored from prompts | space-nuko | 2022-12-01 | 2 | -1/+5
Merge pull request #5328 from jcowens/fix-typo: fix typo | AUTOMATIC1111 | 2022-12-03 | 1 | -1/+1
fix typo | jcowens | 2022-12-02 | 1 | -1/+1
Merge pull request #5331 from smirkingface/openaimodel_fix: Fixed AttributeError where openaimodel is not found | AUTOMATIC1111 | 2022-12-03 | 1 | -0/+1
Fixed AttributeError where openaimodel is not found | SmirkingFace | 2022-12-02 | 1 | -0/+1
Merge pull request #5340 from PhytoEpidemic/master: Fix divide by 0 error | AUTOMATIC1111 | 2022-12-03 | 1 | -3/+3
Fix divide by 0 error. Fix of the edge case 0 weight that occasionally will pop up in some specific situations; this was crashing the script (see sketch below the table) | PhytoEpidemic | 2022-12-02 | 1 | -3/+3
prevent include_init_images from being passed to StableDiffusionProcessingImg2Img in API #4989 | AUTOMATIC | 2022-12-03 | 1 | -2/+5
Merge pull request #5165 from klimaleksus/fix-sequential-vae: Make VAE step sequential to prevent VRAM spikes, will fix #3059, #2082, #2561, #3462 | AUTOMATIC1111 | 2022-12-03 | 1 | -2/+2
Make VAE step sequential to prevent VRAM spikes | klimaleksus | 2022-11-28 | 1 | -2/+2
Fixed safe.py for pytorch 1.13 ckpt files | SmirkingFace | 2022-12-02 | 1 | -7/+11
Merge remote-tracking branch 'pattontim/safetensors' | AUTOMATIC | 2022-11-29 | 1 | -0/+1
safetensors optional for now | Tim Patton | 2022-11-22 | 1 | -1/+8
Use GPU for loading safetensors, disable export | Tim Patton | 2022-11-21 | 2 | -3/+5
Patch line ui endings | Tim Patton | 2022-11-21 | 1 | -1814/+1814
Generalize SD torch load/save to implement safetensor merging compat | Tim Patton | 2022-11-20 | 3 | -1826/+1840
Label and load SD .safetensors model files | Tim Patton | 2022-11-19 | 2 | -8/+17
fix an error that happens when you type into prompt while switching model, put queue stuff into separate file | AUTOMATIC | 2022-11-28 | 2 | -64/+101
make it possible to save nai model using safetensors | AUTOMATIC | 2022-11-28 | 1 | -2/+2
if image on disk was deleted between being generated and request being completed, do use temporary dir to store it for the browser | AUTOMATIC | 2022-11-27 | 1 | -1/+1
fix the bug that makes it impossible to send images to other tabs | AUTOMATIC | 2022-11-27 | 1 | -3/+4
Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords: resolve [name] after resolving [filewords] in training | AUTOMATIC1111 | 2022-11-27 | 1 | -1/+1
resolve [name] after resolving [filewords] in training | parasi | 2022-11-13 | 1 | -1/+1
Merge pull request #4416 from Keavon/cors-regex: Add CORS-allow policy launch argument using regex | AUTOMATIC1111 | 2022-11-27 | 1 | -4/+5
Merge branch 'master' into cors-regex | Keavon Chambers | 2022-11-19 | 27 | -300/+624
Add CORS-allow policy launch argument using regex (see sketch below the table) | Keavon Chambers | 2022-11-07 | 1 | -3/+4
Merge pull request #4919 from brkirch/deepbooru-fix: Fix support for devices other than CUDA in DeepBooru | AUTOMATIC1111 | 2022-11-27 | 1 | -1/+1
Change .cuda() to .to(devices.device) | brkirch | 2022-11-21 | 1 | -1/+1
Merge pull request #5117 from aliencaocao/fix_api_sampler_name: Fix api ignoring sampler_name settings | AUTOMATIC1111 | 2022-11-27 | 1 | -2/+6
Prevent warning on sampler_index if sampler_name is being used | Billy Cao | 2022-11-27 | 1 | -0/+4
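
For #5251, a minimal sketch of the one-line pattern the commit body describes: read the index-th entry of all_negative_prompts rather than always the first. The surrounding function and its signature are illustrative assumptions, not the actual webui infotext code.

```python
def build_infotext(all_prompts, all_negative_prompts, index):
    """Illustrative only: pick the prompt pair for the image at `index`.

    The bug in #5251 was equivalent to always reading all_negative_prompts[0];
    the fix reads the index-th entry instead.
    """
    prompt = all_prompts[index]
    negative_prompt = all_negative_prompts[index]  # was effectively: all_negative_prompts[0]
    return f"{prompt}\nNegative prompt: {negative_prompt}"
```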
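For #5194, a sketch of the general technique the commit body points at: torch.manual_seed() seeds the default CPU generator, so reproducible noise for MPS can be drawn on the CPU and moved to the device. This is an assumption-level illustration, not the webui's actual devices/randn-hijack code.

```python
import torch

def randn_on_device(seed, shape, device):
    """Sketch: draw seeded noise for MPS by sampling on the CPU first.

    torch.manual_seed() already seeds the CPU generator, so sampling on the
    CPU and moving the tensor avoids MPS-specific RNG behaviour.
    """
    torch.manual_seed(seed)
    if device.type == "mps":
        return torch.randn(shape, device="cpu").to(device)
    return torch.randn(shape, device=device)
```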
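For #5340, the commit only says that an occasional zero weight reached a division and crashed the script; the guard pattern below is a generic, hypothetical example of handling that edge case, with made-up function and variable names.

```python
def weighted_average(values, weights):
    """Hypothetical guard for the 0-weight edge case described in #5340."""
    total = sum(weights)
    if total == 0:
        # Fall back to an unweighted average instead of crashing on division by zero.
        return sum(values) / len(values) if values else 0.0
    return sum(v * w for v, w in zip(values, weights)) / total
```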
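For #4416, a sketch of how a regex-based CORS allow policy is typically wired up with FastAPI/Starlette's CORSMiddleware. `allow_origin_regex` is a real Starlette parameter, but the `--cors-allow-origins-regex` flag name here is an assumption for illustration, not necessarily the exact argument the PR added.

```python
import argparse
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

parser = argparse.ArgumentParser()
# Flag name is assumed for this sketch; see PR #4416 for the real launch argument.
parser.add_argument("--cors-allow-origins-regex", type=str, default=None)
args = parser.parse_args()

app = FastAPI()
if args.cors_allow_origins_regex:
    # Starlette matches this regex against the request's Origin header.
    app.add_middleware(
        CORSMiddleware,
        allow_origin_regex=args.cors_allow_origins_regex,
        allow_methods=["*"],
        allow_headers=["*"],
    )
```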