Age | Commit message | Author | Lines (-/+)
2023-06-27 | Merge pull request #11415 from netux/extensions-toggle-all | AUTOMATIC1111 | -2/+23
Add checkbox to check/uncheck all extensions in the Installed tab
2023-06-27 | Merge pull request #11146 from AUTOMATIC1111/api-quit-restart | AUTOMATIC1111 | -11/+21
api quit restart
2023-06-27 | Merge branch 'master' into dev | AUTOMATIC | -0/+0
2023-06-27 | Merge branch 'release_candidate' | AUTOMATIC | -995/+3379
2023-06-27 | Merge branch 'release_candidate' into dev | AUTOMATIC | -84/+201
2023-06-27 | Merge pull request #11189 from daswer123/dev | AUTOMATIC1111 | -65/+176
Zoom and pan: More options in the settings and improved error output
2023-06-27 | Merge pull request #11136 from arch-fan/typo | AUTOMATIC1111 | -1/+1
fixed typos
2023-06-27 | Merge pull request #11199 from akx/makedirs | AUTOMATIC1111 | -15/+6
Use os.makedirs(..., exist_ok=True)
2023-06-27 | Merge pull request #11294 from zhtttylz/Fix_Typo_of_hints.js | AUTOMATIC1111 | -2/+2
Fix Typo of hints.js
2023-06-27 | Merge pull request #11408 from wfjsw/patch-1 | AUTOMATIC1111 | -0/+5
Strip whitespaces from URL and dirname prior to extension installation
2023-06-27 | add missing infotext entry for the pad cond/uncond option | AUTOMATIC | -1/+11
2023-06-25 | feat(extensions): add toggle all checkbox to Installed tab | Martín (Netux) Rodríguez | -2/+23
Small QoL addition. While there is the option to disable all extensions with the radio buttons at the top, that only acts as an added flag and doesn't really change the state of the extensions in the UI. A use case for this checkbox is to disable all extensions except for a few, which is important when debugging extensions. You could do that before, but you'd have to uncheck and recheck every extension one by one.
2023-06-25 | Strip whitespaces from URL and dirname prior to extension installation | Jabasukuriputo Wang | -0/+5
This avoids some cryptic errors caused by accidental spaces around URLs.
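A minimal sketch of the idea, assuming a hypothetical install_extension_from_url(url, dirname) helper rather than the webui's actual function:

```python
def install_extension_from_url(url, dirname):
    """Hypothetical sketch: clean user-supplied values before installing."""
    # Accidental spaces around a pasted URL or directory name otherwise
    # surface later as cryptic git-clone / path errors.
    url = url.strip()
    dirname = dirname.strip()
    # ... proceed with the real installation logic using the cleaned values
    return url, dirname
```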
2023-06-18 | Fix Typo of hints.js | zhtttylz | -2/+2
2023-06-18 | update the description of --add-stop-route | w-e-w | -1/+1
2023-06-14 | terminate -> stop | w-e-w | -3/+3
2023-06-14 | response 501 if not able to restart | w-e-w | -0/+1
2023-06-14 | update workflow kill test server | w-e-w | -1/+1
2023-06-14 | rename routes | w-e-w | -5/+5
2023-06-14 | Formatting code with Prettier | Danil Boldyrev | -75/+95
2023-06-14 | Reworked the disabling of functions, refactored part of the code | Danil Boldyrev | -135/+121
2023-06-13 | Use os.makedirs(..., exist_ok=True) | Aarni Koskela | -15/+6
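Shown below is a generic before/after sketch of the pattern this commit replaces (the path is made up, not taken from the repo):

```python
import os

target_dir = "outputs/samples"  # illustrative path only

# Before: racy two-step check-then-create
if not os.path.exists(target_dir):
    os.makedirs(target_dir)

# After: a single call that is a no-op when the directory already exists
os.makedirs(target_dir, exist_ok=True)
```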
2023-06-12 | remove console.log | Danil Boldyrev | -2/+0
2023-06-12 | Improved error output, improved settings menu | Danil Boldyrev | -38/+145
2023-06-12 | remove fastapi.Response | w-e-w | -1/+1
2023-06-12 | move _stop route to api | w-e-w | -9/+9
2023-06-10 | quit restart | w-e-w | -1/+10
2023-06-09 | fixed typos | arch-fan | -2/+2
2023-06-09 | Merge branch 'dev' into release_candidate | AUTOMATIC | -977/+3244
2023-06-09 | add changelog for 1.4.0 | AUTOMATIC | -0/+57
2023-06-09 | linter | AUTOMATIC | -1/+1
2023-06-09 | Merge pull request #11092 from AUTOMATIC1111/Generate-Forever-during-generation | AUTOMATIC1111 | -4/+10
Allow activation of Generate Forever during generation
2023-06-09 | Merge pull request #11087 from AUTOMATIC1111/persistent_conds_cache | AUTOMATIC1111 | -10/+17
persistent conds cache
2023-06-09 | Merge pull request #11123 from akx/dont-die-on-bad-symlink-lora | AUTOMATIC1111 | -1/+5
Don't die when a LoRA is a broken symlink
2023-06-09 | Merge pull request #10295 from Splendide-Imaginarius/mk2-blur-mask | AUTOMATIC1111 | -13/+38
Split mask blur into X and Y components, patch Outpainting MK2 accordingly
2023-06-09 | Merge pull request #11048 from DGdev91/force_python1_navi_renoir | AUTOMATIC1111 | -1/+16
Forcing Torch Version to 1.13.1 for RX 5000 series GPUs
2023-06-09 | Don't die when a LoRA is a broken symlink | Aarni Koskela | -1/+5
Fixes #11098
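A hedged sketch of the general technique (names are illustrative, not the actual Lora module code): skip entries whose symlink target no longer exists instead of raising while listing files.

```python
import os

def list_available_lora_files(lora_dir):
    """Illustrative only: collect LoRA files, skipping broken symlinks."""
    found = []
    for name in sorted(os.listdir(lora_dir)):
        path = os.path.join(lora_dir, name)
        # os.path.exists() follows symlinks and returns False when the target
        # is gone, so a dangling link is skipped instead of crashing later.
        if not os.path.exists(path):
            continue
        if name.endswith((".safetensors", ".pt", ".ckpt")):
            found.append(path)
    return found
```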
2023-06-09 | Split Outpainting MK2 mask blur into X and Y components | Splendide Imaginarius | -9/+21
Fixes unexpected noise in non-outpainted borders when using the MK2 script.
2023-06-09 | Split mask blur into X and Y components | Splendide Imaginarius | -4/+17
Prerequisite to fixing the Outpainting MK2 mask blur bug.
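A rough sketch of the idea using scipy (not the repo's actual implementation): blur the mask with independent sigmas along the X and Y axes instead of a single radius.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_mask_xy(mask: np.ndarray, blur_x: float, blur_y: float) -> np.ndarray:
    """Illustrative only: mask is a 2D array (rows = Y, columns = X)."""
    # gaussian_filter accepts one sigma per axis, so vertical and horizontal
    # blur strengths can differ, e.g. blur only along Y when outpainting
    # downward while keeping the X borders sharp.
    return gaussian_filter(mask, sigma=(blur_y, blur_x))
```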
2023-06-08 | Generate Forever during generation | w-e-w | -4/+10
Generate Forever during generation
2023-06-08 | persistent conds cache | w-e-w | -10/+17
Update shared.py
2023-06-07 | Merge pull request #11058 from AUTOMATIC1111/api-wiki | AUTOMATIC1111 | -2/+2
link footer API to Wiki when API is not active
2023-06-07 | Merge pull request #11066 from aljungberg/patch-1 | AUTOMATIC1111 | -1/+1
Fix upcast attention dtype error.
2023-06-06 | Fix upcast attention dtype error. | Alexander Ljungberg | -1/+1
Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:

```
File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
```

The fix is to make sure to upcast the value tensor too.
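A hedged sketch of the fix's idea (not the exact patch): when upcasting cross attention to float32, cast the value tensor along with the query and key so all three inputs to scaled_dot_product_attention share one dtype.

```python
import torch

def sdp_attnblock_forward_upcast(q, k, v):
    """Illustrative only: upcast q, k AND v together before SDP attention."""
    # If only q and k are upcast to float32 while v stays in half precision,
    # scaled_dot_product_attention raises the dtype-mismatch RuntimeError above.
    q, k, v = (t.to(torch.float32) for t in (q, k, v))
    return torch.nn.functional.scaled_dot_product_attention(
        q, k, v, dropout_p=0.0, is_causal=False
    )
```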
2023-06-06 | Skip forcing python and pytorch versions if TORCH_COMMAND already set | DGdev91 | -9/+12
2023-06-06 | link footer API to Wiki when API is not active | w-e-w | -2/+2
2023-06-06 | Write "RX 5000 Series" instead of "Navi" in error message | DGdev91 | -1/+1
2023-06-06 | Check python version for Navi 1 only | DGdev91 | -1/+1
2023-06-06 | Force python1 for Navi1 only, use python_cmd for python | DGdev91 | -15/+8
2023-06-06 | Fix error in webui.sh | DGdev91 | -0/+1