2022-10-29  Merge pull request #3877 from Yaiol/master  (AUTOMATIC1111, -4/+5)
            Filename tags wrongly reference the process size instead of the image size
2022-10-29  Merge pull request #3717 from benkyoujouzu/master  (AUTOMATIC1111, -0/+1)
            Add missing support for linear activation in hypernetwork
2022-10-29  Merge pull request #3771 from aria1th/patch-12  (AUTOMATIC1111, -1/+2)
            Disable unavailable or duplicate options for activation functions
2022-10-29  Merge pull request #3511 from bamarillo/master  (AUTOMATIC1111, -56/+103)
            [API][Feature] Add extras endpoints
2022-10-28  Merge branch 'AUTOMATIC1111:master' into master  (Bruno Seoane, -1/+1)
2022-10-29  Re-enable linear  (AngelBottomless, -1/+1)
2022-10-29  Update images.py  (Yaiol, -4/+5)
            Filename tags [height] and [width] wrongly reference the process size instead of the
            resulting image size, so all upscaled files are named incorrectly.
2022-10-28  extras: upscaler blending should not be considered in cache key  (Chris OBryan, -1/+1)
2022-10-28  extras-tweaks: autoformat changed lines  (Chris OBryan, -15/+15)
2022-10-28  extras: Make image cache LRU  (Chris OBryan, -29/+43)
            This changes the extras image cache into a least-recently-used cache, which allows more
            experimentation with different upscalers without missing the cache. The maximum cache
            size is increased to 5, and the cache is cleared when the source image changes.
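The LRU behaviour described in that commit can be sketched with Python's `collections.OrderedDict`. The class and method names below (`ExtrasImageCache`, `max_size`) are illustrative stand-ins, not the webui's actual identifiers:

```python
from collections import OrderedDict


class ExtrasImageCache:
    """Least-recently-used cache sketch: holds up to max_size results
    and evicts the entry that was used longest ago."""

    def __init__(self, max_size=5):
        self.max_size = max_size
        self._entries = OrderedDict()

    def get(self, key):
        if key not in self._entries:
            return None
        self._entries.move_to_end(key)  # mark as most recently used
        return self._entries[key]

    def put(self, key, value):
        self._entries[key] = value
        self._entries.move_to_end(key)
        if len(self._entries) > self.max_size:
            self._entries.popitem(last=False)  # evict least recently used

    def clear(self):
        # the commit clears the cache when the source image changes
        self._entries.clear()
```

With `max_size=5`, switching between a handful of upscalers re-uses cached results instead of recomputing them, which is the point of moving from a single-entry cache to an LRU one.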
2022-10-28  extras: Rework image cache  (Chris OBryan, -20/+32)
            A bit of a refactor of the image cache to make it easier to extend. The cache now also
            takes the entire image into account instead of just a cropped portion.
2022-10-28  extras: Add option to run upscaling before face fixing  (Chris OBryan, -50/+99)
            Face restoration can look much better if run after upscaling, since that lets the
            restoration fix upscaling artifacts. This patch adds an option to choose the order in
            which upscaling and face fixing run.
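The ordering option amounts to running the same two steps in a user-chosen sequence. A minimal sketch, in which `upscale`, `restore_faces`, and the `upscale_first` flag are hypothetical names rather than the webui's real ones:

```python
def process_extras(image, upscale_first, upscale, restore_faces):
    """Run upscaling and face restoration in the order chosen by the user.

    Running face restoration second lets it clean up artifacts the
    upscaler introduced, which is the motivation for the new option.
    """
    steps = [upscale, restore_faces] if upscale_first else [restore_faces, upscale]
    for step in steps:
        image = step(image)
    return image
```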
2022-10-28  Fix log off-by-1  (Muhammad Rizqi Nur, -18/+20)
2022-10-28  Add missing support for linear activation in hypernetwork  (benkyoujouzu, -0/+1)
2022-10-28  Natural sorting for dropdown checkpoint list  (Antonio, -2/+5)
            Example:

            Before                      After
            11.ckpt                     11.ckpt
            ab.ckpt                     ab.ckpt
            ade_pablo_step_1000.ckpt    ade_pablo_step_500.ckpt
            ade_pablo_step_500.ckpt     ade_pablo_step_1000.ckpt
            ade_step_1000.ckpt          ade_step_500.ckpt
            ade_step_1500.ckpt          ade_step_1000.ckpt
            ade_step_2000.ckpt          ade_step_1500.ckpt
            ade_step_2500.ckpt          ade_step_2000.ckpt
            ade_step_3000.ckpt          ade_step_2500.ckpt
            ade_step_500.ckpt           ade_step_3000.ckpt
            atp_step_5500.ckpt          atp_step_5500.ckpt
            model1.ckpt                 model1.ckpt
            model10.ckpt                model10.ckpt
            model1000.ckpt              model33.ckpt
            model33.ckpt                model50.ckpt
            model400.ckpt               model400.ckpt
            model50.ckpt                model1000.ckpt
            moo44.ckpt                  moo44.ckpt
            v1-4-pruned-emaonly.ckpt    v1-4-pruned-emaonly.ckpt
            v1-5-pruned-emaonly.ckpt    v1-5-pruned-emaonly.ckpt
            v1-5-pruned.ckpt            v1-5-pruned.ckpt
            v1-5-vae.ckpt               v1-5-vae.ckpt
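The "After" ordering above can be reproduced with a key function that splits filenames into text and number chunks so numeric runs compare as integers. This is the common natural-sort recipe; the webui's exact implementation may differ:

```python
import re


def natural_sort_key(name):
    """Split 'model1000.ckpt' into ['model', 1000, '.ckpt'] so numeric
    chunks compare as integers rather than character by character."""
    return [int(chunk) if chunk.isdigit() else chunk.lower()
            for chunk in re.split(r'(\d+)', name)]


checkpoints = ["model1000.ckpt", "model33.ckpt", "model1.ckpt", "model10.ckpt"]
checkpoints.sort(key=natural_sort_key)
# checkpoints is now ['model1.ckpt', 'model10.ckpt', 'model33.ckpt', 'model1000.ckpt']
```

Plain lexicographic sorting would put `model1000.ckpt` before `model33.ckpt`, which is exactly the "Before" column's problem.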
2022-10-28  Adjustments to zh_TW localisation per suggestions by snowmeow2  (benlisquare, -135/+135)
2022-10-28  Hide save button for tabs other than txt2img and img2img  (Florian Horn, -2/+12)
2022-10-27  Update pt_BR.json  (Martucci, -3/+3)
2022-10-27  Update pt_BR.json  (Martucci, -1/+1)
2022-10-27  Reduce peak memory usage when changing models  (Josh Watzman, -4/+7)
            A few tweaks to reduce peak memory usage, the biggest being that if we aren't using the
            checkpoint cache, we shouldn't duplicate the model state dict just to immediately throw
            it away. On my machine with 16 GB of RAM, this change means I can typically change
            models, whereas before it would typically OOM.
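The idea of skipping the throwaway copy can be sketched as follows. The function and parameter names (`load_state_dict`, `read_checkpoint`, `checkpoint_cache`) are illustrative, not the repository's actual code:

```python
import copy


def load_state_dict(read_checkpoint, checkpoint_cache, cache_enabled, key):
    """Return a state dict for `key`, copying it only when a cached
    master copy also has to survive.

    Peak-memory idea from the commit: with the cache disabled, hand the
    freshly read dict straight to the model instead of deep-copying it
    and immediately discarding the original, so only one full copy of
    the weights ever exists in RAM.
    """
    if cache_enabled:
        if key not in checkpoint_cache:
            checkpoint_cache[key] = read_checkpoint(key)
        # copy so in-place edits by the model don't corrupt the cache
        return copy.deepcopy(checkpoint_cache[key])
    # cache disabled: no second copy is ever created
    return read_checkpoint(key)
```

For multi-gigabyte checkpoints, that one avoided `deepcopy` is the difference between fitting in 16 GB of RAM and going OOM.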
2022-10-27  Compromise with other PR for this fork  (Martucci, -128/+125)
2022-10-27  Updated name and hover text  (random_thoughtss, -1/+2)
2022-10-27  Moved mask weight config to SD section  (random_thoughtss, -1/+1)
2022-10-27  Highres fix works with unmasked latent  (random_thoughtss, -58/+76)
            Also refactors the mask creation to make it more accessible.
2022-10-27  Merge branch 'AUTOMATIC1111:master' into master  (random-thoughtss, -97/+1944)
2022-10-28  Fix random dataset shuffle on TI  (FlameLaw, -2/+2)
2022-10-27  Add forced LTR for training progress  (xmodar, -1/+1)
2022-10-27  Fixed position to be in line with the other icons  (Florian Horn, -3/+3)
2022-10-27  Fixed indentation  (Florian Horn, -1/+1)
2022-10-27  Added save button and shortcut (s) to modal view  (Florian Horn, -5/+50)
2022-10-27  More translation adjustments  (Martucci, -97/+96)
2022-10-27  Attention editing hotkey fix, part 2  (Dynamic, -2/+2)
2022-10-27  Remove files that shouldn't be here  (Dynamic, -62/+0)
2022-10-27  Attention editing with hotkeys should work with KR now  (Dynamic, -0/+62)
            Added the word "Prompt" to the placeholders to pass the check in edit-attention.js
2022-10-27  Merge branch 'AUTOMATIC1111:master' into kr-localization  (Dynamic, -1/+1)
2022-10-27  Add minor edits to Arabic localization  (xmodar, -6/+6)
2022-10-27  Apparently brackets don't work; the GitLab docs fooled me  (Dynamic, -2/+1)
2022-10-27  Update 2, because I'm an idiot  (Dynamic, -1/+2)
2022-10-27  Update CODEOWNERS file  (Dynamic, -0/+1)
2022-10-27  Edit CODEOWNERS for ko_KR.json permissions  (Dynamic, -0/+1)
2022-10-27  Create send-to buttons by extensions  (yfszzx, -3/+5)
2022-10-27  Disable unavailable or duplicate options  (AngelBottomless, -1/+2)
2022-10-27  Create send-to buttons in one module  (yfszzx, -1/+1)
2022-10-27  Create send-to buttons in one module  (yfszzx, -0/+2)
2022-10-27  Create send-to buttons in one module  (yfszzx, -270/+187)
2022-10-27  Add German localization file  (LunixWasTaken, -0/+419)
2022-10-27  Update localizations/zh_TW.json per dtlnor's suggestion  (benlisquare, -1/+1)
            Co-authored-by: dtlnor <dtlnor@hotmail.com>
2022-10-27  Update localizations/zh_TW.json per dtlnor's suggestion  (benlisquare, -1/+1)
            Co-authored-by: dtlnor <dtlnor@hotmail.com>
2022-10-27  Adjustments to zh_TW localisation per suggestions by dtlnor  (benlisquare, -7/+7)
2022-10-26  Merge pull request #1 from M-art-ucci/pt_BR-localization  (Martucci, -0/+472)
            Localization file for Portuguese (Brazil)