path: root/modules
Commit message    Author    Date    Files    Lines
* Merge pull request #14131 from read-0nly/patch-1    AUTOMATIC1111    2023-12-02    1    -1/+1
|\
| |     Update devices.py - Make 'use-cpu all' actually apply to 'all'
| * Update devices.py    obsol    2023-11-28    1    -1/+1
| |
| |     fixes issue where "--use-cpu all" properly makes SD run on the CPU but leaves ControlNet
| |     (and other extensions, I presume) pointed at the GPU, causing a crash in ControlNet due to
| |     the device mismatch between SD and CN
| |     https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/14097
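For context, a minimal sketch of what "apply to 'all'" means here: when "all" appears in the --use-cpu list, every task, including ones requested by extensions such as ControlNet, should resolve to the CPU device. The helper below is modelled loosely on the devices module but is illustrative, not the actual one-line patch.

```python
# Sketch only: a simplified task-to-device lookup in the spirit of modules/devices.py.
import torch

cpu = torch.device("cpu")
use_cpu = ["all"]  # parsed from the --use-cpu command-line option


def get_device_for(task: str) -> torch.device:
    # Extensions ask for a device by their own task name; without the "all" check
    # they fall through to the GPU and mismatch SD's CPU tensors, causing a crash.
    if task in use_cpu or "all" in use_cpu:
        return cpu
    return torch.device("cuda") if torch.cuda.is_available() else cpu
```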
* | Merge pull request #14121 from AUTOMATIC1111/fix-Auto-focal-point-crop-for-opencv-4.8.x    AUTOMATIC1111    2023-12-02    2    -119/+124
|\ \
| | |   Fix auto focal point crop for opencv >= 4.8
| * | reformat file with uniform indentation    w-e-w    2023-11-28    1    -104/+106
| | |
| * | fix Auto focal point crop for opencv >= 4.8.x    w-e-w    2023-11-28    2    -15/+18
| | |
| | |   autocrop.download_and_cache_models: in opencv >= 4.8 the face detection model was updated;
| | |   download the model based on the opencv version and return the model path, or raise an exception
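A rough sketch of that behaviour follows. The filenames and URL are illustrative placeholders, not the actual values used by autocrop.py: pick the face detection model matching the installed opencv version, cache it locally, and return its path, raising if it cannot be obtained.

```python
# Sketch only: version-dependent model download with local caching.
import os
import urllib.request

import cv2


def download_and_cache_models(dirname: str) -> str:
    # opencv >= 4.8 expects newer face detection weights (filenames are placeholders)
    if tuple(int(x) for x in cv2.__version__.split(".")[:2]) >= (4, 8):
        model_file_name = "face_detection_model_2023.onnx"
    else:
        model_file_name = "face_detection_model_2022.onnx"

    url = f"https://example.com/models/{model_file_name}"  # placeholder URL
    model_path = os.path.join(dirname, model_file_name)

    if not os.path.exists(model_path):
        os.makedirs(dirname, exist_ok=True)
        try:
            urllib.request.urlretrieve(url, model_path)
        except Exception as e:
            raise RuntimeError(f"unable to download the face detection model: {e}")

    return model_path
```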
* | | Merge pull request #14119 from AUTOMATIC1111/add-Block-component-creation-callback    AUTOMATIC1111    2023-12-02    1    -0/+10
|\ \ \
| | | |   add Block component creation callback
| * | | add Block component creation callback    w-e-w    2023-11-27    1    -0/+10
| |/ /
* | | Merge pull request #14046 from hidenorly/AddFP32FallbackSupportOnSdVaeApprox    AUTOMATIC1111    2023-12-02    1    -0/+15
|\ \ \
| | | |   Add FP32 fallback support on sd_vae_approx
| * | | Fix the Ruff error about unused import    hidenorly    2023-11-28    1    -1/+0
| | | |
| * | | Add FP32 fallback support on torch.nn.functional.interpolate    hidenorly    2023-11-28    1    -0/+16
| | | |
| | | |   This tries to execute interpolate with FP32 if it fails. The background is that on some
| | | |   environments, such as Mx chip MacOS devices, we get an error as follows:
| | | |
| | | |   ```
| | | |   "torch/nn/functional.py", line 3931, in interpolate
| | | |   return torch._C._nn.upsample_nearest2d(input, output_size, scale_factors)
| | | |   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| | | |   RuntimeError: "upsample_nearest2d_channels_last" not implemented for 'Half'
| | | |   ```
| | | |
| | | |   In this case, `--no-half` doesn't help. Therefore this commit adds an FP32 fallback
| | | |   execution to solve it. Note that `upsample_nearest2d` is called from
| | | |   `torch.nn.functional.interpolate`, and the fallback for torch.nn.functional.interpolate
| | | |   is necessary in `modules/sd_vae_approx.py`'s `VAEApprox.forward` and in
| | | |   `repositories/stable-diffusion-stability-ai/ldm/modules/diffusionmodules/openaimodel.py`'s
| | | |   `Upsample.forward`.
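The fallback pattern being described, as a minimal standalone sketch (not the exact patch that later moved to mac_specific.py): try the call at the tensor's own precision, and if the backend has no half-precision kernel, redo it in float32 and cast the result back.

```python
# Sketch only: FP32 fallback wrapper around torch.nn.functional.interpolate.
import torch
import torch.nn.functional as F


def interpolate_with_fp32_fallback(x: torch.Tensor, **kwargs) -> torch.Tensor:
    try:
        return F.interpolate(x, **kwargs)
    except RuntimeError:
        # e.g. "upsample_nearest2d_channels_last" not implemented for 'Half' on MPS
        return F.interpolate(x.to(torch.float32), **kwargs).to(x.dtype)
```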
| * | | Revert "Add FP32 fallback support on sd_vae_approx"    hidenorly    2023-11-28    1    -7/+1
| | | |
| | | |   This reverts commit 58c19545c83fa6925c9ce2216ee64964eb5129ce,
| | | |   since the modification is expected to move to mac_specific.py
| | | |   (https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/14046#issuecomment-1826731532)
| * | | Add FP32 fallback support on sd_vae_approx    hidenorly    2023-11-20    1    -1/+7
| | | |
| | | |   This tries to execute interpolate with FP32 if it fails. The background is that on some
| | | |   environments, such as Mx chip MacOS devices, we get an error as follows:
| | | |
| | | |   ```
| | | |   "torch/nn/functional.py", line 3931, in interpolate
| | | |   return torch._C._nn.upsample_nearest2d(input, output_size, scale_factors)
| | | |   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| | | |   RuntimeError: "upsample_nearest2d_channels_last" not implemented for 'Half'
| | | |   ```
| | | |
| | | |   In this case, `--no-half` doesn't help. Therefore this commit adds an FP32 fallback
| | | |   execution to solve it. Note that the submodule may require additional modifications;
| | | |   the following is an example modification in another submodule
| | | |   (repositories/stable-diffusion-stability-ai/ldm/modules/diffusionmodules/openaimodel.py):
| | | |
| | | |   ```python
| | | |   class Upsample(nn.Module):
| | | |       ..snip..
| | | |       def forward(self, x):
| | | |           assert x.shape[1] == self.channels
| | | |           if self.dims == 3:
| | | |               x = F.interpolate(
| | | |                   x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode="nearest"
| | | |               )
| | | |           else:
| | | |               try:
| | | |                   x = F.interpolate(x, scale_factor=2, mode="nearest")
| | | |               except Exception:
| | | |                   x = F.interpolate(x.to(th.float32), scale_factor=2, mode="nearest").to(x.dtype)
| | | |           if self.use_conv:
| | | |               x = self.conv(x)
| | | |           return x
| | | |       ..snip..
| | | |   ```
| | | |
| | | |   You can see the FP32 fallback execution, the same as in sd_vae_approx.py.
* | | | Merge pull request #14170 from MrCheeze/sd-turbo    AUTOMATIC1111    2023-12-02    2    -7/+19
|\ \ \ \
| | | | |   Add support for SD 2.1 Turbo
| * | | | Add support for SD 2.1 Turbo, by converting the state dict from SGM to LDM on load    MrCheeze    2023-12-02    1    -4/+13
| | | | |
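A small illustration of what converting a state dict from SGM to LDM key naming on load can look like; the prefix pair below is an assumed example, not the actual mapping used by the webui.

```python
# Sketch only: remap checkpoint keys from one naming scheme to another.
def convert_sgm_to_ldm(state_dict: dict) -> dict:
    prefix_map = {
        # illustrative mapping, not the real table
        "conditioner.embedders.0.model.": "cond_stage_model.model.",
    }
    converted = {}
    for key, value in state_dict.items():
        for sgm_prefix, ldm_prefix in prefix_map.items():
            if key.startswith(sgm_prefix):
                key = ldm_prefix + key[len(sgm_prefix):]
                break
        converted[key] = value
    return converted
```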
| * | | | Fix bug where is_using_v_parameterization_for_sd2 fails because the sd_hijack is only partially undone    MrCheeze    2023-12-02    1    -3/+6
| | |/ /
| |/| |
* | | | split UI settings page into many    AUTOMATIC1111    2023-12-02    1    -25/+32
| | | |
* | | | infotext updates: add option to disregard certain infotext fields, add option to not include VAE in infotext, add explanation to infotext settings page, move some options to infotext settings page    AUTOMATIC1111    2023-12-02    4    -12/+41
|/ / /
* | | add categories to settings    AUTOMATIC1111    2023-11-26    2    -28/+96
| | |
* | | json.dump(ensure_ascii=False)    w-e-w    2023-11-26    5    -5/+5
| | |
| | |   improve json readability
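By way of illustration (standard Python behaviour): json.dumps escapes non-ASCII characters by default, while ensure_ascii=False keeps them human-readable in the saved file.

```python
import json

data = {"prompt": "桜の木"}  # any non-ASCII text

print(json.dumps(data))                      # {"prompt": "\u685c\u306e\u6728"}
print(json.dumps(data, ensure_ascii=False))  # {"prompt": "桜の木"}
```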
* | | compact prompt layout: preserve scroll when switching between lora tabs    AUTOMATIC1111    2023-11-26    1    -1/+4
| | |
* | | Merge pull request #14059 from akx/upruff    AUTOMATIC1111    2023-11-26    1    -1/+1
|\ \ \
| | | |   Update Ruff to 0.1.6
| * | | Simplify restart_sampler (suggested by ruff)    Aarni Koskela    2023-11-22    1    -1/+1
| | | |
* | | | do not save HTML explanations from options page to config    AUTOMATIC1111    2023-11-26    1    -2/+2
| | | |
* | | | Merge pull request #14084 from wfjsw/move-from-sysinfo-to-errors    AUTOMATIC1111    2023-11-26    2    -19/+17
|\ \ \ \
| | | | |   Move exception_records related methods to errors.py
| * | | | remove traceback in sysinfo    Jabasukuriputo Wang    2023-11-24    1    -1/+0
| | | | |
| * | | | Move exception_records related methods to errors.py    Jabasukuriputo Wang    2023-11-24    2    -18/+17
| | | | |
* | | | | Merge branch 'hypertile-in-sample' into dev    AUTOMATIC1111    2023-11-26    3    -403/+13
|\ \ \ \ \
| * | | | | rework hypertile into a built-in extension    AUTOMATIC1111    2023-11-26    2    -32/+13
| | | | | |
| * | | | | move file    AUTOMATIC1111    2023-11-26    1    -371/+0
| | | | | |
* | | | | | Merge pull request #13948 from aria1th/hypertile-in-sample    AUTOMATIC1111    2023-11-26    3    -19/+404
|\| | | | |
| |_|/ / /
|/| | | |   support HyperTile optimization
| * | | | fix double gc and decoding with unet context    aria1th    2023-11-17    1    -3/+2
| | | | |
| * | | | set empty value for SD XL 3rd layer    aria1th    2023-11-17    1    -0/+1
| | | | |
| * | | | Fix inverted option issue    aria1th    2023-11-17    1    -2/+2
| | | | |
| | | | |   I'm pretty sure I was sleepy while implementing this
| * | | | Fix critical issue - unet apply    aria1th    2023-11-17    1    -4/+4
| | | | |
| * | | | fix ruff - add newline    AngelBottomless    2023-11-16    1    -1/+1
| | | | |
| * | | | convert/add hypertile options    AngelBottomless    2023-11-16    3    -10/+53
| | | | |
| * | | | copy LDM VAE key from XL    aria1th    2023-11-15    1    -0/+1
| | | | |
| * | | | Implement Hypertile    aria1th    2023-11-15    2    -40/+358
| | | | |
| | | | |   Co-Authored-By: Kieran Hunt <kph@hotmail.ca>
| * | | | add hyperTile    aria1th    2023-11-11    2    -3/+26
| | | | |
| | | | |   https://github.com/tfernd/HyperTile
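A compact sketch of the HyperTile idea (see the linked repository): reshape the flattened latent tokens into tiles so self-attention runs per tile rather than over the whole image. The helper names and the use of einops below are illustrative, not the extension's actual API.

```python
# Sketch only: split a (batch, h*w, channels) token grid into square tiles and back.
# Assumes h and w are divisible by `tile`.
import torch
from einops import rearrange


def tile_tokens(x: torch.Tensor, h: int, w: int, tile: int) -> torch.Tensor:
    # (b, h*w, c) -> (b * num_tiles, tile*tile, c): attention now costs O(tile^4) per tile
    return rearrange(x, "b (nh th nw tw) c -> (b nh nw) (th tw) c",
                     nh=h // tile, th=tile, nw=w // tile, tw=tile)


def untile_tokens(x: torch.Tensor, h: int, w: int, tile: int) -> torch.Tensor:
    # inverse of tile_tokens: restore the original row-major token order
    return rearrange(x, "(b nh nw) (th tw) c -> b (nh th nw tw) c",
                     nh=h // tile, th=tile, nw=w // tile, tw=tile)
```

Attention is then computed on the (tile*tile)-token chunks, and untile_tokens restores the original layout afterwards.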
* | | | fix [Bug]: (Dev Branch) Placing "Dimensions" first in "ui_reorder_list" prevents start #14047    AUTOMATIC1111    2023-11-21    1    -6/+6
| |_|/ /
|/| | |
* | | | Merge pull request #14009 from AUTOMATIC1111/Option-to-show-batch-img2img-results-in-UI    AUTOMATIC1111    2023-11-20    2    -4/+21
|\ \ \ \
| | | | |   Option to show batch img2img results in UI
| * | | | Option to show batch img2img results in UI    w-e-w    2023-11-19    2    -4/+21
| |/ / /
| | | |   shared.opts.img2img_batch_show_results_limit limits the number of images returned to the UI for batch img2img
| | | |   default limit: 32
| | | |   0: no images are shown
| | | |   -1: unlimited, all images are shown
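The option's semantics, as a tiny sketch; the helper function below is hypothetical, and only the option values (32, 0, -1) come from the commit message.

```python
# Sketch only: apply a "show results" limit to a batch of processed images.
from typing import List


def limit_batch_results(images: List, limit: int) -> List:
    if limit == 0:    # show no images in the UI
        return []
    if limit == -1:   # unlimited: show all images
        return images
    return images[:limit]  # otherwise keep at most `limit` images (default 32)
```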
* | | | Merge branch 'dag' into dev    AUTOMATIC1111    2023-11-20    2    -153/+148
|\ \ \ \
| * | | | rework extensions metadata: use custom sorter that doesn't mess the order as much and ignores cyclic errors, use classes with named fields instead of dictionaries, eliminate some duplicated code    AUTOMATIC1111    2023-11-20    2    -153/+148
| | | | |
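A minimal sketch of such a sorter, assuming hypothetical names (stable_topo_sort, depends_on): keep the caller's order where possible, place dependencies first, and skip over cycles instead of raising.

```python
# Sketch only: order-preserving topological sort that tolerates cyclic dependencies.
from typing import Dict, List, Set


def stable_topo_sort(items: List[str], depends_on: Dict[str, Set[str]]) -> List[str]:
    items_set = set(items)
    result: List[str] = []
    visiting: Set[str] = set()
    done: Set[str] = set()

    def visit(name: str):
        if name in done or name in visiting:
            return  # already placed, or part of a cycle -> ignore instead of failing
        visiting.add(name)
        for dep in depends_on.get(name, set()):
            if dep in items_set:
                visit(dep)
        visiting.discard(name)
        done.add(name)
        result.append(name)

    for item in items:  # iterate in the original order to preserve it where possible
        visit(item)
    return result


# e.g. stable_topo_sort(["a", "b", "c"], {"a": {"c"}}) -> ["c", "a", "b"]
```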
* | | | | Merge pull request #13944 from wfjsw/dag    AUTOMATIC1111    2023-11-20    2    -24/+188
|\| | | |
| | | | |   implementing script metadata and DAG sorting mechanism
| * | | | use metadata.ini for meta filename    wfjsw    2023-11-19    1    -6/+6
| | | | |
| * | | | bug fix    wfjsw    2023-11-11    1    -7/+18
| | | | |
| * | | | fix    wfjsw    2023-11-11    1    -1/+0
| | | | |
| * | | | allow comma and whitespace as separator    wfjsw    2023-11-11    2    -6/+9
| | | | |
| * | | | remove the assumption of same name    wfjsw    2023-11-11    1    -51/+30
| | | | |