Commit message (Author, Date; Files changed, Lines -/+)
* Fix typo (Nicolas Noullet, 2022-10-09; 1 file, -1/+1)
|
* remove line break (frostydad, 2022-10-09; 1 file, -1/+0)
|
* Fix incorrect sampler name in output (frostydad, 2022-10-09; 2 files, -8/+17)
|
* Fix VRAM Issue by only loading in hypernetwork when selected in settings (Fampai, 2022-10-09; 4 files, -16/+23)
|
* Merge pull request #1752 from Greendayle/dev/deepdanbooru (AUTOMATIC1111, 2022-10-09; 7 files, -6/+110)
|\
| |     Added DeepDanbooru interrogator
| * Merge branch 'master' into dev/deepdanbooru (Greendayle, 2022-10-08; 6 files, -6/+23)
| |\
| * | made deepdanbooru optional, added to readme, automatic download of deepbooru model (Greendayle, 2022-10-08; 7 files, -23/+29)
| | |
| * | Merge branch 'master' into dev/deepdanbooru (Greendayle, 2022-10-08; 1 file, -1/+1)
| |\ \
| * \ \ fix conflicts (Greendayle, 2022-10-08; 22 files, -57/+403)
| |\ \ \
| * | | | linux test (Greendayle, 2022-10-07; 1 file, -2/+3)
| | | | |
| * | | | even more powerfull fix (Greendayle, 2022-10-07; 1 file, -2/+7)
| | | | |
| * | | | loading tf only in interrogation process (Greendayle, 2022-10-07; 1 file, -3/+4)
| | | | |
| * | | | Merge branch 'master' into dev/deepdanbooru (Greendayle, 2022-10-07; 21 files, -187/+615)
| |\ \ \ \
| * | | | | removing underscores and colons (Greendayle, 2022-10-05; 1 file, -1/+1)
| | | | | |
| * | | | | better model search (Greendayle, 2022-10-05; 1 file, -2/+9)
| | | | | |
| * | | | | removing problematic tag (Greendayle, 2022-10-05; 1 file, -3/+2)
| | | | | |
| * | | | | deepdanbooru interrogator (Greendayle, 2022-10-05; 6 files, -6/+91)
| | | | | |
* | | | | | Support `Download` for txt files. (aoirusann, 2022-10-09; 2 files, -3/+41)
| | | | | |
* | | | | | Add `Download` & `Download as zip` (aoirusann, 2022-10-09; 1 file, -5/+34)
| | | | | |
* | | | | | fixed incorrect message about loading config; thanks anon! (AUTOMATIC, 2022-10-09; 1 file, -1/+1)
| | | | | |
* | | | | | make main model loading and model merger use the same code (AUTOMATIC, 2022-10-09; 2 files, -8/+12)
| | | | | |
* | | | | | support loading .yaml config with same name as model (AUTOMATIC, 2022-10-08; 2 files, -8/+24)
| | | | | |     support EMA weights in processing (????)
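The ".yaml config with same name as model" commit above boils down to: when loading a checkpoint, prefer a config file that sits next to it with the same base name, falling back to the default config otherwise. A minimal stdlib-only sketch of that lookup; `find_model_config` and its arguments are hypothetical names for illustration, not the actual webui code:

```python
import os

def find_model_config(checkpoint_path: str, default_config: str) -> str:
    """Prefer a .yaml next to the checkpoint (model.ckpt -> model.yaml).

    Hypothetical helper sketching the commit's idea; the real
    implementation in the webui differs in detail.
    """
    config = os.path.splitext(checkpoint_path)[0] + ".yaml"
    if os.path.isfile(config):
        return config
    return default_config
```

This keeps per-model configs (e.g. for models trained with a non-default architecture) working without any extra command-line flags.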
* | | | | | chore: Fix typos (Aidan Holland, 2022-10-08; 10 files, -17/+17)
| | | | | |
* | | | | | Break after finding the local directory of stable diffusion (Edouard Leurent, 2022-10-08; 1 file, -0/+1)
| | | | | |     Otherwise, we may override it with one of the next two path (. or ..)
| | | | | |     if it is present there, and then the local paths of other modules
| | | | | |     (taming transformers, codeformers, etc.) wont be found in sd_path/../.
| | | | | |     Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
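The bug described in that commit body can be sketched in a few lines: the directory search iterates over candidate parent locations, and without a `break` a later candidate (such as `.` or `..`) that also contains the directory silently overrides the earlier, correct match. A stdlib-only illustration under assumed names (not the actual launch code):

```python
import os

def search_for_directory(name, candidates):
    """Return the first candidate parent containing `name`, or None.

    Illustrative sketch of the fix: without the `break`, a later
    candidate like '.' or '..' could override an earlier match.
    """
    found = None
    for parent in candidates:
        path = os.path.join(parent, name)
        if os.path.isdir(path):
            found = os.path.abspath(path)
            break  # the one-line fix: stop at the first hit
    return found
```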
* | | | | | add 'Ignore last layers of CLIP model' option as a parameter to the infotext (AUTOMATIC, 2022-10-08; 1 file, -1/+5)
| | | | | |
* | | | | | make --force-enable-xformers work without needing --xformers (AUTOMATIC, 2022-10-08; 1 file, -1/+1)
| | | | | |
* | | | | | Added ability to ignore last n layers in FrozenCLIPEmbedder (Fampai, 2022-10-08; 2 files, -2/+10)
| | | | | |
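The two CLIP commits above implement what is commonly called "CLIP skip": instead of the text encoder's final layer output, the hidden state from an earlier layer is used as the conditioning. A framework-free sketch of the selection logic, with illustrative names rather than the actual FrozenCLIPEmbedder patch:

```python
def select_hidden_state(hidden_states, ignore_last_layers):
    """Pick the hidden state `ignore_last_layers` from the end.

    hidden_states: per-layer encoder outputs, hidden_states[-1] being
    the final layer. ignore_last_layers == 0 keeps the usual final
    output; == 1 uses the penultimate layer, and so on.
    Illustrative only; the real change patches FrozenCLIPEmbedder.
    """
    if ignore_last_layers <= 0:
        return hidden_states[-1]
    return hidden_states[-(ignore_last_layers + 1)]
```

Recording the setting in the infotext (the first commit) matters because images generated with different CLIP-skip values are not reproducible without it.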
* | | | | | Update ui.py (DepFA, 2022-10-08; 1 file, -1/+1)
| | | | | |
* | | | | | TI preprocess wording (DepFA, 2022-10-08; 1 file, -3/+3)
| | | | | |     I had to check the code to work out what splitting was 🤷🏿
| |_|_|_|/
|/| | | |
* | | | | add --force-enable-xformers option and also add messages to console regarding cross attention optimizations (AUTOMATIC, 2022-10-08; 2 files, -1/+6)
| | | | |
* | | | | add fallback for xformers_attnblock_forward (AUTOMATIC, 2022-10-08; 1 file, -1/+4)
| | | | |
* | | | | alternate prompt (Artem Zagidulin, 2022-10-08; 1 file, -2/+7)
| | | | |
* | | | | Add GZipMiddleware to root demo (DepFA, 2022-10-08; 1 file, -1/+5)
| | | | |
* | | | | check for ampere without destroying the optimizations. again. (C43H66N12O12S2, 2022-10-08; 1 file, -4/+3)
| | | | |
* | | | | check for ampere (C43H66N12O12S2, 2022-10-08; 1 file, -3/+4)
| | | | |
* | | | | check for 3.10 (C43H66N12O12S2, 2022-10-08; 1 file, -1/+1)
| |_|_|/
|/| | |
* | | | why did you do this (AUTOMATIC, 2022-10-08; 1 file, -1/+1)
| |_|/
|/| |
* | | Fixed typo (Milly, 2022-10-08; 1 file, -1/+1)
| | |
* | | restore old opt_split_attention/disable_opt_split_attention logic (AUTOMATIC, 2022-10-08; 1 file, -1/+1)
| | |
* | | simplify xfrmers options: --xformers to enable and that's it (AUTOMATIC, 2022-10-08; 4 files, -10/+16)
| | |
* | | emergency fix for xformers (continue + shared) (AUTOMATIC, 2022-10-08; 1 file, -8/+8)
| | |
* | | Merge pull request #1851 from C43H66N12O12S2/flash (AUTOMATIC1111, 2022-10-08; 6 files, -6/+55)
|\ \ \
| | | |     xformers attention
| * | | Update sd_hijack.py (C43H66N12O12S2, 2022-10-08; 1 file, -1/+1)
| | | |
| * | | Update requirements_versions.txt (C43H66N12O12S2, 2022-10-08; 1 file, -0/+1)
| | | |
| * | | Update launch.py (C43H66N12O12S2, 2022-10-08; 1 file, -1/+1)
| | | |
| * | | update sd_hijack_opt to respect new env variables (C43H66N12O12S2, 2022-10-08; 1 file, -3/+8)
| | | |
| * | | add xformers_available shared variable (C43H66N12O12S2, 2022-10-08; 1 file, -1/+1)
| | | |
| * | | default to split attention if cuda is available and xformers is not (C43H66N12O12S2, 2022-10-08; 1 file, -2/+2)
| | | |
| * | | check for OS and env variable (C43H66N12O12S2, 2022-10-08; 1 file, -2/+7)
| | | |
| * | | Update requirements.txt (C43H66N12O12S2, 2022-10-08; 1 file, -1/+0)
| | | |