stable-diffusion-webui-gfx803.git (branch: master)
stable-diffusion-webui by AUTOMATIC1111 with patches for gfx803 GPU and Dockerfile
Commit log for path: root/modules
| Age | Commit message | Author | Lines |
|---|---|---|---|
| 2022-10-08 | Merge branch 'master' into dev/deepdanbooru | Greendayle | -4/+17 |
| 2022-10-08 | add --force-enable-xformers option and also add messages to console regarding cross attention optimizations | AUTOMATIC | -1/+6 |
| 2022-10-08 | add fallback for xformers_attnblock_forward | AUTOMATIC | -1/+4 |
| 2022-10-08 | made deepdanbooru optional, added to readme, automatic download of deepbooru model | Greendayle | -17/+23 |
| 2022-10-08 | alternate prompt | Artem Zagidulin | -2/+7 |
| 2022-10-08 | check for ampere without destroying the optimizations. again. | C43H66N12O12S2 | -4/+3 |
| 2022-10-08 | check for ampere | C43H66N12O12S2 | -3/+4 |
| 2022-10-08 | Merge branch 'master' into dev/deepdanbooru | Greendayle | -1/+1 |
| 2022-10-08 | why did you do this | AUTOMATIC | -1/+1 |
| 2022-10-08 | fix conflicts | Greendayle | -46/+159 |
| 2022-10-08 | Fixed typo | Milly | -1/+1 |
| 2022-10-08 | restore old opt_split_attention/disable_opt_split_attention logic | AUTOMATIC | -1/+1 |
| 2022-10-08 | simplify xfrmers options: --xformers to enable and that's it | AUTOMATIC | -9/+15 |
| 2022-10-08 | emergency fix for xformers (continue + shared) | AUTOMATIC | -8/+8 |
| 2022-10-08 | Merge pull request #1851 from C43H66N12O12S2/flash (xformers attention) | AUTOMATIC1111 | -6/+45 |
| 2022-10-08 | Update sd_hijack.py | C43H66N12O12S2 | -1/+1 |
| 2022-10-08 | update sd_hijack_opt to respect new env variables | C43H66N12O12S2 | -3/+8 |
| 2022-10-08 | add xformers_available shared variable | C43H66N12O12S2 | -1/+1 |
| 2022-10-08 | default to split attention if cuda is available and xformers is not | C43H66N12O12S2 | -2/+2 |
| 2022-10-08 | fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped | MrCheeze | -1/+5 |
| 2022-10-08 | fix glob path in hypernetwork.py | ddPn08 | -1/+1 |
| 2022-10-08 | fix AND broken for long prompts | AUTOMATIC | -0/+9 |
| 2022-10-08 | fix bugs related to variable prompt lengths | AUTOMATIC | -12/+37 |
| 2022-10-08 | do not let user choose his own prompt token count limit | AUTOMATIC | -21/+12 |
| 2022-10-08 | check specifically for skipped | Trung Ngo | -7/+3 |
| 2022-10-08 | Add button to skip the current iteration | Trung Ngo | -0/+21 |
| 2022-10-08 | Merge remote-tracking branch 'origin/master' | AUTOMATIC | -1/+5 |
| 2022-10-08 | let user choose his own prompt token count limit | AUTOMATIC | -8/+16 |
| 2022-10-08 | fix: handles when state_dict does not exist | leko | -1/+5 |
| 2022-10-08 | use new attnblock for xformers path | C43H66N12O12S2 | -1/+1 |
| 2022-10-08 | Update sd_hijack_optimizations.py | C43H66N12O12S2 | -1/+1 |
| 2022-10-08 | add xformers attnblock and hypernetwork support | C43H66N12O12S2 | -2/+18 |
| 2022-10-08 | Add hypernetwork support to split cross attention v1 (* Add hypernetwork support to split_cross_attention_forward_v1 * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device) | brkirch | -5/+15 |
| 2022-10-08 | delete broken and unnecessary aliases | C43H66N12O12S2 | -6/+4 |
| 2022-10-08 | switch to the proper way of calling xformers | C43H66N12O12S2 | -25/+3 |
| 2022-10-07 | linux test | Greendayle | -2/+3 |
| 2022-10-07 | even more powerfull fix | Greendayle | -2/+7 |
| 2022-10-07 | loading tf only in interrogation process | Greendayle | -3/+4 |
| 2022-10-07 | Merge branch 'master' into dev/deepdanbooru | Greendayle | -177/+513 |
| 2022-10-07 | make it possible to use hypernetworks without opt split attention | AUTOMATIC | -10/+38 |
| 2022-10-07 | do not stop working on failed hypernetwork load | AUTOMATIC | -2/+9 |
| 2022-10-07 | support loading VAE | AUTOMATIC | -0/+8 |
| 2022-10-07 | added support for hypernetworks (???) | AUTOMATIC | -3/+78 |
| 2022-10-07 | Update sd_hijack.py | C43H66N12O12S2 | -1/+1 |
| 2022-10-07 | Update sd_hijack.py | C43H66N12O12S2 | -2/+2 |
| 2022-10-07 | Update sd_hijack.py | C43H66N12O12S2 | -2/+1 |
| 2022-10-07 | Update shared.py | C43H66N12O12S2 | -0/+1 |
| 2022-10-07 | Update sd_hijack.py | C43H66N12O12S2 | -4/+9 |
| 2022-10-07 | add xformers attention | C43H66N12O12S2 | -1/+38 |
| 2022-10-06 | karras samplers for img2img? | AUTOMATIC | -2/+4 |
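Several of these commits ("add fallback for xformers_attnblock_forward", "emergency fix for xformers") revolve around wrapping an optimized attention path in a fallback, since xformers can fail at runtime on unsupported hardware. A minimal sketch of that try/except pattern, with hypothetical function names standing in for the repository's actual code:

```python
# Illustrative sketch only: fast_forward and fallback_forward are hypothetical
# stand-ins, not the functions from sd_hijack_optimizations.py.

def fast_forward(x):
    # Stand-in for an xformers-accelerated path that may fail at runtime
    # (e.g. unsupported GPU, dtype, or missing library).
    raise NotImplementedError("xformers not available on this GPU")

def fallback_forward(x):
    # Stand-in for the slower but always-available implementation.
    return [v * 2 for v in x]

def attnblock_forward(x):
    """Try the optimized path first; fall back to the safe path on any failure."""
    try:
        return fast_forward(x)
    except Exception:
        return fallback_forward(x)

print(attnblock_forward([1, 2, 3]))  # optimized path raises, so this prints [2, 4, 6]
```

The same shape explains "default to split attention if cuda is available and xformers is not": pick the best implementation that is actually usable, and never let the optimization's absence break generation.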