stable-diffusion-webui-gfx803.git (branch: master)
stable-diffusion-webui by AUTOMATIC1111 with patches for gfx803 GPU and Dockerfile
path: root/modules/sd_hijack.py
Each entry: commit message (author, date, files changed, lines removed/added)

* removed aesthetic gradients as built-in (AUTOMATIC, 2022-10-22, 1 file, -1/+0)
* make aestetic embedding ciompatible with prompts longer than 75 tokens (AUTOMATIC, 2022-10-21, 1 file, -1/+1)
* Merge branch 'ae' (AUTOMATIC, 2022-10-21, 1 file, -15/+15)
|\
| * ui fix, re organization of the code (MalumaDev, 2022-10-16, 1 file, -97/+5)
| * ui fix (MalumaDev, 2022-10-16, 1 file, -1/+1)
| * ui fix (MalumaDev, 2022-10-16, 1 file, -2/+1)
| * Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolv... (MalumaDev, 2022-10-15, 1 file, -2/+2)
| |\
| | * Merge branch 'master' into test_resolve_conflicts (MalumaDev, 2022-10-15, 1 file, -2/+2)
| | |\
| * | | fixed dropbox update (MalumaDev, 2022-10-15, 1 file, -2/+2)
| |/ /
| * | fix to tokens lenght, addend embs generator, add new features to edit the emb... (MalumaDev, 2022-10-15, 1 file, -38/+73)
| * | init (MalumaDev, 2022-10-14, 1 file, -2/+78)
* | | Update sd_hijack.py (C43H66N12O12S2, 2022-10-18, 1 file, -1/+1)
* | | use legacy attnblock (C43H66N12O12S2, 2022-10-18, 1 file, -1/+1)
| |/
|/
| * | Update sd_hijack.py (C43H66N12O12S2, 2022-10-15, 1 file, -1/+1)
|/
* fix iterator bug for #2295 (AUTOMATIC, 2022-10-12, 1 file, -4/+4)
* Account when lines are mismatched (hentailord85ez, 2022-10-12, 1 file, -1/+11)
* Add check for psutil (brkirch, 2022-10-11, 1 file, -2/+8)
* Add cross-attention optimization from InvokeAI (brkirch, 2022-10-11, 1 file, -1/+4)
* rename hypernetwork dir to hypernetworks to prevent clash with an old filenam... (AUTOMATIC, 2022-10-11, 1 file, -1/+1)
* Merge branch 'master' into hypernetwork-training (AUTOMATIC, 2022-10-11, 1 file, -30/+93)
|\
| * Comma backtrack padding (#2192) (hentailord85ez, 2022-10-11, 1 file, -1/+18)
| * allow pascal onwards (C43H66N12O12S2, 2022-10-10, 1 file, -1/+1)
| * Add back in output hidden states parameter (hentailord85ez, 2022-10-10, 1 file, -1/+1)
| * Pad beginning of textual inversion embedding (hentailord85ez, 2022-10-10, 1 file, -0/+5)
| * Unlimited Token Works (hentailord85ez, 2022-10-10, 1 file, -23/+46)
| * Removed unnecessary tmp variable (Fampai, 2022-10-09, 1 file, -4/+3)
| * Updated code for legibility (Fampai, 2022-10-09, 1 file, -2/+5)
| * Optimized code for Ignoring last CLIP layers (Fampai, 2022-10-09, 1 file, -8/+4)
| * Added ability to ignore last n layers in FrozenCLIPEmbedder (Fampai, 2022-10-08, 1 file, -2/+9)
| * add --force-enable-xformers option and also add messages to console regarding... (AUTOMATIC, 2022-10-08, 1 file, -1/+5)
| * check for ampere without destroying the optimizations. again. (C43H66N12O12S2, 2022-10-08, 1 file, -4/+3)
| * check for ampere (C43H66N12O12S2, 2022-10-08, 1 file, -3/+4)
| * why did you do this (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
| * restore old opt_split_attention/disable_opt_split_attention logic (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
| * simplify xfrmers options: --xformers to enable and that's it (AUTOMATIC, 2022-10-08, 1 file, -1/+1)
| * Merge pull request #1851 from C43H66N12O12S2/flash (AUTOMATIC1111, 2022-10-08, 1 file, -4/+6)
| |\
| | * Update sd_hijack.py (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| | * default to split attention if cuda is available and xformers is not (C43H66N12O12S2, 2022-10-08, 1 file, -2/+2)
| | * use new attnblock for xformers path (C43H66N12O12S2, 2022-10-08, 1 file, -1/+1)
| | * delete broken and unnecessary aliases (C43H66N12O12S2, 2022-10-08, 1 file, -6/+4)
| | * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -1/+1)
| | * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -2/+2)
| | * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -2/+1)
| | * Update sd_hijack.py (C43H66N12O12S2, 2022-10-07, 1 file, -4/+9)
| * | fix bug where when using prompt composition, hijack_comments generated before... (MrCheeze, 2022-10-08, 1 file, -1/+4)
| * | fix bugs related to variable prompt lengths (AUTOMATIC, 2022-10-08, 1 file, -5/+9)
| * | do not let user choose his own prompt token count limit (AUTOMATIC, 2022-10-08, 1 file, -13/+12)
| * | let user choose his own prompt token count limit (AUTOMATIC, 2022-10-08, 1 file, -6/+7)
* | | hypernetwork training mk1 (AUTOMATIC, 2022-10-07, 1 file, -1/+3)
|/ /
* / make it possible to use hypernetworks without opt split attention (AUTOMATIC, 2022-10-07, 1 file, -2/+4)
|/
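The listing above is a per-file commit graph as rendered by the repository's web front end. Assuming a local clone of the repository, a roughly equivalent view can be produced with plain git; `--numstat` prints the per-commit added/removed line counts that appear in each entry:

```shell
# Per-file history with graph, authors, short dates, and line counts.
# Assumes a local clone of stable-diffusion-webui-gfx803; run from the repo root.
git log --graph --date=short \
    --pretty=format:'%s (%an, %ad)' \
    --numstat \
    -- modules/sd_hijack.py
```

Note that `--numstat` reports counts as added/removed per path, whereas the web view formats them as -removed/+added.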