stable-diffusion-webui-gfx803.git (branch: master)
stable-diffusion-webui by AUTOMATIC1111 with patches for gfx803 GPU and Dockerfile
path: root/modules/textual_inversion
Age        | Commit message                                                                    | Author             | Lines (-/+)
2023-01-07 | CLIP hijack rework                                                                | AUTOMATIC          | -1/+0
2023-01-06 | rework saving training params to file #6372                                       | AUTOMATIC          | -20/+27
2023-01-06 | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-rev...  | AUTOMATIC1111      | -1/+25
2023-01-06 | allow loading embeddings from subdirectories                                      | Faber              | -11/+12
2023-01-05 | typo in TI                                                                        | Kuma               | -1/+1
2023-01-05 | Include model in log file. Exclude directory.                                     | timntorres         | -13/+9
2023-01-05 | Clean up ti, add same behavior to hypernetwork.                                   | timntorres         | -5/+9
2023-01-05 | Add option to save ti settings to file.                                           | timntorres         | -3/+27
2023-01-04 | Merge branch 'master' into gradient-clipping                                      | AUTOMATIC1111      | -219/+354
2023-01-04 | use shared function from processing for creating dummy mask when training inp... | AUTOMATIC          | -24/+9
2023-01-04 | fix the merge                                                                     | AUTOMATIC          | -9/+5
2023-01-04 | Merge branch 'master' into inpaint_textual_inversion                              | AUTOMATIC1111      | -285/+439
2023-01-04 | Merge pull request #6253 from Shondoit/ti-optim                                   | AUTOMATIC1111      | -8/+32
2023-01-03 | add job info to modules                                                           | Vladimir Mandic    | -0/+2
2023-01-03 | Save Optimizer next to TI embedding                                               | Shondoit           | -8/+32
2023-01-02 | feat(api): return more data for embeddings                                        | Philpax            | -4/+4
2023-01-02 | fix the issue with training on SD2.0                                              | AUTOMATIC          | -2/+1
2022-12-31 | changed embedding accepted shape detection to use existing code and support t... | AUTOMATIC          | -24/+6
2022-12-31 | validate textual inversion embeddings                                             | Vladimir Mandic    | -5/+38
2022-12-24 | fix F541 f-string without any placeholders                                        | Yuval Aboulafia    | -1/+1
2022-12-14 | Fix various typos                                                                 | Jim Hays           | -13/+13
2022-12-03 | Merge branch 'master' into racecond_fix                                           | AUTOMATIC1111      | -274/+381
2022-12-03 | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes                | AUTOMATIC1111      | -3/+3
2022-12-02 | Fix divide by 0 error                                                             | PhytoEpidemic      | -3/+3
2022-11-30 | Use devices.autocast instead of torch.autocast                                    | brkirch            | -3/+3
2022-11-27 | Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords        | AUTOMATIC1111      | -1/+1
2022-11-27 | Merge remote-tracking branch 'flamelaw/master'                                    | AUTOMATIC          | -189/+274
2022-11-27 | set TI AdamW default weight decay to 0                                            | flamelaw           | -1/+1
2022-11-26 | Add support Stable Diffusion 2.0                                                  | AUTOMATIC          | -4/+3
2022-11-23 | small fixes                                                                       | flamelaw           | -1/+1
2022-11-21 | fix pin_memory with different latent sampling method                              | flamelaw           | -10/+20
2022-11-20 | moved deepdanbooru to pure pytorch implementation                                 | AUTOMATIC          | -8/+4
2022-11-20 | fix random sampling with pin_memory                                               | flamelaw           | -1/+1
2022-11-20 | remove unnecessary comment                                                        | flamelaw           | -9/+0
2022-11-20 | Gradient accumulation, autocast fix, new latent sampling method, etc              | flamelaw           | -185/+269
2022-11-19 | Merge pull request #4812 from space-nuko/feature/interrupt-preprocessing          | AUTOMATIC1111      | -1/+1
2022-11-19 | change StableDiffusionProcessing to internally use sampler name instead of sa...  | AUTOMATIC          | -2/+2
2022-11-17 | Add interrupt button to preprocessing                                             | space-nuko         | -1/+1
2022-11-13 | resolve [name] after resolving [filewords] in training                            | parasi             | -1/+1
2022-11-11 | Merge pull request #4117 from TinkTheBoush/master                                 | AUTOMATIC1111      | -1/+6
2022-11-11 | Update dataset.py                                                                 | KyuSeok Jung       | -1/+1
2022-11-11 | Update dataset.py                                                                 | KyuSeok Jung       | -1/+1
2022-11-11 | adding tag drop out option                                                        | KyuSeok Jung       | -4/+4
2022-11-09 | Merge branch 'master' into gradient-clipping                                      | Muhammad Rizqi Nur | -69/+93
2022-11-08 | move functions out of main body for image preprocessing for easier hijacking     | AUTOMATIC          | -69/+93
2022-11-05 | Simplify grad clip                                                                | Muhammad Rizqi Nur | -9/+7
2022-11-04 | change option position to Training setting                                       | TinkTheBoush       | -5/+4
2022-11-04 | Fixes race condition in training when VAE is unloaded                            | Fampai             | -0/+5
2022-11-02 | Merge branch 'master' into gradient-clipping                                      | Muhammad Rizqi Nur | -2/+15
2022-11-02 | Merge branch 'master' into master                                                 | KyuSeok Jung       | -2/+15