Age | Commit message | Author | Lines
2023-11-05add a visible checkbox to input accordionAUTOMATIC1111-25/+59
2023-11-05correct a typogibiee-1/+1
modify "defaul" to "default"
2023-11-04Merge pull request #2 from v0xie/network-oft-change-implv0xie-17/+27
Use same updown implementation for LyCORIS OFT as kohya-ss OFT
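The shared updown computation referenced above rotates the base weight by an orthogonal matrix. A minimal numpy sketch of the idea, assuming a Cayley parametrization (the names `cayley` and `oft_updown` are illustrative, not the webui's actual functions):

```python
import numpy as np

def cayley(q):
    # Cayley transform: any skew-symmetric Q maps to an orthogonal R.
    eye = np.eye(q.shape[0])
    return np.linalg.solve(eye + q, eye - q)  # R = (I + Q)^-1 (I - Q)

def oft_updown(weight, q):
    # "updown" is the delta added to the base weight during merging,
    # so that weight + updown == R @ weight.
    r = cayley(q)
    return r @ weight - weight
```

Because R is orthogonal, the rotation preserves the norm of each output direction, which is the core idea behind OFT-style finetuning.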
2023-11-04refactor: remove unused functionv0xie-47/+0
2023-11-04refactor: use same updown for both kohya OFT and LyCORIS diag-oftv0xie-17/+74
2023-11-04Merge branch 'dev' into test-fp8Kohaku-Blueleaf-67/+92
2023-11-03Merge pull request #1 from v0xie/oft-fasterv0xie-17/+135
Support LyCORIS diag-oft OFT implementation (minus MultiheadAttention layer), maintains support for kohya-ss OFT
2023-11-03refactor: move factorization to lyco_helpers, separate calc_updown for kohya and kbv0xie-101/+77
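For context, LyCORIS-style factorization splits a layer dimension into two factors, used to build block-diagonal structures. A rough sketch of such a helper (illustrative; not necessarily the exact `lyco_helpers` code):

```python
def factorization(dim, factor=-1):
    # Split `dim` into (m, n) with m <= n and m * n == dim.
    # If `factor` divides dim, use it; otherwise pick m as close
    # to sqrt(dim) as possible.
    if factor > 0 and dim % factor == 0:
        return factor, dim // factor
    m = int(dim ** 0.5)
    while dim % m != 0:
        m -= 1
    return m, dim // m
```

Choosing factors near the square root keeps the resulting blocks as square as possible, which minimizes parameter count for the block-diagonal form.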
2023-11-03skip multihead attn for nowv0xie-17/+37
2023-11-03Merge pull request #13718 from avantcontra/bugfix_gfpgan_custom_pathAUTOMATIC1111-5/+20
fix bug when using --gfpgan-models-path
2023-11-03Merge pull request #13733 from dben/patch-1AUTOMATIC1111-2/+15
Update prompts_from_file script to allow concatenating entries with the general prompt.
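The behavior described here can be pictured roughly as follows (a hedged sketch; `expand_prompts` and the comma-joining rule are assumptions for illustration, not the script's actual code):

```python
def expand_prompts(general_prompt, file_text):
    # For each non-empty line in the file, produce the general prompt
    # concatenated with that line's entry.
    prompts = []
    for line in file_text.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank lines
        prompts.append(f"{general_prompt}, {line}" if general_prompt else line)
    return prompts
```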
2023-11-03Merge pull request #13762 from wkpark/nextjobAUTOMATIC1111-2/+2
call state.jobnext() before postproces*()
2023-11-03Merge pull request #13797 from Meerkov/masterAUTOMATIC1111-1/+1
Fix #13796
2023-11-03Merge pull request #13829 from AUTOMATIC1111/paren-fixAUTOMATIC1111-1/+1
Fix parenthesis auto selection
2023-11-03add changelog entryAUTOMATIC1111-0/+5
2023-11-03Merge pull request #13839 from AUTOMATIC1111/httpx==0.24.1AUTOMATIC1111-0/+1
requirements_versions httpx==0.24.1
2023-11-04Update requirements_versions.txtw-e-w-0/+1
2023-11-03Fix parenthesis auto selectionmissionfloyd-1/+1
Fixes #13813
2023-11-02added accordion settings optionsEmily Zeng-250/+254
2023-11-02no idea what i'm doing, trying to support both type of OFT, kblueleaf diag_oft has MultiheadAttn which kohya's doesn't?, attempt create new module based off network_lora.py, errors about tensor dim mismatchv0xie-47/+145
2023-11-02detect diag_oft typev0xie-0/+7
2023-11-01test implementation based on kohaku diag-oft implementationv0xie-21/+38
2023-10-29Fix #13796Meerkov-1/+1
Fix comment error that makes understanding scheduling more confusing.
2023-10-29Remove blank line whitespaceNick Harrison-1/+1
2023-10-29Add assertions for checking additional settings freezing parametersNick Harrison-4/+19
2023-10-29Add new arguments to known command promptsNick Harrison-5/+6
2023-10-28Add MPS manual castKohakuBlueleaf-1/+5
2023-10-28ManualCast for 10/16 series gpuKohaku-Blueleaf-16/+64
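Pre-Turing GPUs (the GTX 10/16 series) lack reliable autocast support, so a manual cast wraps each op and converts inputs explicitly. A numpy stand-in for the pattern (names are illustrative; the real code wraps torch modules and dtypes):

```python
import numpy as np

def manual_cast(op, target_dtype):
    # Wrap an op so inputs are cast to the compute dtype up front,
    # doing by hand what torch.autocast would do automatically.
    def wrapped(x):
        if x.dtype != target_dtype:
            x = x.astype(target_dtype)
        return op(x)
    return wrapped
```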
2023-10-25call state.jobnext() before postproces*()Won-Kyu Park-2/+2
2023-10-25change torch versionKohaku-Blueleaf-2/+2
2023-10-25ignore mps for fp8Kohaku-Blueleaf-1/+3
2023-10-25Fix alphas cumprodKohaku-Blueleaf-2/+3
2023-10-25Fix alphas_cumprod dtypeKohaku-Blueleaf-0/+1
2023-10-25fp8 for TEKohaku-Blueleaf-0/+7
2023-10-24Fix lintKohaku-Blueleaf-1/+1
2023-10-24Add CPU fp8 supportKohaku-Blueleaf-6/+22
Since norm layer need fp32, I only convert the linear operation layer(conv2d/linear) And TE have some pytorch function not support bf16 amp in CPU. I add a condition to indicate if the autocast is for unet.
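The commit body above describes converting only the linear-operation layers while norm layers stay in fp32. The selection logic can be sketched abstractly (illustrative names; the real code walks torch modules and assigns actual dtypes):

```python
CONVERTIBLE = {"Linear", "Conv2d"}

def plan_dtypes(named_layers):
    # named_layers: iterable of (name, layer_type) pairs.
    # Norm layers need fp32, so only linear-op layers get fp8.
    return {name: ("fp8" if kind in CONVERTIBLE else "fp32")
            for name, kind in named_layers}
```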
2023-10-23linting issueDavid Benson-1/+1
2023-10-23Update prompts_from_file script to allow concatenating entries with the general prompt.David Benson-2/+15
2023-10-22style: conform stylev0xie-1/+1
2023-10-22fix: multiplier applied twice in finalize_updownv0xie-1/+22
2023-10-22refactor: remove used OFT functionsv0xie-72/+10
2023-10-21fix: use merge_weight to cache valuev0xie-17/+40
2023-10-21style: cleanup oftv0xie-75/+7
2023-10-21fix: support multiplier, no forward pass hookv0xie-10/+33
2023-10-21fix: return orig weights during updown, merge weights before forwardv0xie-21/+69
2023-10-21refactor: use forward hook instead of custom forwardv0xie-9/+24
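Using a forward hook rather than a custom forward leaves the module's own forward untouched and merges the network weight just before it runs. A minimal stand-in for the pattern, including merge-once caching as in the later `merge_weight` fix (all names here are illustrative, not torch's or the webui's actual API):

```python
class Module:
    # Tiny stand-in for a torch module that supports forward pre-hooks.
    def __init__(self, weight):
        self.weight = weight
        self._pre_hooks = []

    def register_forward_pre_hook(self, fn):
        self._pre_hooks.append(fn)

    def forward(self, x):
        for fn in self._pre_hooks:
            fn(self)
        return self.weight * x

def make_merge_hook(delta):
    def hook(module):
        # Merge the network delta into the weight exactly once,
        # caching the result so repeated forwards don't re-apply it.
        if not getattr(module, "_merged", False):
            module.weight = module.weight + delta
            module._merged = True
    return hook
```

The once-only guard is exactly what prevents a multiplier from being applied twice across repeated forward passes.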
2023-10-22fix Blank line contains whitespaceavantcontra-1/+1
2023-10-22fix bug when using --gfpgan-models-pathavantcontra-5/+20
2023-10-21fix the situation with emphasis editing (aaaa:1.1) bbbb (cccc:1.1)AUTOMATIC1111-0/+6
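For reference, webui emphasis syntax multiplies attention for a span: `(text:1.1)` upweights `text` by 1.1. A simplified regex parser for flat, non-nested spans like the ones in the commit subject above (illustrative only; the real prompt parser handles nesting, escapes, and bare parentheses):

```python
import re

ATTENTION_RE = re.compile(r"\((.*?):([\d.]+)\)")

def parse_emphasis(prompt):
    # Return (text, weight) pairs for each "(text:weight)" span.
    return [(m.group(1), float(m.group(2)))
            for m in ATTENTION_RE.finditer(prompt)]
```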