Commit message | Author | Age | Files | Lines (-/+)
---|---|---|---|---
call apply_alpha_schedule_override in load_model_weights for #14979 | AUTOMATIC1111 | 2024-03-02 | 1 | -1/+2
Merge pull request #14979 from drhead/refiner_cumprod_fix (Protect alphas_cumprod during refiner switchover) | AUTOMATIC1111 | 2024-03-02 | 1 | -0/+32
Execute model_loaded_callback after moving to target device | Nuullll | 2024-01-06 | 1 | -3/+3
Merge pull request #14145 from drhead/zero-terminal-snr (Implement zero terminal SNR noise schedule option) | AUTOMATIC1111 | 2024-01-01 | 1 | -0/+6
fix variable | drhead | 2023-12-02 | 1 | -1/+1
Create alphas_cumprod_original on full precision path | drhead | 2023-12-02 | 1 | -0/+1
Protect alphas_cumprod from downcasting | drhead | 2023-11-29 | 1 | -0/+5
Fix forced reload | Kohaku-Blueleaf | 2023-12-06 | 1 | -1/+1
Ensure the cached weight will not be affected | Kohaku-Blueleaf | 2023-12-02 | 1 | -2/+2
Merge branch 'dev' into test-fp8 | Kohaku-Blueleaf | 2023-12-02 | 1 | -4/+13
Add support for SD 2.1 Turbo, by converting the state dict from SGM to LDM on load | MrCheeze | 2023-12-02 | 1 | -4/+13
Fix pre-fp8 | Kohaku-Blueleaf | 2023-11-25 | 1 | -1/+1
Option for using fp16 weight when apply lora | Kohaku-Blueleaf | 2023-11-21 | 1 | -3/+11
Use options instead of cmd_args | Kohaku-Blueleaf | 2023-11-19 | 1 | -29/+32
Merge branch 'dev' into test-fp8 | Kohaku-Blueleaf | 2023-11-16 | 1 | -1/+4
more changes for #13865: fix formatting, rename the function, add comment and add a readme entry | AUTOMATIC1111 | 2023-11-05 | 1 | -1/+1
linter | AUTOMATIC1111 | 2023-11-05 | 1 | -2/+2
Merge branch 'dev' into master | AUTOMATIC1111 | 2023-11-05 | 1 | -20/+30
Use devices.torch_gc() instead of empty_cache() | Ritesh Gangnani | 2023-11-05 | 1 | -1/+0
Add SSD-1B as a supported model | Ritesh Gangnani | 2023-11-05 | 1 | -2/+6
ManualCast for 10/16 series gpu | Kohaku-Blueleaf | 2023-10-28 | 1 | -9/+12
ignore mps for fp8 | Kohaku-Blueleaf | 2023-10-25 | 1 | -1/+3
Fix alphas cumprod | Kohaku-Blueleaf | 2023-10-25 | 1 | -1/+2
Fix alphas_cumprod dtype | Kohaku-Blueleaf | 2023-10-25 | 1 | -0/+1
fp8 for TE | Kohaku-Blueleaf | 2023-10-25 | 1 | -0/+7
Fix lint | Kohaku-Blueleaf | 2023-10-23 | 1 | -1/+1
Add CPU fp8 support (norm layers need fp32, so only the linear operation layers (Conv2d/Linear) are converted; the text encoder also uses some PyTorch functions that do not support bf16 autocast on CPU, so a condition was added to indicate whether the autocast is for the unet) | Kohaku-Blueleaf | 2023-10-23 | 1 | -4/+16
Add sdxl only arg | Kohaku-Blueleaf | 2023-10-19 | 1 | -0/+3
Add fp8 for sd unet | Kohaku-Blueleaf | 2023-10-19 | 1 | -0/+3
repair unload sd checkpoint button | AUTOMATIC1111 | 2023-10-15 | 1 | -12/+1
use shallow copy for #13535 | AUTOMATIC1111 | 2023-10-14 | 1 | -2/+1
Merge pull request #13535 from chu8129/dev (fix: checkpoints_loaded:{checkpoint:state_dict}, model.load_state_dict issue in dict value empty) | AUTOMATIC1111 | 2023-10-14 | 1 | -4/+5
reverst | wangqiuwen | 2023-10-07 | 1 | -0/+2
up | wangqiuwen | 2023-10-07 | 1 | -6/+5
Merge pull request #13139 from AUTOMATIC1111/ckpt-dir-path-separator (fix `--ckpt-dir` path separator and option use `short name` for checkpoint dropdown) | AUTOMATIC1111 | 2023-09-30 | 1 | -2/+3
parsing string to path | w-e-w | 2023-09-08 | 1 | -2/+3
keep order in list of checkpoints when loading model that doesn't have a checksum | AUTOMATIC1111 | 2023-08-30 | 1 | -1/+21
add missing import, simplify code, use patches module for #13276 | AUTOMATIC1111 | 2023-09-30 | 1 | -7/+12
Merge pull request #13276 from woweenie/patch-1 (patch DDPM.register_betas so that users can put given_betas in model yaml) | AUTOMATIC1111 | 2023-09-30 | 1 | -1/+14
patch DDPM.register_betas so that users can put given_betas in model yaml | woweenie | 2023-09-15 | 1 | -1/+14
fix | 王秋文/qwwang | 2023-09-18 | 1 | -0/+1
use dict[key]=model; did not update orderdict order, should use move to end | qiuwen.wang | 2023-09-15 | 1 | -0/+1
keep order in list of checkpoints when loading model that doesn't have a checksum | AUTOMATIC1111 | 2023-08-30 | 1 | -1/+21
set devices.dtype_unet correctly | AUTOMATIC1111 | 2023-08-23 | 1 | -1/+2
add --medvram-sdxl | AUTOMATIC1111 | 2023-08-22 | 1 | -8/+8
Fix for consistency with shared.opts.sd_vae of UI | Uminosachi | 2023-08-21 | 1 | -0/+1
Change where VAE state are stored in model | Uminosachi | 2023-08-20 | 1 | -2/+0
Change to access sd_model attribute with dot | Uminosachi | 2023-08-20 | 1 | -2/+2
Store base_vae and loaded_vae_file in sd_model | Uminosachi | 2023-08-20 | 1 | -16/+8
Fix SD VAE switch error after model reuse | Uminosachi | 2023-08-20 | 1 | -2/+20
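The "Add CPU fp8 support" and "ManualCast" entries above describe casting only the matmul-heavy layers (Linear/Conv2d) to fp8 storage while leaving normalization layers in higher precision. Below is a minimal sketch of that idea, assuming a PyTorch build that provides `torch.float8_e4m3fn` (2.1 or later); the function name is illustrative and not the repository's actual API.

```python
import torch
import torch.nn as nn


def cast_linear_layers_to_fp8(model: nn.Module) -> nn.Module:
    """Store only Linear/Conv2d weights in fp8; norm layers keep their dtype."""
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            # float8_e4m3fn is used here only as a storage format; the weights
            # must be upcast (e.g. by a manual-cast wrapper) before the matmul.
            module.to(torch.float8_e4m3fn)
    return model
```

Weights stored this way still have to be upcast back to fp16/bf16 at compute time, which is the role of the manual-cast patching referenced in the log, since most kernels do not accept fp8 inputs directly.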