Age | Commit message | Author | Lines (-/+)
---|---|---|---
2022-10-08 | Merge pull request #1851 from C43H66N12O12S2/flash (xformers attention) | AUTOMATIC1111 | -1/+37
2022-10-08 | update sd_hijack_opt to respect new env variables | C43H66N12O12S2 | -3/+8
2022-10-08 | Update sd_hijack_optimizations.py | C43H66N12O12S2 | -1/+1
2022-10-08 | add xformers attnblock and hypernetwork support | C43H66N12O12S2 | -2/+18
2022-10-08 | Add hypernetwork support to split_cross_attention_forward_v1; fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device | brkirch | -4/+14
2022-10-08 | switch to the proper way of calling xformers | C43H66N12O12S2 | -25/+3
2022-10-07 | added support for hypernetworks (???) | AUTOMATIC | -2/+15
2022-10-07 | add xformers attention | C43H66N12O12S2 | -1/+38
2022-10-02 | Merge branch 'master' into stable | Jairo Correa | -0/+156
2022-10-02 | initial support for training textual inversion | AUTOMATIC | -0/+164
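Several of the 2022-10-07/08 commits above route the webui's cross-attention through xformers' memory-efficient kernel. The snippet below is a minimal sketch of the kind of call involved, not code from this repository: it assumes torch and xformers are installed with a CUDA GPU available, and the function name and tensor sizes are illustrative only.

```python
# Minimal sketch, assuming xformers and torch are installed (not the repo's code).
import torch
import xformers.ops


def memory_efficient_cross_attention(q, k, v):
    # q, k, v: (batch, tokens, heads, head_dim). xformers picks a fused kernel
    # and avoids materializing the full attention matrix, which is where the
    # memory savings over plain softmax(QK^T)V come from.
    return xformers.ops.memory_efficient_attention(q, k, v, attn_bias=None)


if __name__ == "__main__":
    # Illustrative cross-attention sizes: 4096 image tokens, 77 text tokens.
    b, h, d = 2, 8, 40
    q = torch.randn(b, 4096, h, d, device="cuda", dtype=torch.float16)
    k = torch.randn(b, 77, h, d, device="cuda", dtype=torch.float16)
    v = torch.randn(b, 77, h, d, device="cuda", dtype=torch.float16)
    out = memory_efficient_cross_attention(q, k, v)
    print(out.shape)  # torch.Size([2, 4096, 8, 40]), same shape as q
```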