| Commit message | Author | Age | Files | Lines |
|---|---|---|---|---|
| Remove wrong self reference in CUDA support for invokeai | Cheka | 2022-10-19 | 1 | -1/+1 |
| Update sd_hijack_optimizations.py | C43H66N12O12S2 | 2022-10-18 | 1 | -0/+3 |
| readd xformers attnblock | C43H66N12O12S2 | 2022-10-18 | 1 | -0/+15 |
| delete xformers attnblock | C43H66N12O12S2 | 2022-10-18 | 1 | -12/+0 |
| Use apply_hypernetwork function | brkirch | 2022-10-11 | 1 | -10/+4 |
| Add InvokeAI and lstein to credits, add back CUDA support | brkirch | 2022-10-11 | 1 | -0/+13 |
| Add check for psutil | brkirch | 2022-10-11 | 1 | -4/+15 |
| Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS); add command line option for it; make it default when CUDA is unavailable | brkirch | 2022-10-11 | 1 | -0/+79 |
| rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have | AUTOMATIC | 2022-10-11 | 1 | -1/+1 |
| fixes related to merge | AUTOMATIC | 2022-10-11 | 1 | -1/+2 |
| replace duplicate code with a function | AUTOMATIC | 2022-10-11 | 1 | -29/+15 |
| remove functorch | C43H66N12O12S2 | 2022-10-10 | 1 | -2/+0 |
| Fix VRAM Issue by only loading in hypernetwork when selected in settings | Fampai | 2022-10-09 | 1 | -3/+3 |
| make --force-enable-xformers work without needing --xformers | AUTOMATIC | 2022-10-08 | 1 | -1/+1 |
| add fallback for xformers_attnblock_forward | AUTOMATIC | 2022-10-08 | 1 | -1/+4 |
| simplify xfrmers options: --xformers to enable and that's it | AUTOMATIC | 2022-10-08 | 1 | -7/+13 |
| emergency fix for xformers (continue + shared) | AUTOMATIC | 2022-10-08 | 1 | -8/+8 |
| Merge pull request #1851 from C43H66N12O12S2/flash (xformers attention) | AUTOMATIC1111 | 2022-10-08 | 1 | -1/+37 |
| update sd_hijack_opt to respect new env variables | C43H66N12O12S2 | 2022-10-08 | 1 | -3/+8 |
| Update sd_hijack_optimizations.py | C43H66N12O12S2 | 2022-10-08 | 1 | -1/+1 |
| add xformers attnblock and hypernetwork support | C43H66N12O12S2 | 2022-10-08 | 1 | -2/+18 |
| switch to the proper way of calling xformers | C43H66N12O12S2 | 2022-10-08 | 1 | -25/+3 |
| add xformers attention | C43H66N12O12S2 | 2022-10-07 | 1 | -1/+38 |
| Add hypernetwork support to split cross attention v1 (split_cross_attention_forward_v1); fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device | brkirch | 2022-10-08 | 1 | -4/+14 |
| added support for hypernetworks (???) | AUTOMATIC | 2022-10-07 | 1 | -2/+15 |
| Merge branch 'master' into stable | Jairo Correa | 2022-10-02 | 1 | -8/+0 |
| initial support for training textual inversion | AUTOMATIC | 2022-10-02 | 1 | -0/+164 |
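Several of the commits above concern memory-saving variants of cross-attention (the InvokeAI optimization, `split_cross_attention_forward_v1`). As a rough illustration of the underlying idea only, here is a minimal NumPy sketch of sliced attention: the query rows are processed in slices so the full score matrix is never materialized at once. Function names here are illustrative and are not the repository's actual code.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(q, k, v):
    # reference: materializes the whole (Nq, Nk) score matrix
    scale = q.shape[-1] ** -0.5
    return softmax((q @ k.T) * scale) @ v

def sliced_attention(q, k, v, slice_size=2):
    # process query rows in slices: only a (slice_size, Nk)
    # score matrix exists at any one time
    scale = q.shape[-1] ** -0.5
    chunks = []
    for i in range(0, q.shape[0], slice_size):
        scores = softmax((q[i:i + slice_size] @ k.T) * scale)
        chunks.append(scores @ v)
    return np.vstack(chunks)
```

The sliced version is mathematically identical to the full computation; it trades a loop over query slices for a peak-memory reduction from O(Nq·Nk) to O(slice_size·Nk), which is the trade-off these commits tune.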