path: root/modules/sub_quadratic_attention.py
Commit history for this file (message, author, date, files changed, lines -/+):
* Add UI setting for upcasting attention to float32 (brkirch, 2023-01-25, 1 file changed, -2/+2)

  Adds an "Upcast cross attention layer to float32" option to the Stable Diffusion settings. This allows generating images with SD 2.1 models without --no-half or xFormers. To make upcasting possible in the cross attention layer optimizations, several sections of code in sd_hijack_optimizations.py had to be indented so that a context manager could be used to disable autocast. Also, even though Stable Diffusion (and Diffusers) upcast only q and k, in practice most of the cross attention layer optimizations could not function unless v was upcast as well.
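  A minimal sketch of the upcasting idea described above, assuming a plain scaled-dot-product attention; the helper name and shapes are illustrative, not the repo's actual code. q, k, and v are cast to float32 and the attention math runs with autocast disabled, then the result is cast back to the caller's dtype:

      import torch

      def attention_float32(q, k, v):
          # Hypothetical helper: upcast q, k, and v to float32 and disable
          # autocast so the matmuls really run in full precision, then cast
          # the result back to the original (possibly half-precision) dtype.
          out_dtype = q.dtype
          with torch.autocast(device_type=q.device.type, enabled=False):
              q, k, v = q.float(), k.float(), v.float()
              scale = q.shape[-1] ** -0.5
              attn = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1)
              out = attn @ v
          return out.to(out_dtype)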
* Remove fallback for Protocol import and remove instances of Protocol in code (AUTOMATIC, 2023-01-09, 1 file changed, -8/+11)

  Also adds some whitespace between functions, in line with other code in the repo.
* Add fallback for Protocol import (ProGamerGov, 2023-01-07, 1 file changed, -1/+7)
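  For context, a fallback for typing.Protocol (which only exists on Python 3.8+) typically looks like the try/except chain below; this is a generic sketch of the pattern, not necessarily the exact code that was added here and removed two days later:

      try:
          from typing import Protocol  # Python 3.8+
      except ImportError:
          try:
              from typing_extensions import Protocol  # backport package
          except ImportError:
              class Protocol:  # minimal stub so class definitions still parse
                  pass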
* Added license (brkirch, 2023-01-06, 1 file changed, -1/+1)
* Use narrow instead of dynamic_slice (brkirch, 2023-01-06, 1 file changed, -15/+19)
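  torch.narrow(dim, start, length) returns a view into the tensor without copying, which is presumably why it replaced the dynamic_slice helper (a name reminiscent of jax.lax.dynamic_slice in the JAX code this implementation derives from). A small illustration with made-up data:

      import torch

      x = torch.arange(12).reshape(3, 4)

      # narrow(dim=1, start=1, length=2) is a view equivalent to x[:, 1:3];
      # no data is copied.
      view = x.narrow(1, 1, 2)
      assert torch.equal(view, x[:, 1:3])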
* Add Birch-san's sub-quadratic attention implementation (brkirch, 2023-01-06, 1 file changed, -0/+201)
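  This module implements the memory-efficient attention algorithm described by Rabe and Staats ("Self-attention Does Not Need O(n²) Memory"), as ported to PyTorch by Birch-san. The core idea, sketched below with illustrative names (chunked_attention and kv_chunk are not the module's real API), is to process keys and values in chunks while maintaining a running log-sum-exp, so the full n_q × n_k score matrix is never materialized:

      import torch

      def chunked_attention(q, k, v, kv_chunk=1024):
          # Sketch of sub-quadratic attention: iterate over key/value chunks,
          # keeping a running log-sum-exp so the softmax stays numerically
          # stable without ever building the full [n_q, n_k] score matrix.
          scale = q.shape[-1] ** -0.5
          out = torch.zeros_like(q)                          # running weighted sum of v
          lse = torch.full(q.shape[:-1], float("-inf"),
                           dtype=q.dtype, device=q.device)   # running log-sum-exp
          for i in range(0, k.shape[-2], kv_chunk):
              k_c = k[..., i:i + kv_chunk, :]
              v_c = v[..., i:i + kv_chunk, :]
              s = q @ k_c.transpose(-2, -1) * scale          # [..., n_q, chunk]
              chunk_lse = torch.logsumexp(s, dim=-1)         # [..., n_q]
              new_lse = torch.logaddexp(lse, chunk_lse)
              # Rescale what we have so far, then fold in this chunk's output.
              out = (out * torch.exp(lse - new_lse).unsqueeze(-1)
                     + (torch.softmax(s, dim=-1) @ v_c)
                     * torch.exp(chunk_lse - new_lse).unsqueeze(-1))
              lse = new_lse
          return out

  A full implementation typically also chunks over the queries to bound memory further; this sketch chunks only keys and values for brevity.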