path: root/modules/sd_hijack_optimizations.py
Commit message  Author  Age  Files  Lines
...
* Fix VRAM Issue by only loading in hypernetwork when selected in settings  Fampai  2022-10-09  1  -3/+3
|
* make --force-enable-xformers work without needing --xformers  AUTOMATIC  2022-10-08  1  -1/+1
|
* add fallback for xformers_attnblock_forward  AUTOMATIC  2022-10-08  1  -1/+4
|
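
The fallback commit above suggests a guard around the xformers path: if xformers has no kernel for the given input (unsupported shape or dtype, it signals this with NotImplementedError), the code drops back to the non-xformers implementation. A minimal sketch of that pattern, with cross_attention_attnblock_forward standing in for the existing plain path (helper name and tensor layout are assumptions from the commit message, not taken from the diff):

    import xformers.ops
    from einops import rearrange

    def xformers_attnblock_forward(self, x):
        # Try the memory-efficient xformers kernel first; if xformers has no
        # operator for this input (shape/dtype), fall back to the plain path.
        try:
            h_ = self.norm(x)
            q, k, v = self.q(h_), self.k(h_), self.v(h_)
            b, c, h, w = q.shape
            # xformers expects (batch, tokens, channels), not NCHW feature maps
            q, k, v = (rearrange(t, 'b c h w -> b (h w) c').contiguous() for t in (q, k, v))
            out = xformers.ops.memory_efficient_attention(q, k, v)
            out = rearrange(out, 'b (h w) c -> b c h w', h=h)
            return x + self.proj_out(out)
        except NotImplementedError:
            # Non-xformers implementation (assumed helper name)
            return cross_attention_attnblock_forward(self, x)
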
* simplify xformers options: --xformers to enable and that's it  AUTOMATIC  2022-10-08  1  -7/+13
|
* emergency fix for xformers (continue + shared)  AUTOMATIC  2022-10-08  1  -8/+8
|
* Merge pull request #1851 from C43H66N12O12S2/flash  AUTOMATIC1111  2022-10-08  1  -1/+37
|\
| | xformers attention
| * update sd_hijack_opt to respect new env variables  C43H66N12O12S2  2022-10-08  1  -3/+8
| |
| * Update sd_hijack_optimizations.py  C43H66N12O12S2  2022-10-08  1  -1/+1
| |
| * add xformers attnblock and hypernetwork support  C43H66N12O12S2  2022-10-08  1  -2/+18
| |
| * switch to the proper way of calling xformers  C43H66N12O12S2  2022-10-08  1  -25/+3
| |
| * add xformers attention  C43H66N12O12S2  2022-10-07  1  -1/+38
| |
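
The -25/+3 diffstat on "switch to the proper way of calling xformers" is consistent with replacing a hand-rolled kernel dispatch with the single high-level entry point xformers exposes, xformers.ops.memory_efficient_attention, which selects a backend on its own. A sketch of a cross-attention forward built on that call (projection and layout details are assumptions, not read from the diff):

    import xformers.ops
    from einops import rearrange

    def xformers_attention_forward(self, x, context=None):
        # Project q/k/v, hand them to the one high-level entry point, and
        # let xformers pick the best available kernel by itself.
        q = self.to_q(x)
        context = x if context is None else context  # self-attention when no context
        k = self.to_k(context)
        v = self.to_v(context)
        q, k, v = (rearrange(t, 'b n (h d) -> b n h d', h=self.heads) for t in (q, k, v))
        out = xformers.ops.memory_efficient_attention(q, k, v, attn_bias=None)
        out = rearrange(out, 'b n h d -> b n (h d)')
        return self.to_out(out)
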
* | Add hypernetwork support to split cross attention v1  brkirch  2022-10-08  1  -4/+14
| | * Add hypernetwork support to split_cross_attention_forward_v1
| | * Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
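
The hypernetwork commits in this stretch of the log all share one idea: a loaded hypernetwork supplies a pair of small modules, keyed by the context embedding width, that transform the context before the key and value projections see it. A sketch of that hook (the lookup structure and function name are assumptions inferred from the commit messages):

    def hypernetwork_k_v(context, hypernetwork, to_k, to_v):
        # A loaded hypernetwork maps an embedding width to a pair of small
        # modules that transform the context before the k/v projections.
        layers = hypernetwork.layers.get(context.shape[2]) if hypernetwork is not None else None
        if layers is None:
            return to_k(context), to_v(context)
        return to_k(layers[0](context)), to_v(layers[1](context))
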
* | added support for hypernetworks (???)  AUTOMATIC  2022-10-07  1  -2/+15
|/
* Merge branch 'master' into stable  Jairo Correa  2022-10-02  1  -8/+0
|
* initial support for training textual inversion  AUTOMATIC  2022-10-02  1  -0/+164
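
That last commit is where the file this log tracks was created. The "split cross attention" the later commits refer to works by slicing the computation along the query dimension so the full (tokens x tokens) score matrix is never materialized at once. A self-contained sketch of query-chunked attention (chunk size and names are illustrative, not lifted from the original file):

    import torch

    def chunked_attention(q, k, v, chunk_size=1024, scale=None):
        # q, k, v: (batch, seq_len, dim). Process queries in slices so the
        # (seq_len x seq_len) score matrix is built chunk by chunk.
        scale = q.shape[-1] ** -0.5 if scale is None else scale
        out = torch.empty_like(q)
        for i in range(0, q.shape[1], chunk_size):
            s = torch.einsum('b i d, b j d -> b i j', q[:, i:i + chunk_size] * scale, k)
            s = s.softmax(dim=-1)
            out[:, i:i + chunk_size] = torch.einsum('b i j, b j d -> b i d', s, v)
        return out

Smaller chunks lower the peak memory of the score matrix at the cost of more kernel launches; picking chunk_size is the VRAM/speed trade-off the commit messages above keep circling around.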