Commit message | Author | Date | Files | Lines
---|---|---|---|---
added torch.mps.empty_cache() to torch_gc(); changed a bunch of places that use torch.cuda.empty_cache() to use torch_gc() instead *(see the first sketch below)* | AUTOMATIC1111 | 2023-07-08 | 1 | -1/+1
Merge pull request #11046 from akx/ded-code: Remove a bunch of unused/vestigial code | AUTOMATIC1111 | 2023-06-27 | 1 | -4/+0
Remove a bunch of unused/vestigial code, as found by Vulture and some eyes | Aarni Koskela | 2023-06-05 | 1 | -4/+0
Use os.makedirs(..., exist_ok=True) | Aarni Koskela | 2023-06-13 | 1 | -3/+1
rename print_error to report, use it with together with package name | AUTOMATIC | 2023-05-31 | 1 | -4/+3
Add & use modules.errors.print_error where currently printing exception info by hand | Aarni Koskela | 2023-05-29 | 1 | -6/+4
fixes for B007 | AUTOMATIC | 2023-05-10 | 1 | -1/+1
imports cleanup for ruff | AUTOMATIC | 2023-05-10 | 1 | -3/+1
Filter out temporary files that will be generated if the download fails. | Tpinion | 2023-02-23 | 1 | -1/+1
clean up unused script_path imports | Max Audron | 2023-01-27 | 1 | -1/+1
Set device for facelib/facexlib and gfpgan: FaceXLib/FaceLib doesn't pass the device argument to RetinaFace but instead chooses one itself and sets it to a global, so using a device other than its internally chosen default requires manually replacing that default value; the GFPGAN constructor needs the device argument to work with MPS or a CUDA device ID that differs from the default | brkirch | 2022-11-12 | 1 | -0/+3
send all three of GFPGAN's and codeformer's models to CPU memory instead of just one for #1283 | AUTOMATIC | 2022-10-04 | 1 | -2/+10
fix for broken codeformer in PR | AUTOMATIC | 2022-09-30 | 1 | -1/+1
Cleanup existing directories, fixes | d8ahazard | 2022-09-26 | 1 | -11/+4
Re-implement universal model loading | d8ahazard | 2022-09-26 | 1 | -10/+25
Removed stray references to shared.device_codeformer. | Elias Oenal | 2022-09-14 | 1 | -4/+2
fix for codeformer switching torch devices on metal systems. | Elias Oenal | 2022-09-14 | 1 | -2/+2
Codeformer face restoration not working: AttributeError: module 'modules.shared' has no attribute 'device_codeformer' #348 | AUTOMATIC | 2022-09-12 | 1 | -3/+3
Merge pull request #294 from EliasOenal/master: Fixes for mps/Metal: use of seeds, img2img, CodeFormer | AUTOMATIC1111 | 2022-09-12 | 1 | -3/+5
Refactored Metal/mps fixes. | Elias Oenal | 2022-09-12 | 1 | -15/+6
CodeFormer does not support mps/metal backend, implemented fallback to cpu backend *(see the second sketch below)* | Elias Oenal | 2022-09-11 | 1 | -4/+15
Instance of CUDA out of memory on a low-res batch, even with --opt-split-attention-v1 (found cause) #255 | AUTOMATIC | 2022-09-12 | 1 | -14/+18
undo CodeFormer's upscaling of images with dimensions less than 512. | AUTOMATIC | 2022-09-10 | 1 | -0/+8
a little bit of rework for extras tab | AUTOMATIC | 2022-09-07 | 1 | -3/+7
codeformer support | AUTOMATIC | 2022-09-07 | 1 | -0/+108
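The 2023-07-08 commit at the top of this log folds MPS cache clearing into the shared torch_gc() helper, so call sites stop invoking torch.cuda.empty_cache() directly. The snippet below is a minimal sketch of that pattern, not the project's exact implementation; the function name follows the commit message, and the version guards are assumptions about when the PyTorch APIs became available.

```python
import torch


def torch_gc():
    """Release cached allocator memory on whichever backend is in use."""
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
        torch.cuda.ipc_collect()

    # torch.mps.empty_cache() only exists in newer PyTorch builds with MPS support.
    if hasattr(torch, "mps") and torch.backends.mps.is_available():
        torch.mps.empty_cache()
```

Routing every cache flush through one helper like this is what lets a single one-line change (adding the MPS branch) cover all of the places that previously called torch.cuda.empty_cache() by hand.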
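Several of the 2022-09 commits (#294, #348) deal with CodeFormer not running on the mps/Metal backend and falling back to CPU, tracked through a device_codeformer attribute. A rough sketch of that kind of one-time device selection is below; the function name is illustrative and the logic is an assumption about the fallback described in the commit messages, not the webui's exact code.

```python
import torch


def select_codeformer_device():
    # At the time of these commits CodeFormer hit unsupported-operator errors on
    # the mps/Metal backend, so Apple Silicon machines fall back to CPU while
    # CUDA systems keep using the GPU.
    if hasattr(torch.backends, "mps") and torch.backends.mps.is_available():
        return torch.device("cpu")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")


device_codeformer = select_codeformer_device()
```

Computing the device once and storing it on a shared attribute is also what the later fixes rely on: the AttributeError in #348 came from code reading that attribute before it was defined, and the 2022-09-14 commits removed the stray references to it.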