| Commit message | Author | Age | Files | Lines |
|---|---|---|---|---|
| added torch.mps.empty_cache() to torch_gc(); changed a bunch of places that use torch.cuda.empty_cache() to use torch_gc() instead | AUTOMATIC1111 | 2023-07-08 | 1 | -5/+3 |
| Upscaler.load_model: don't return None, just use exceptions | Aarni Koskela | 2023-06-13 | 1 | -8/+5 |
| Move `load_file_from_url` to modelloader | Aarni Koskela | 2023-06-13 | 1 | -4/+3 |
| Simplify a bunch of `len(x) > 0`/`len(x) == 0` style expressions | Aarni Koskela | 2023-06-02 | 2 | -3/+4 |
| rename print_error to report, use it together with package name | AUTOMATIC | 2023-05-31 | 1 | -3/+2 |
| Merge branch 'dev' into report-error | AUTOMATIC1111 | 2023-05-31 | 2 | -1/+148 |
| Vendor in the single module used from taming_transformers; remove taming_transformers dependency (and fix the two ruff complaints) | Aarni Koskela | 2023-05-30 | 2 | -1/+148 |
| Add & use modules.errors.print_error where currently printing exception info by hand | Aarni Koskela | 2023-05-29 | 1 | -5/+2 |
| change upscalers to download models into user-specified directory (from commandline args) rather than the default models/<...> | AUTOMATIC | 2023-05-19 | 1 | -2/+2 |
| Autofix Ruff W (not W605) (mostly whitespace) | Aarni Koskela | 2023-05-11 | 2 | -5/+5 |
| manual fixes for some C408 | AUTOMATIC | 2023-05-10 | 3 | -7/+7 |
| fixes for B007 | AUTOMATIC | 2023-05-10 | 1 | -1/+1 |
| ruff manual fixes | AUTOMATIC | 2023-05-10 | 2 | -12/+12 |
| ruff auto fixes | AUTOMATIC | 2023-05-10 | 2 | -8/+8 |
| F401 fixes for ruff | AUTOMATIC | 2023-05-10 | 1 | -2/+2 |
| manual fixes for ruff | AUTOMATIC | 2023-05-10 | 4 | -20/+21 |
| autofixes from ruff | AUTOMATIC | 2023-05-10 | 2 | -2/+1 |
| fix `--ldsr-models-path` not working | hitomi | 2023-04-04 | 1 | -7/+13 |
| add option to show/hide warnings; removed hiding warnings from LDSR; fixed/reworked a few places that produced warnings | AUTOMATIC | 2023-01-18 | 1 | -3/+0 |
| fix F541 f-string without any placeholders | Yuval Aboulafia | 2022-12-24 | 1 | -1/+1 |
| Add safetensors support to LDSR | wywywywy | 2022-12-10 | 2 | -4/+14 |
| Made device agnostic | wywywywy | 2022-12-10 | 1 | -3/+6 |
| LDSR cache / optimization / opt_channelslast | wywywywy | 2022-12-10 | 2 | -12/+29 |
| Reinstate DDPM V1 to LDSR | wywywywy | 2022-12-04 | 3 | -1/+1451 |
| move #5216 to the extension | AUTOMATIC | 2022-12-03 | 2 | -0/+287 |
| add built-in extension system; add support for adding upscalers in extensions; move LDSR, ScuNET and SwinIR to built-in extensions | AUTOMATIC | 2022-12-03 | 3 | -0/+299 |
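The first commit in the log above routes all cache clearing through a single `torch_gc()` helper so that both CUDA and Apple-Silicon (MPS) backends get their cached allocations released. A minimal sketch of that pattern is below; the function body and its return value are illustrative assumptions, not the project's exact code:

```python
# Hedged sketch of a device-agnostic cache-clearing helper in the spirit of the
# torch_gc() change logged above. Names and the returned list are illustrative.
import gc


def torch_gc():
    """Run Python GC, then free cached accelerator memory on whichever backend exists."""
    actions = ["gc"]
    gc.collect()  # reclaim Python-side references first

    try:
        import torch
    except ImportError:
        return actions  # torch not installed: nothing more to do

    if torch.cuda.is_available():
        torch.cuda.empty_cache()   # release cached CUDA allocations back to the driver
        torch.cuda.ipc_collect()   # clean up CUDA IPC shared-memory handles
        actions.append("cuda")

    # torch.mps.empty_cache() exists in torch >= 2.0; guard for older builds.
    mps_backend = getattr(torch.backends, "mps", None)
    if mps_backend is not None and mps_backend.is_available() and hasattr(torch, "mps"):
        torch.mps.empty_cache()    # release cached MPS (Apple Silicon) allocations
        actions.append("mps")

    return actions
```

The point of the refactor is that call sites no longer branch on backend themselves; they call `torch_gc()` unconditionally and the helper decides which `empty_cache()` variant applies.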