megvii-research/nafnet

Nonlinear Activation Free Network for Image Restoration

  • Public
  • 1.3M runs
  • GitHub
  • Paper
  • License

Run time and cost

This model costs approximately $0.0078 per run on Replicate (about 128 runs per $1), though this varies depending on your inputs. It is also open source, so you can run it on your own computer with Docker.

This model runs on Nvidia T4 GPU hardware. Predictions typically complete within 35 seconds, though predict time varies significantly based on the inputs.
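For reference, here is a minimal sketch of calling the model through the Replicate Python client. The input field name (image) and the unversioned model reference are assumptions for illustration; check the model's API schema on Replicate for the actual inputs and the version to pin.

import replicate

# Hypothetical input name -- consult the model's schema for the real fields.
output = replicate.run(
    "megvii-research/nafnet",  # may require an explicit ":<version>" suffix
    input={"image": open("blurry.png", "rb")},
)
print(output)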

Readme

NAFNet: Nonlinear Activation Free Network for Image Restoration

The official PyTorch implementation of the paper Simple Baselines for Image Restoration.

Liangyu Chen*, Xiaojie Chu*, Xiangyu Zhang, Jian Sun

Although there have been significant advances in the field of image restoration recently, the system complexity of state-of-the-art (SOTA) methods is increasing as well, which may hinder the convenient analysis and comparison of methods. In this paper, we propose a simple baseline that exceeds the SOTA methods and is computationally efficient. To further simplify the baseline, we reveal that nonlinear activation functions, e.g. Sigmoid, ReLU, GELU, Softmax, etc., are not necessary: they could be replaced by multiplication or removed. Thus, we derive a Nonlinear Activation Free Network, namely NAFNet, from the baseline. SOTA results are achieved on various challenging benchmarks, e.g. 33.69 dB PSNR on GoPro (for image deblurring), exceeding the previous SOTA by 0.38 dB with only 8.4% of its computational costs; 40.30 dB PSNR on SIDD (for image denoising), exceeding the previous SOTA by 0.28 dB with less than half of its computational costs.
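The central simplification, replacing nonlinear activations with a plain multiplication, is realized in the paper as a gating operation (SimpleGate) that splits the feature map along the channel dimension and multiplies the two halves. A minimal PyTorch sketch of that idea, written for illustration rather than taken from the official implementation:

import torch
import torch.nn as nn

class SimpleGate(nn.Module):
    # Split the channels into two halves and multiply them element-wise,
    # taking the place of an activation function such as GELU.
    def forward(self, x):
        x1, x2 = x.chunk(2, dim=1)
        return x1 * x2

x = torch.randn(1, 64, 128, 128)   # one 64-channel feature map
y = SimpleGate()(x)
print(y.shape)                     # torch.Size([1, 32, 128, 128])

Note that the output has half as many channels as the input, so the surrounding convolutions in the network account for this reduction.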

Figure: PSNR vs. MACs comparison.

Citations

If NAFNet helps your research or work, please consider citing it:

@article{chen2022simple,
  title={Simple Baselines for Image Restoration},
  author={Chen, Liangyu and Chu, Xiaojie and Zhang, Xiangyu and Sun, Jian},
  journal={arXiv preprint arXiv:2204.04676},
  year={2022}
}