
Quantized Generative Models for Solving Inverse Problems

Kumar Reddy, NK and Killedar, V and Seelamantula, CS (2023) Quantized Generative Models for Solving Inverse Problems. In: Proceedings - 2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW 2023), pp. 1520-1525.

PDF (Published Version): CVE_Int_Con_Com_Vis_Wor_2023.pdf (1 MB)
Official URL: https://doi.org/10.1109/ICCVW60793.2023.00167

Abstract

Generative priors have been shown to be highly successful in solving inverse problems. In this paper, we consider quantized generative models, i.e., generative models whose network weights come from a learnt finite alphabet. Quantized neural networks are efficient in terms of memory and computation, and are ideally suited for deployment on low-precision hardware. We solve non-linear inverse problems using quantized generative models. We introduce a new meta-learning framework that makes use of proximal operators and jointly optimizes the quantized weights of the generative model, the parameters of the sensing network, and the latent-space representation. Experimental validation is carried out on standard datasets: MNIST, CIFAR10, SVHN, and STL10. The results show that the performance of 32-bit networks can be achieved using 4-bit networks. The performance of 1-bit networks is about 0.7 to 2 dB inferior, while reducing the model size significantly (32×). © 2023 IEEE.
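To make the idea concrete, the following is a minimal, self-contained sketch (not the authors' implementation) of two ingredients the abstract describes: projecting generator weights onto a learnt finite alphabet (the proximal step for a finite-alphabet constraint) and recovering a signal from compressive measurements by optimizing the latent code of the generator. The toy generator, sensing matrix A, alphabet values, and all function names are assumptions introduced only for illustration.

```python
# Illustrative sketch only: finite-alphabet weight quantization plus
# latent-space recovery for a linear inverse problem y = A x + noise.
import numpy as np

rng = np.random.default_rng(0)

def quantize_to_alphabet(w, alphabet):
    """Proximal step for the finite-alphabet constraint:
    map each weight to its nearest codeword in the alphabet."""
    alphabet = np.asarray(alphabet)
    idx = np.argmin(np.abs(w[..., None] - alphabet), axis=-1)
    return alphabet[idx]

# Toy two-layer generator G: latent z (k-dim) -> signal x (n-dim),
# with ternary-quantized weights (hypothetical alphabet {-0.5, 0, 0.5}).
k, n, hidden = 8, 64, 32
W1 = quantize_to_alphabet(rng.standard_normal((hidden, k)), [-0.5, 0.0, 0.5])
W2 = quantize_to_alphabet(rng.standard_normal((n, hidden)), [-0.5, 0.0, 0.5])

def G(z):
    return W2 @ np.tanh(W1 @ z)

# Linear sensing operator and noisy measurements of an unknown signal G(z_true).
m = 32
A = rng.standard_normal((m, n)) / np.sqrt(m)
z_true = rng.standard_normal(k)
y = A @ G(z_true) + 0.01 * rng.standard_normal(m)

def recover_latent(y, steps=500, lr=0.05):
    """Minimize 0.5 * ||A G(z) - y||^2 over z with a simple
    numerical-gradient descent (stand-in for backpropagation)."""
    z = np.zeros(k)
    def loss(z):
        r = A @ G(z) - y
        return 0.5 * float(r @ r)
    eps = 1e-4
    for _ in range(steps):
        g = np.array([(loss(z + eps * e) - loss(z - eps * e)) / (2 * eps)
                      for e in np.eye(k)])
        z -= lr * g
    return z

z_hat = recover_latent(y)
print("reconstruction error:", np.linalg.norm(G(z_hat) - G(z_true)))
```

In the paper's framework, the quantization step, the sensing-network parameters, and the latent code are optimized jointly via meta-learning; the sketch above only illustrates the individual building blocks.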

Item Type: Conference Paper
Publication: Proceedings - 2023 IEEE/CVF International Conference on Computer Vision Workshops, ICCVW 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Additional Information: The copyright for this article belongs to the publisher.
Department/Centre: Others
Date Deposited: 01 Mar 2024 10:08
Last Modified: 01 Mar 2024 10:08
URI: https://eprints.iisc.ac.in/id/eprint/84039
