The analysis of compression effects in generative adversarial networks
(GANs) after training, i.e. without any fine-tuning, remains an unstudied,
albeit important, topic, given the ever-growing computation and memory
requirements of these models. While existing works discuss the difficulty of
compressing GANs during training, calling for novel methods designed with the
instability of GAN training in mind, we show that existing compression methods
(namely clipping and quantization) may be applied directly to compress GANs
post-training, without any further changes. High compression levels may
distort the generated set, likely increasing the number of outliers, which can
negatively affect the overall assessment produced by existing k-nearest
neighbor (KNN) based metrics. We propose two new precision and recall metrics
based on
locality-sensitive hashing (LSH), which, in addition to increasing outlier
robustness, decrease the complexity of assessing an evaluation sample against
the set of reference samples.
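The abstract names clipping and quantization as the post-training compression
methods but gives no further detail in this excerpt. The following is a minimal
NumPy sketch of one common combination, percentile-based clipping followed by
uniform affine quantization of a single trained weight tensor; the function
name, the percentile clip rule, and the bit-width defaults are illustrative
assumptions, not the paper's exact procedure.

```python
import numpy as np

def clip_and_quantize(w, bits=8, clip_pct=99.9):
    """Post-training compression of one weight tensor: clip extreme values,
    then uniform affine quantization to `bits` bits (both choices are
    assumptions about the setup, not the paper's recipe)."""
    # Clipping: bound weights to a symmetric range taken from a high
    # percentile of their magnitudes, discarding extreme values.
    bound = np.percentile(np.abs(w), clip_pct)
    w_clipped = np.clip(w, -bound, bound)

    # Uniform affine quantization over [-bound, bound].
    levels = 2 ** bits - 1
    scale = 2 * bound / levels
    codes = np.round((w_clipped + bound) / scale)   # integers in [0, levels]
    w_dequantized = codes * scale - bound           # what the generator would use

    return w_dequantized, codes.astype(np.uint16)

# Example: quantize a random stand-in for a trained generator layer to 4 bits.
w = np.random.randn(512, 512).astype(np.float32)
w_hat, codes = clip_and_quantize(w, bits=4)
print("levels used:", np.unique(codes).size,
      "max abs error:", np.abs(w - w_hat).max())
```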
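Likewise, the exact LSH construction behind the proposed precision and recall
metrics is cut off in this excerpt. The sketch below assumes a standard
random-hyperplane LSH and treats shared bucket membership as the match test,
so a per-sample KNN search over all reference samples is replaced by a single
bucket lookup; the bucketing scheme and all names here are hypothetical
stand-ins for the paper's metrics.

```python
import numpy as np

def lsh_codes(x, planes, biases):
    """Random-hyperplane LSH: one sign bit per offset hyperplane,
    packed into an integer bucket id per sample."""
    bits = (x @ planes.T + biases) > 0
    return bits @ (1 << np.arange(planes.shape[0]))

def lsh_precision_recall(real, fake, n_planes=12, seed=0):
    """Illustrative LSH-bucket precision/recall: membership of a sample's
    bucket replaces a KNN search over the whole reference set."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_planes, real.shape[1]))
    biases = rng.standard_normal(n_planes)
    real_codes = lsh_codes(real, planes, biases)
    fake_codes = lsh_codes(fake, planes, biases)
    real_buckets = set(real_codes.tolist())
    fake_buckets = set(fake_codes.tolist())
    # Precision: generated samples whose bucket also holds a real sample.
    precision = float(np.mean([c in real_buckets for c in fake_codes.tolist()]))
    # Recall: real samples whose bucket also holds a generated sample.
    recall = float(np.mean([c in fake_buckets for c in real_codes.tolist()]))
    return precision, recall

# Toy check: an under-dispersed generator typically scores higher
# precision than recall under this kind of metric.
rng = np.random.default_rng(1)
real = rng.standard_normal((2000, 8))
fake = 0.5 * rng.standard_normal((2000, 8))
p, r = lsh_precision_recall(real, fake)
print(f"precision={p:.2f}  recall={r:.2f}")
```

Hashing each sample once and comparing bucket ids is what makes the
per-sample assessment cheap: the cost no longer scales with the size of the
reference set, unlike a KNN query.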