Compressing VGG for Style Transfer

[Video: style transfer results with VGG weights quantized at decreasing precision, from 32-bit float (no quantization) through 8-bit, 7-bit, 6-bit, 5-bit, 4-bit, 3-bit, and 2-bit, down to 1-bit.]

I recently implemented pastiche—discussed in a prior post—for applying neural style transfer. I encountered a size limit when uploading the library to PyPI, as a package cannot exceed 60MB. The 32-bit floating point weights for the underlying VGG model [1] were contained in an 80MB file. My request for a size limit increase was subsequently approved, which would have accommodated the VGG weights as-is, but I was still interested in compressing the model.

Various techniques have been proposed for compressing neural networks—including distillation [2] and quantization [3,4]—which have been shown to work well in the context of classification. My problem was in the context of style transfer, so I was not sure how model compression would impact the results.


I decided to experiment with weight quantization, using a scheme where I could store the quantized weights on disk, and then decompress the weights to full 32-bit floats at runtime. This scheme would allow me to continue using my existing code once the model is loaded. I was not targeting environments where memory is a constraint, so I was not particularly interested in approaches that would also reduce the model's footprint at runtime. I used kmeans1d—discussed in a prior post—for quantizing each layer's weights.
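The storage scheme can be sketched as follows. This is a simplified illustration, not the code from pastiche: it uses a plain Lloyd's-style 1D k-means (the kmeans1d package used in the post computes a globally optimal clustering via dynamic programming), and the function names `quantize`/`dequantize` are my own. Each weight is replaced on disk by a small integer index into a table of `2**bits` float32 centroids, and the full 32-bit weights are reconstructed at load time.

```python
import numpy as np

def kmeans_1d(values, k, iters=50):
    # Plain Lloyd's algorithm on a flat 1D array. kmeans1d, which the
    # post actually uses, solves this exactly via dynamic programming.
    centroids = np.quantile(values, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        idx = np.abs(values[:, None] - centroids[None, :]).argmin(axis=1)
        for j in range(k):
            members = values[idx == j]
            if members.size:
                centroids[j] = members.mean()
    return idx, centroids

def quantize(weights, bits):
    """Map each float32 weight to one of 2**bits centroids (bits <= 8)."""
    flat = weights.ravel()
    idx, centroids = kmeans_1d(flat, 2 ** bits)
    # Store small indices plus a tiny centroid table instead of raw floats.
    return idx.astype(np.uint8), centroids.astype(np.float32), weights.shape

def dequantize(idx, centroids, shape):
    """Reconstruct full 32-bit weights at load time."""
    return centroids[idx].reshape(shape).astype(np.float32)

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
idx, centroids, shape = quantize(w, bits=4)
w_hat = dequantize(idx, centroids, shape)
```

At 4 bits, each weight's index occupies one uint8 on disk (or half a byte if packed), versus four bytes for the original float32, while the centroid table adds only `2**bits` floats per layer.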


Style Transfer Medley

I used the pastiche style transfer program—discussed in a prior post—to create the video shown above. The content image is a photo I took in Boston in 2015, and the style images were randomly sampled from the test images of the Painter by Numbers Kaggle competition.

The frames used in the video were retained during gradient descent by using pastiche's --workspace option.

The Python script for generating the video is on GitHub:


🎨 pastiche

pastiche: A literary, artistic, musical, or architectural work that imitates the style of previous work.

―Merriam-Webster dictionary

Update 1/20/2021: The command line usage snippets were updated in accordance with v1.1.0.

I recently implemented pastiche, a PyTorch-based Python program for applying neural style transfer [1]. Given a content image C and a style image S, neural style transfer (NST) synthesizes a new image I that retains the content from C and style from S. This is achieved by iteratively updating I so that relevant properties of its representation within the VGG neural network [3] approach the corresponding properties for C and S.
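The objective being minimized can be sketched with a single-layer toy example. This is a hedged simplification, not pastiche's implementation: the actual method [1] combines activations from several VGG layers with per-layer weights, and the helper names (`gram_matrix`, `style_transfer_loss`) and weights `alpha`/`beta` here are illustrative. Content is matched by comparing raw activations, while style is matched by comparing Gram matrices of the activations, which capture feature correlations independent of spatial layout.

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) activation map from one VGG layer.
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Channel-by-channel correlations, normalized by the layer size.
    return f @ f.T / (c * h * w)

def style_transfer_loss(feats_I, feats_C, feats_S, alpha=1.0, beta=1e3):
    # Content term: the synthesized image I should reproduce C's activations.
    content = np.mean((feats_I - feats_C) ** 2)
    # Style term: I's feature correlations should match S's Gram matrix.
    style = np.mean((gram_matrix(feats_I) - gram_matrix(feats_S)) ** 2)
    return alpha * content + beta * style

feats = np.ones((3, 4, 4), dtype=np.float32)
zero_loss = style_transfer_loss(feats, feats, feats)
nonzero_loss = style_transfer_loss(np.zeros((3, 4, 4)), feats, feats)
```

In the full method, I is initialized (e.g., from C or from noise) and iteratively updated by gradient descent on this loss, with gradients flowing through the frozen VGG network.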

The library is available on PyPI and can be installed with pip.

$ pip3 install pastiche

The example image above was synthesized by applying the style from Vincent van Gogh’s The Starry Night to a photo I took in Boston in 2015.