Compressing VGG for Style Transfer

[Image grid: style transfer outputs using VGG weights quantized at different bit widths: 32-bit float (no quantization), 8-bit, 7-bit, 6-bit, 5-bit, 4-bit, 3-bit, 2-bit, and 1-bit.]

I recently implemented pastiche—discussed in a prior post—for applying neural style transfer. I encountered a size limit when uploading the library to PyPI, as a package cannot exceed 60MB. The 32-bit floating point weights for the underlying VGG model [1] were contained in an 80MB file. My package was subsequently approved for a size limit increase that could accommodate the VGG weights as-is, but I was still interested in compressing the model.

Various techniques have been proposed for compressing neural networks—including distillation [2] and quantization [3,4]—which have been shown to work well in the context of classification. My problem was in the context of style transfer, so I was not sure how model compression would impact the results.

Experiments

I decided to experiment with weight quantization, using a scheme where I could store the quantized weights on disk, and then decompress the weights back to full 32-bit floats at runtime. This quantization scheme would allow me to continue using my existing code after the model is loaded. I am not targeting environments where memory is a constraint, so I was not particularly interested in approaches that would also reduce the model footprint at runtime. I used kmeans1d—discussed in a prior post—for quantizing each layer’s weights.
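A sketch of that scheme is shown below. It assumes kmeans1d exposes a cluster(x, k) function returning (clusters, centroids); the exact storage format and the code used for pastiche may differ. Each layer's weights are clustered into 2^b centroids, the small integer cluster indices and the centroid codebook are what would be written to disk, and the full 32-bit weights are rebuilt from the codebook when the model is loaded.

import numpy as np
import kmeans1d  # assumed API: kmeans1d.cluster(x, k) -> (clusters, centroids)

def quantize(weights, bits):
    """Cluster a layer's float32 weights into 2**bits centroids.

    Returns per-weight cluster indices (small integers) and the centroid
    codebook, which together take far less space on disk than the raw
    weights (uint8 indices suffice for bits <= 8).
    """
    k = 2 ** bits
    clusters, centroids = kmeans1d.cluster(weights.ravel().tolist(), k)
    indices = np.asarray(clusters, dtype=np.uint8).reshape(weights.shape)
    codebook = np.asarray(centroids, dtype=np.float32)
    return indices, codebook

def dequantize(indices, codebook):
    """Reconstruct full 32-bit float weights at load time."""
    return codebook[indices]

# Example with a random convolutional weight tensor, quantized to 4 bits.
w = np.random.randn(64, 3, 3, 3).astype(np.float32)
indices, codebook = quantize(w, bits=4)
w_hat = dequantize(indices, codebook)
print("mean absolute error:", np.abs(w - w_hat).mean())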

SMAWK in C++

I recently implemented kmeans1d—discussed in a prior post—for efficiently performing globally optimal 1D k-means clustering. The implementation utilizes the SMAWK algorithm (Aggarwal et al., 1987), which calculates argmin(i), the column index of the minimum element, for each row i of an arbitrary n × m totally monotone matrix, in O(m(1 + lg(n/m))) time.

I’ve factored out my SMAWK C++ code into the example below. In general, SMAWK works with an implicitly defined matrix, utilizing a function that returns a value corresponding to an arbitrary position in the matrix. An explicitly defined matrix is used in the example for the purpose of illustration.

The program prints the column indices corresponding to the minimum element of each row in a totally monotone matrix. The matrix is from monge.pdf—a course document that I found online.
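For illustration, here is a Python sketch of the SMAWK recursion (reduce the columns, recurse on the odd-indexed rows, then interpolate the even-indexed rows). It is not the C++ example referenced above, and the small Monge matrix it runs on is a stand-in for the monge.pdf matrix.

def smawk(rows, cols, lookup, result):
    """Fill result[row] with the argmin column of each row.

    lookup(i, j) returns the entry at row i, column j of an implicitly
    defined totally monotone matrix; rows and cols are index lists.
    """
    if not rows:
        return
    # REDUCE: prune columns that cannot contain any row's minimum.
    surviving = []
    for col in cols:
        while surviving:
            row = rows[len(surviving) - 1]
            if lookup(row, col) >= lookup(row, surviving[-1]):
                break
            surviving.pop()
        if len(surviving) < len(rows):
            surviving.append(col)
    # Recurse on the odd-indexed rows.
    smawk(rows[1::2], surviving, lookup, result)
    # INTERPOLATE: fill the even-indexed rows; each argmin is bracketed by
    # the argmins of the neighboring odd-indexed rows.
    col_to_idx = {c: i for i, c in enumerate(surviving)}
    start = 0
    for r in range(0, len(rows), 2):
        row = rows[r]
        stop = len(surviving) - 1
        if r + 1 < len(rows):
            stop = col_to_idx[result[rows[r + 1]]]
        argmin, best = surviving[start], lookup(row, surviving[start])
        for c in range(start + 1, stop + 1):
            value = lookup(row, surviving[c])
            if value < best:
                argmin, best = surviving[c], value
        result[row] = argmin
        start = stop

# Illustrative Monge (hence totally monotone) matrix: M[i][j] = (j - 2*i)**2.
n, m = 4, 6
result = {}
smawk(list(range(n)), list(range(m)), lambda i, j: (j - 2 * i) ** 2, result)
print([result[i] for i in range(n)])  # [0, 2, 4, 5]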

kmeans1d: Globally Optimal Efficient 1D k‑means Clustering

I implemented kmeans1d, a Python library for performing k-means clustering on 1D data, based on the algorithm in (Xiaolin, 1991), as presented in Section 2.2 of (Grønlund et al., 2017).

Globally optimal k-means clustering is NP-hard for multi-dimensional data. Lloyd’s algorithm is a popular approach for finding a locally optimal solution. For 1-dimensional data, there are polynomial-time algorithms.

kmeans1d contains an O(kn + n log n) dynamic programming algorithm for finding the globally optimal k clusters for n 1D data points. The code is written in C++—for faster execution than a pure Python implementation—and wrapped in Python.
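A usage sketch follows, again assuming a cluster(x, k) function that returns each point's cluster assignment along with the k centroids (see the repository for the actual interface).

import kmeans1d

# Ten 1D points with four natural groups.
x = [4.0, 4.1, 4.2, -50.0, 200.2, 200.4, 200.9, 80.0, 100.0, 102.0]
k = 4

# Assumed interface: returns (clusters, centroids), where clusters[i] is the
# index of the cluster assigned to x[i].
clusters, centroids = kmeans1d.cluster(x, k)
print(clusters)
print(centroids)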

The source code is available on GitHub:
https://github.com/dstein64/kmeans1d

Style Transfer Medley

I used the pastiche style transfer program—discussed in a prior post—to create the video shown above. The content image is a photo I took in Boston in 2015, and the style images were randomly sampled from the test images of the Painter by Numbers Kaggle competition.

The frames used in the video were retained during gradient descent by using pastiche's --workspace option.

The Python script for generating the video is on GitHub:
https://gist.github.com/dstein64/5dcc67fa43cc0d13d6d4d544095a1382
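As a rough sketch of the final step, stitching a directory of retained frames into a video can be done with ffmpeg; the frame naming pattern and frame rate below are assumptions, not the linked script's actual settings.

import subprocess

# Hypothetical layout: a pastiche --workspace directory containing
# sequentially numbered frames, e.g. frame-000001.png, frame-000002.png, ...
workspace = "workspace"

# Encode the frames into an H.264 video with ffmpeg.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",
    "-i", f"{workspace}/frame-%06d.png",
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",
    "medley.mp4",
], check=True)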

🎨 pastiche

pastiche: a literary, artistic, musical, or architectural work that imitates the style of previous work.

―Merriam-Webster dictionary

I recently implemented pastiche, a PyTorch-based Python program for applying neural style transfer [1]. Given a content image C and a style image S, neural style transfer (NST) synthesizes a new image I that retains the content from C and style from S. This is achieved by iteratively updating I so that relevant properties of its representation within the VGG neural network [3] approach the corresponding properties for C and S.
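Below is a minimal PyTorch sketch of that optimization loop, assuming a standard torchvision VGG-19 with Gram-matrix style losses; it is illustrative only and is not pastiche's actual implementation (the layer indices, weights, and step counts are placeholder choices).

import torch
import torch.nn.functional as F
from torchvision import models

def gram(feat):
    # Channel correlation (Gram) matrix of a (1, C, H, W) feature map.
    _, c, h, w = feat.shape
    f = feat.view(c, h * w)
    return f @ f.t() / (c * h * w)

def style_transfer(content, style, steps=300, style_weight=1e6, lr=0.05):
    """content, style: (1, 3, H, W) tensors, normalized for VGG input."""
    vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    style_layers = (1, 6, 11, 20, 29)  # relu1_1 .. relu5_1 in vgg19.features
    content_layer = 22                 # relu4_2
    capture = set(style_layers) | {content_layer}

    def features(x):
        feats = {}
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i in capture:
                feats[i] = x
        return feats

    content_feats = features(content)
    style_grams = {i: gram(f) for i, f in features(style).items()}

    image = content.clone().requires_grad_(True)  # initialize I from C
    opt = torch.optim.Adam([image], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feats = features(image)
        content_loss = F.mse_loss(feats[content_layer], content_feats[content_layer])
        style_loss = sum(F.mse_loss(gram(feats[i]), style_grams[i])
                         for i in style_layers)
        (content_loss + style_weight * style_loss).backward()
        opt.step()
    return image.detach()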

The library is available on PyPI and can be installed with pip.

$ pip3 install pastiche

The example image above was synthesized by applying the style from Vincent van Gogh’s The Starry Night to a photo I took in Boston in 2015.