▶️ gifcast

I implemented gifcast, a web page for converting asciinema casts to animated GIFs. Here’s the link:
https://dstein64.github.io/gifcast/

The JavaScript source code is available on GitHub:
https://github.com/dstein64/gifcast

The example below was generated with gifcast.

Here is the asciinema cast file used to generate the animated GIF: gifcast.cast
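
For reference, a cast file like this can be recorded with the standard asciinema command-line tool (ordinary asciinema usage, independent of gifcast):

$ asciinema rec gifcast.cast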

Compressing VGG for Style Transfer

I recently implemented pastiche—discussed in a prior post—for applying neural style transfer. I encountered a size limit when uploading the library to PyPI, as a package cannot exceed 60MB. The 32-bit floating point weights for the underlying VGG model [1] were contained in an 80MB file. My package was subsequently approved for a size limit increase that could accommodate the VGG weights as-is, but I was still interested in compressing the model.

Various techniques have been proposed for compressing neural networks—including distillation [2] and quantization [3,4]—which have been shown to work well in the context of classification. My problem was in the context of style transfer, so I was not sure how model compression would impact the results.

Experiments

I decided to experiment with weight quantization, using a scheme where I could store the quantized weights on disk, and then uncompress the weights to full 32-bit floats at runtime. This quantization scheme would allow me to continue using my existing code after the model is loaded. I am not targeting environments where memory is a constraint, so I was not particularly interested in approaches that would also reduce the model footprint at runtime. I used kmeans1d—discussed in a prior post—for quantizing each layer’s weights.
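
The sketch below illustrates the general idea. It assumes kmeans1d exposes a cluster(x, k) function returning cluster assignments and centroids, and that 256 clusters per layer are used so each assignment fits in a byte; it is not necessarily the exact scheme used in pastiche.

    import numpy as np
    import kmeans1d

    def quantize_layer(weights, num_clusters=256):
        """Map a layer's float32 weights to byte-sized cluster indices plus a codebook."""
        assignments, centroids = kmeans1d.cluster(weights.ravel().tolist(), num_clusters)
        codes = np.array(assignments, dtype=np.uint8).reshape(weights.shape)
        codebook = np.array(centroids, dtype=np.float32)
        return codes, codebook  # store these on disk instead of the raw weights

    def dequantize_layer(codes, codebook):
        """Reconstruct full 32-bit weights at load time by indexing into the codebook."""
        return codebook[codes]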

Continue reading

SMAWK in C++

I recently implemented kmeans1d—discussed in a prior post—for efficiently performing globally optimal 1D k-means clustering. The implementation utilizes the SMAWK algorithm (Aggarwal et al., 1987), which computes, for each row i of an arbitrary n × m totally monotone matrix, the column index argmin(i) of that row's minimum element, in O(m(1 + lg(n/m))) time.

I’ve factored out my SMAWK C++ code into the example below. In general, SMAWK works with an implicitly defined matrix, utilizing a function that returns a value corresponding to an arbitrary position in the matrix. An explicitly defined matrix is used in the example for the purpose of illustration.

The program prints the column indices corresponding to the minimum element of each row in a totally monotone matrix. The matrix is from monge.pdf—a course document that I found online.
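
Separately from that C++ example, here is a compact Python sketch of the same recursive scheme (REDUCE, recurse on the odd-indexed rows, then interpolate the even-indexed rows). It uses a small Monge matrix rather than the one from monge.pdf and is illustrative only, not the factored-out code:

    # Illustrative Python sketch of SMAWK (not the post's C++ code).
    # Computes, for each row of a totally monotone matrix, the column index
    # of that row's minimum element. The matrix is accessed only through
    # lookup(row, col), so it can be defined implicitly.

    def smawk(rows, cols, lookup, result=None):
        if result is None:
            result = {}
        if not rows:
            return result
        # REDUCE: discard columns that cannot hold any row's minimum.
        stack = []
        for col in cols:
            while stack:
                row = rows[len(stack) - 1]
                if lookup(row, col) >= lookup(row, stack[-1]):
                    break
                stack.pop()
            if len(stack) < len(rows):
                stack.append(col)
        cols = stack
        # Recurse on the odd-indexed rows.
        smawk(rows[1::2], cols, lookup, result)
        # INTERPOLATE: an even-indexed row's answer lies between its neighbors' answers.
        col_index = {c: i for i, c in enumerate(cols)}
        start = 0
        for i in range(0, len(rows), 2):
            row = rows[i]
            stop = col_index[result[rows[i + 1]]] if i + 1 < len(rows) else len(cols) - 1
            result[row] = min(cols[start:stop + 1], key=lambda c: lookup(row, c))
            start = stop
        return result

    # Example: M[i][j] = (i - j)**2 is a Monge matrix, hence totally monotone;
    # each row's minimum is on the diagonal.
    matrix = [[(i - j) ** 2 for j in range(5)] for i in range(5)]
    argmins = smawk(list(range(5)), list(range(5)), lambda i, j: matrix[i][j])
    print(sorted(argmins.items()))  # [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]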

Continue reading

kmeans1d: Globally Optimal Efficient 1D k-means Clustering

I implemented kmeans1d, a Python library for performing k-means clustering on 1D data, based on the algorithm in (Xiaolin 1991), as presented in section 2.2 of (Grønlund et al., 2017).

Globally optimal k-means clustering is NP-hard for multi-dimensional data. Lloyd's algorithm is a popular approach for finding a locally optimal solution. For 1-dimensional data, there are polynomial-time algorithms.

kmeans1d contains an O(kn + n log n) dynamic programming algorithm for finding the globally optimal k clusters for n 1D data points. The code is written in C++—for faster execution than a pure Python implementation—and wrapped in Python.

The source code is available on GitHub:
https://github.com/dstein64/kmeans1d
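
As a quick illustration of intended usage—this snippet assumes the package exposes a cluster(x, k) function returning cluster assignments and centroids; consult the repository for the current interface:

    import kmeans1d

    x = [4.0, 4.1, 4.2, -50.0, 200.2, 200.4, 200.9, 80.0, 100.0, 102.0]
    k = 4

    # clusters[i] is the cluster index assigned to x[i]; centroids has k entries.
    clusters, centroids = kmeans1d.cluster(x, k)
    print(clusters)
    print(centroids)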

Continue reading

Style Transfer Medley

I used the pastiche style transfer program—discussed in a prior post—to create the video shown above. The content image is a photo I took in Boston in 2015, and the style images were randomly sampled from the test images of the Painter by Numbers Kaggle competition.

The frames used in the video were retained during gradient descent by using pastiche's --workspace option.

The Python script for generating the video is on GitHub:
https://gist.github.com/dstein64/5dcc67fa43cc0d13d6d4d544095a1382

🎨 pastiche

pastiche: A literary, artistic, musical, or architectural work that imitates the style of previous work.

―Merriam-Webster dictionary

I recently implemented pastiche, a PyTorch-based Python program for applying neural style transfer [1]. Given a content image C and a style image S, neural style transfer (NST) synthesizes a new image I that retains the content from C and style from S. This is achieved by iteratively updating I so that relevant properties of its representation within the VGG neural network [3] approach the corresponding properties for C and S.
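
As a rough sketch of the optimization objective—a simplified version of the standard NST losses, not necessarily pastiche's exact formulation—the content term compares feature maps directly and the style term compares their Gram matrices:

    import torch
    import torch.nn.functional as F

    def gram_matrix(features):
        # features: (channels, height, width) activation from one VGG layer
        c, h, w = features.shape
        flat = features.view(c, h * w)
        return flat @ flat.t() / (c * h * w)

    def nst_loss(image_feats, content_feats, style_feats,
                 content_weight=1.0, style_weight=1e6):
        # Each argument is a list of per-layer VGG feature maps for I, C, and S.
        content_loss = sum(F.mse_loss(f, c) for f, c in zip(image_feats, content_feats))
        style_loss = sum(F.mse_loss(gram_matrix(f), gram_matrix(s))
                         for f, s in zip(image_feats, style_feats))
        return content_weight * content_loss + style_weight * style_loss

    # I is updated by gradient descent on this loss while C and S stay fixed.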

The library is available on PyPI and can be installed with pip.

$ pip3 install pastiche

The example image above was synthesized by applying the style from Vincent van Gogh’s The Starry Night to a photo I took in Boston in 2015.

Continue reading

Debugging in Vim

Vim 8.1 was released about a year ago, in May 2018. The “main new feature” was official support for running a terminal within Vim. Along with this came a built-in debugger plugin, termdebug, which provides a visual interface for interacting with gdb. This post walks through an example session using termdebug.
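
For orientation, a session can be started with something along these lines, where my_program is a placeholder for an executable built with debugging symbols (e.g., compiled with -g); the full walkthrough follows.

    " Load the bundled plugin, then launch the debugger on a program.
    :packadd termdebug
    :Termdebug ./my_program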

Continue reading

🔐 LC4

About a year ago I wrote a Python library that implements ElsieFour (LC4) encryption (Alan Kaminsky 2017). LC4 is designed for human-to-human communication, without requiring a computer.

I’ve recently updated the library to include color-coded verbose output that shows the steps of the algorithm. This can be helpful for learning to manually encrypt and decrypt messages. The verbose output is accessible through both the Python API and the command-line interface (using --verbose).

Continue reading

⌨️🏌️🐍 vimgolf Client in Python

I implemented a vimgolf client in Python.

The source code is available on GitHub:
https://github.com/dstein64/vimgolf

The user interface is similar to the official vimgolf client, with a few additions inspired by vimgolf-finder.

The package is available on PyPI, the Python Package Index. It can be installed with pip.

$ pip3 install vimgolf

More Bézier Walks in Neural Networks

The videos above were generated using the same script described in an earlier post.

Continue reading
