Possibly Wrong

Hilbert halftone art

Introduction

This post was motivated by a recent attempt to transform a photograph into a large digital print, in what I hoped would be a mathematically interesting way. The idea was pretty simple: convert the (originally color) image into a black-on-white line drawing, with a white background and a single continuous, convoluted black curve, one pixel wide.

This isn’t a new idea. One example of how to do this is to convert the image into an instance of the traveling salesman problem, with more “cities” clustered in darker regions of the source image, and to draw the resulting tour as the curve. But I wanted to do something slightly different, with more explicitly visible structure… which doesn’t necessarily translate to more visual appeal: draw a Hilbert space-filling curve, but vary the order of the curve (roughly, the depth of recursion) locally according to the gray level of the corresponding pixels of the source image.

After some experimenting, I settled on the transformation described in the figure below. Each pixel of the source image is “inflated” to an 8-by-8 block of pixels in the output. A black pixel (lower left) is represented by a full second-order Hilbert curve visiting all 16 points of the block, a white pixel (upper left) by just a line segment directly connecting the endpoints of the block, and the two intermediate gray levels by paths connecting progressively more of those points as the pixel gets darker.

Conversion of 2×2-pixel image, with each of four gray levels mapped to corresponding 8×8 block of (approximated) Hilbert curve.
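
One property of this encoding is worth noting, and is visible in the step patterns used in the source code at the end of the post: every gray level advances the underlying Hilbert curve index by exactly 16 per pixel, so neighboring blocks always join up and the whole drawing remains a single continuous curve. Here is a quick sanity check of that invariant, with the step patterns copied from the code below:

# Each pattern, plus the single step taken to enter a block, spans exactly
# 16 curve indices, so blocks of different gray levels still connect.
STEPS = {'black': [1] * 15,
         'dark gray': [1, 1, 1, 1, 4, 1, 1, 5],
         'light gray': [4, 7, 4],
         'white': [15]}

for level, steps in STEPS.items():
    assert 1 + sum(steps) == 16
    print(level, 'draws', 1 + len(steps), 'segments per 8-by-8 block')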

Example

The figure below shows an example of creating an image for input to the algorithm. There are two challenges to consider: reducing the resolution of the original image (here to 128 by 128 pixels), and reducing its colors to just the four gray levels used by the conversion. A sketch of one way to do both steps in code follows the figure.

Conversion of original color image to 128-by-128 image with 4 gray levels.
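
The post doesn’t say what tool was used for this preprocessing; as one possibility, here is a minimal sketch using Pillow (an assumption on my part) that downscales a photo and quantizes it to the four evenly spaced gray levels (0, 85, 170, 255) expected by the conversion code below. The filenames are placeholders.

# A sketch of one way to prepare the input image (assumed tooling: Pillow).
from PIL import Image

def prepare(filename, size=128):
    img = Image.open(filename).convert('L')           # to grayscale
    img = img.resize((size, size), Image.LANCZOS)     # downscale
    # Quantize to the four evenly spaced levels 0, 85, 170, 255.
    return img.point(lambda v: (v * 4 // 256) * 85)

if __name__ == '__main__':
    prepare('photo.jpg').save('photo_4gray.png')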

The figure below shows the resulting output. You may have to zoom in to see the details of the curve, especially in the black regions.

Black and white 1024-by-1024 image containing a single black Hilbert curve, one pixel in width.

Source code

Following is the Python source that does the transformation. It uses the Hilbert curve encoding module from my allRGB experiment, and I used Pygame for the image formatting so that I could watch the output as it was created.
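
The hilbert module itself isn’t reproduced here; the code only relies on hilbert.Hilbert(2) exposing an encode(index) method that maps a one-dimensional curve index to (x, y) coordinates. If you don’t have the module from the allRGB post, the following is a minimal stand-in with that interface (my sketch, not the original module), using the standard iterative index-to-coordinate conversion for a 2D Hilbert curve of fixed order; it may not trace exactly the same path as the original module, but it has the properties the code needs. Save it as hilbert.py.

# hilbert.py -- a minimal stand-in (an assumption, not the module from the
# allRGB post): Hilbert(2).encode(i) maps a curve index i to integer (x, y)
# coordinates on a 2D Hilbert curve of fixed order.

class Hilbert:
    def __init__(self, dimension, order=16):
        if dimension != 2:
            raise ValueError('this stand-in only handles 2 dimensions')
        self.n = 2 ** order     # curve fills an n-by-n grid; order 16 is
                                # plenty for any reasonable image size

    def encode(self, index):
        # Standard iterative conversion of a curve index to (x, y).
        x = y = 0
        t = index
        s = 1
        while s < self.n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:
                if rx == 1:     # reflect this quadrant
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x     # then transpose it
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return (x, y)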


import hilbert
import pygame

# The four RGBA gray levels expected in the (preprocessed) source image.
BLACK = (0, 0, 0, 255)
DARK_GRAY = (85, 85, 85, 255)
LIGHT_GRAY = (170, 170, 170, 255)
WHITE = (255, 255, 255, 255)

# For each gray level, the sequence of curve index increments used to draw one
# pixel's block; together with the single move into the block, each sequence
# spans exactly 16 curve indices, so adjacent blocks always join up.
STEPS = {BLACK: [1] * 15,
         DARK_GRAY: [1, 1, 1, 1, 4, 1, 1, 5],
         LIGHT_GRAY: [4, 7, 4],
         WHITE: [15]}

class Halftone:
    def __init__(self, image, step):
        self.h = hilbert.Hilbert(2)   # 2-dimensional Hilbert curve
        self.index = -1               # current index along the curve
        self.pos = (0, 0)             # current (x, y) point on the curve
        width, height = image.get_size()
        # Each source pixel becomes a (4 * step)-by-(4 * step) block of output pixels.
        self.target = pygame.Surface((step * 4 * width, step * 4 * height))
        self.target.fill(WHITE)
        for pixel in range(width * height):
            # Step into the next pixel's 4-by-4 block of curve points, look up
            # that pixel's gray level, and draw the corresponding step pattern.
            self.move(1, step)
            for n in STEPS[tuple(image.get_at([w // 4 for w in self.pos]))]:
                self.move(n, step)

    def move(self, n, step):
        # Advance n positions along the Hilbert curve, drawing a straight
        # segment from the current point to the new one.
        self.index = self.index + n
        next_pos = self.h.encode(self.index)
        pygame.draw.line(self.target, BLACK, [step * w for w in self.pos],
                                             [step * w for w in next_pos])
        self.pos = next_pos

if __name__ == '__main__':
    # Command line: the first argument is the drawing scale ("step", in output
    # pixels per curve grid unit); remaining arguments are images to convert.
    import sys
    step = int(sys.argv[1])
    for filename in sys.argv[2:]:
        pygame.image.save(Halftone(pygame.image.load(filename), step).target,
                          filename + '.ht.png')
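
For example, assuming the script above is saved as halftone.py and the preprocessed 128-by-128 image from the earlier sketch is photo_4gray.png (both filenames are my placeholders), a step of 2 gives 8-by-8 output blocks per source pixel and thus the 1024-by-1024 result shown above:

python halftone.py 2 photo_4gray.png

The output is written alongside the input as photo_4gray.png.ht.png.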