Color quantization is a process by which we reduce the number of colors in a given image and generate a palette. The main historical reason to derive a palette from an image was that early graphics adapters could not handle full 24-bit RGB images due to memory limitations. However, palettes have other interesting applications, such as retrieving a set of visually pleasing colors from a photograph or a painting.
This idea is explored by Scott Draves in his inspiring talk The Electric Sheep and their Dreams in High Fidelity. The problem he is aiming at is how to generate random colors that do not look like pure noise. So, instead of generating color values such as [ random red, random green, random blue ], we generate a palette of N values and choose an index into that palette according to some criterion.
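As a minimal sketch of that idea in plain Java (the palette entries below are hypothetical values, just for illustration), drawing a random index into a fixed palette keeps every color inside a coherent scheme, whereas three independent random channels would produce noise:

```java
import java.util.Random;

public class PaletteChoice {
    // A small hand-picked palette; these particular values are
    // illustrative only, not taken from any real image.
    static final int[][] PALETTE = {
        { 0x26, 0x3c, 0x41 },
        { 0x3a, 0x6b, 0x6e },
        { 0xd9, 0xb0, 0x8c },
        { 0xf2, 0xe3, 0xc6 }
    };

    // Instead of [ random red, random green, random blue ], pick a
    // random *index* into the palette, so every drawn color is one of
    // the N palette entries.
    static int[] randomColor(Random rnd) {
        return PALETTE[rnd.nextInt(PALETTE.length)];
    }

    public static void main(String[] args) {
        int[] c = randomColor(new Random());
        System.out.printf("rgb(%d, %d, %d)%n", c[0], c[1], c[2]);
    }
}
```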
Generating a palette from a given image is a clustering problem: we group all the points in RGB space into N (typically 256) clusters, or palette entries. Rather than subdividing the whole 256 * 256 * 256 color space, we can do better by first sampling the image to retrieve the set of colors actually used, and then running our clustering pass over that set. There are several variants of color quantization algorithms, and below I provide an implementation of Dan Bloomberg's modified median cut.
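To make the clustering idea concrete, here is a sketch of plain median cut in Java. It is a simplified version, not Bloomberg's full modified variant (which also weighs boxes by pixel population when deciding what to split): repeatedly split the box with the widest channel range at its median, then average each box into one palette entry.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Random;

public class MedianCut {
    // Plain median cut: recursively split the set of RGB points along
    // the channel with the largest range until we have n boxes, then
    // average each box into a single palette color.
    static List<int[]> palette(List<int[]> pixels, int n) {
        List<List<int[]>> boxes = new ArrayList<>();
        boxes.add(new ArrayList<>(pixels));
        while (boxes.size() < n) {
            // Find the box and channel with the widest value range.
            int best = -1, bestChan = 0, bestRange = -1;
            for (int i = 0; i < boxes.size(); i++) {
                List<int[]> box = boxes.get(i);
                if (box.size() < 2) continue; // cannot split further
                for (int c = 0; c < 3; c++) {
                    int lo = 255, hi = 0;
                    for (int[] p : box) {
                        lo = Math.min(lo, p[c]);
                        hi = Math.max(hi, p[c]);
                    }
                    if (hi - lo > bestRange) {
                        bestRange = hi - lo;
                        best = i;
                        bestChan = c;
                    }
                }
            }
            if (best < 0) break; // all boxes are single points
            List<int[]> box = boxes.remove(best);
            final int chan = bestChan;
            // Sort along the chosen channel and split at the median.
            box.sort(Comparator.comparingInt((int[] p) -> p[chan]));
            int mid = box.size() / 2;
            boxes.add(new ArrayList<>(box.subList(0, mid)));
            boxes.add(new ArrayList<>(box.subList(mid, box.size())));
        }
        // Average each box into one palette entry.
        List<int[]> result = new ArrayList<>();
        for (List<int[]> box : boxes) {
            long r = 0, g = 0, b = 0;
            for (int[] p : box) { r += p[0]; g += p[1]; b += p[2]; }
            int s = box.size();
            result.add(new int[] { (int) (r / s), (int) (g / s), (int) (b / s) });
        }
        return result;
    }

    public static void main(String[] args) {
        // Random points stand in for sampled image pixels.
        List<int[]> pixels = new ArrayList<>();
        Random rnd = new Random(42);
        for (int i = 0; i < 1000; i++)
            pixels.add(new int[] { rnd.nextInt(256), rnd.nextInt(256), rnd.nextInt(256) });
        System.out.println("palette size: " + palette(pixels, 8).size());
    }
}
```

In a real quantizer the input list would come from sampling the image, as described above, rather than from a random generator.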
Built with Processing