The curious rise of the pixel shader

As Twitter adds Instagram-like filters to its official clients, I’m left reflecting that the most successful application for graphics accelerators in the market today, after games, is cheap pixel shaders run against people’s photos on their phones.

The Instagram, and now Twitter, filters are some of the most computationally cheap and uninteresting things you can do with a GPU today, yet they’re some of the biggest crowd pleasers in modern computing. People love destroying the raw data captured from their phone camera’s sensor with the same filters that hundreds of millions of other people have access to. Pictures are memorable not because of their content but because you can tell they were post-processed by Instagram.

The hardware in a modern phone is capable of some incredible feats of photograph-enhancing post-processing, whether that’s correction, content-aware resizing, or performance wins like faster HDR processing. Sepia and desaturation — less than a dozen instructions per pixel in each case — win out though.
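To make the “less than a dozen instructions per pixel” claim concrete, here’s a minimal per-pixel sketch in Python of the two filters named above. The weights are the widely used conventions (a common sepia matrix, and Rec. 601 luma for desaturation), not values taken from Instagram or Twitter; on a GPU each of these is just a few dot products and a clamp per fragment.

```python
def sepia(r, g, b):
    # Common sepia weighting (a convention, not Instagram's actual matrix):
    # three dot products plus a clamp -- roughly a dozen shader instructions.
    return (
        min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
        min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
        min(255, int(0.272 * r + 0.534 * g + 0.131 * b)),
    )

def desaturate(r, g, b):
    # Rec. 601 luma weights: a single dot product per pixel.
    y = int(0.299 * r + 0.587 * g + 0.114 * b)
    return (y, y, y)
```

In a real mobile pipeline this arithmetic would live in a fragment shader and run over every pixel in parallel, which is why these filters feel instant even on modest hardware.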

There must be a market for a product that really opens up the post-processing options: one that delivers the obviously-filtered look while handing you far more control and creative input into how the picture finally turns out, with the GPU making all of it fast.

It could be a killer feature for the next big release of Android or iOS: a platform-native camera application that is heavily GPU accelerated, with post-processing filters and tools built right into the native image capture pipeline, before the result is shared out to whatever networks you plug in via apps.