How does it work?

Digital photographs are made up of individual pixels, and each one is split into red, green and blue components. When you take a photo, your camera measures the amount of red, green and blue light hitting each pixel, scores each on a scale from 0 to 255, and then records those three values.
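If you want to poke at those values yourself, here's a minimal sketch using Python and the Pillow imaging library (my own choice of tools, nothing official from the project); "photo.jpg" is just a placeholder filename:

```python
# Minimal sketch: read the red, green and blue values of a single pixel.
# Assumes Pillow is installed; "photo.jpg" is a placeholder filename.
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")
width, height = img.size

# Each pixel is stored as three numbers from 0 to 255: red, green, blue.
r, g, b = img.getpixel((0, 0))  # top-left pixel
print(f"Image is {width}x{height} pixels; top-left pixel is R={r}, G={g}, B={b}")
```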

Matt Parker, who’s involved in the MegaPixel project, has created a photo-to-spreadsheet converter which lets you see exactly how this process works – you can upload a photo and download it as an MS Excel spreadsheet containing those number values. Since the cells are coloured according to their brightness values, you can actually see the photo if you zoom out far enough!
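The basic idea is simple enough to sketch in a few lines. The snippet below (my own rough version, not Matt Parker's actual converter) writes one colour channel of a photo into a CSV file that any spreadsheet program can open; the filenames are placeholders:

```python
# Rough sketch: dump every pixel's red value into a CSV that a spreadsheet can open.
# Not the project's actual converter; "photo.jpg" and "red_channel.csv" are placeholders.
import csv
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")
width, height = img.size

with open("red_channel.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for y in range(height):
        # One spreadsheet row per row of pixels; one cell per pixel's red value.
        writer.writerow([img.getpixel((x, y))[0] for x in range(width)])
```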


iPhone screen under a microscope. Image by Jamie Gallagher.

Phone screens, TVs, computer monitors and even digital projectors all display images using red, green and blue light. In many LCD (liquid crystal display) screens, the red, green and blue light is simply arranged in blocks, and the brightness of each segment depends on how much red, green or blue needs to be displayed at that point.
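You can mimic what the microscope photo shows with a few lines of code. This toy sketch (purely illustrative, not how any real display driver works; Pillow and the filenames are my own assumptions) blows each pixel up into a red, a green and a blue stripe whose brightnesses come from that pixel's three values:

```python
# Toy illustration of subpixels: expand each pixel into three coloured stripes.
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")  # placeholder filename
width, height = img.size
out = Image.new("RGB", (width * 3, height))

for y in range(height):
    for x in range(width):
        r, g, b = img.getpixel((x, y))
        out.putpixel((x * 3,     y), (r, 0, 0))  # red stripe
        out.putpixel((x * 3 + 1, y), (0, g, 0))  # green stripe
        out.putpixel((x * 3 + 2, y), (0, 0, b))  # blue stripe

out.save("subpixels.png")  # zoomed out, this looks just like the original photo
```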

The MegaPixel is an attempt to recreate this – on a much larger scale! Each of our ‘pixels’ measures around 8cm across, and is split into red, green and blue segments. Each segment simply needs to be coloured in using a grid of 100 squares – the number of squares you fill in with that colour determines how much red, green or blue that pixel displays, and the rest of the grid is coloured black.
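One natural way to go from a photo's 0–255 values to a 100-square grid is a simple linear scaling – this is my own arithmetic based on the description above, not an official MegaPixel tool:

```python
# Map a 0-255 channel value to a count of coloured squares in a 10x10 grid
# (an assumed linear scaling, not necessarily the project's exact rule).
def squares_to_colour(value: int) -> int:
    return round(value * 100 / 255)

# e.g. a pixel with R=255, G=128, B=0 (a bright orange):
print(squares_to_colour(255), squares_to_colour(128), squares_to_colour(0))  # 100 50 0
```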

The pixels will be arranged in the window at the Museum of Science and Industry in Manchester, so that by standing far away from the display, you’ll be able to see the effect of all the pixels merging together to display a photo. Fingers crossed!
