Infra-red light internal reflection

I’ve been researching the creation of a large touch table platform that allows groups of users to interact with one another at the same time. The video below is a ‘show and tell’ of my initial successes in developing an early prototype of the platform.

How it works

This diagram shows how I have constructed the screen. At the central layer of the screen is a thick sheet of perspex (acrylic). The perspex is surrounded by 300 infra-red LEDs. When the LEDs shine into the edges of the sheet, the light is trapped inside it by a physical process known as ‘total internal reflection’: the light reflects off the outer surfaces of the perspex continuously and cannot escape. It stays trapped until something makes contact with the surface (i.e. a finger!), at which point the reflection is ‘frustrated’ and the light scatters off the object and out of the perspex. This is why the technique is known as Frustrated Total Internal Reflection (FTIR).
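Whether light stays trapped depends on the angle at which it strikes the surface, via the refractive index of the acrylic. As a quick illustration (assuming a typical refractive index of about 1.49 for perspex/acrylic against air), the critical angle beyond which total internal reflection occurs can be computed like this:

```python
import math

def critical_angle_deg(n_inner: float, n_outer: float = 1.0) -> float:
    """Angle of incidence (measured from the surface normal) above which
    light is totally internally reflected at the inner/outer boundary."""
    return math.degrees(math.asin(n_outer / n_inner))

# Perspex (acrylic) has a refractive index of roughly 1.49.
angle = critical_angle_deg(1.49)
print(f"Critical angle for acrylic/air: {angle:.1f} degrees")
```

Light from the edge-mounted LEDs strikes the large faces of the sheet at very shallow angles, well beyond this critical angle (roughly 42 degrees), so it bounces between the faces until a fingertip disrupts the boundary.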

Light detection

Once the light scatters from a finger and out of the perspex, it can be detected by an infra-red camera. To create the infra-red camera I removed the infra-red filter from a standard web-camera, and added my own daylight filter to block all light hitting the sensor apart from the infra-red light reflected from the finger. The daylight filter consists of a section of the magnetic disk from inside a floppy disk, together with some photographic film. The camera is then able to see each finger touch as a blob of light surrounded by darkness.
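Turning those camera frames into touch points is essentially a threshold-and-connected-components problem. This is a minimal sketch of the idea in plain Python (representing a grayscale frame as a list of rows of 0–255 values), not the actual pipeline the tracking software uses:

```python
def find_blobs(frame, threshold=128):
    """Threshold a grayscale frame and return the centroid and area of
    each connected bright region (a candidate touch blob)."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill this bright region to collect its pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    py, px = stack.pop()
                    pixels.append((py, px))
                    for ny, nx in ((py-1, px), (py+1, px), (py, px-1), (py, px+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cx, cy, len(pixels)))  # centroid + area in pixels
    return blobs
```

Filtering the returned blobs by area is what separates real fingertips from sensor noise and stray reflections.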

The visual display

Once the perspex is set up with its surround of infra-red LEDs, it is placed over a sheet of frosted glass. A projector is angled toward a mirror, which reflects the image onto the frosted glass. The infra-red camera is also angled at the mirror, so that it can detect the blobs of light from the users’ finger touches.
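Because the camera views the screen at an angle via the mirror, a point in the camera image does not line up directly with the projected image; the tracking software corrects this with a calibration step. As a rough stand-in for the grid calibration real trackers perform, here is a simple bilinear mapping from normalised camera coordinates to screen coordinates, given the screen positions of four calibration corners (the names and resolution here are illustrative, not taken from my setup):

```python
def make_calibration(corners):
    """Return a function mapping a normalised camera point (u, v) in
    [0,1] x [0,1] onto screen coordinates, given the screen positions of
    the four corners: top-left, top-right, bottom-right, bottom-left."""
    (tlx, tly), (trx, trY), (brx, bry), (blx, bly) = corners

    def to_screen(u, v):
        # Interpolate along the top and bottom edges, then between them.
        top = (tlx + (trx - tlx) * u, tly + (trY - tly) * u)
        bot = (blx + (brx - blx) * u, bly + (bry - bly) * u)
        return (top[0] + (bot[0] - top[0]) * v,
                top[1] + (bot[1] - top[1]) * v)

    return to_screen

# Example: a 1024x768 projection with perfectly aligned corners.
cal = make_calibration([(0, 0), (1024, 0), (1024, 768), (0, 768)])
```

In practice the corners are found by asking the user to touch calibration targets, and the software interpolates over a finer grid to absorb lens and mirror distortion.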

From blobs to touch events

The laptop computer then runs a set of open source software packages that allow the blobs of light to be interpreted as mouse movements and clicks. The software also calibrates the alignment of the screen projection to the alignment of the web-camera. The open source packages I have used include Community Core Vision 1.3 Beta, Touchlib and V Mouse.
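Under the hood, turning blobs into mouse events means tracking each blob from frame to frame and emitting press, move and release events as it appears, moves and disappears. This is a hedged sketch of that state machine (my own simplified illustration, not code from any of the packages above):

```python
import math

class TouchTracker:
    """Match blobs across frames by nearest-neighbour distance and emit
    'down' / 'move' / 'up' events, the way a blob tracker feeds a
    mouse-emulation layer."""

    def __init__(self, max_jump=50.0):
        self.max_jump = max_jump   # max pixels a touch may move per frame
        self.touches = {}          # touch id -> last (x, y) position
        self.next_id = 0

    def update(self, blobs):
        """blobs: list of (x, y) centroids seen this frame.
        Returns a list of (event, touch_id, x, y) tuples."""
        events = []
        unmatched = dict(self.touches)
        new_touches = {}
        for (x, y) in blobs:
            # Find the closest surviving touch within max_jump.
            best, best_d = None, self.max_jump
            for tid, (px, py) in unmatched.items():
                d = math.hypot(x - px, y - py)
                if d < best_d:
                    best, best_d = tid, d
            if best is None:
                best = self.next_id          # a brand-new touch: press
                self.next_id += 1
                events.append(("down", best, x, y))
            else:
                del unmatched[best]          # an existing touch: drag
                events.append(("move", best, x, y))
            new_touches[best] = (x, y)
        for tid, (px, py) in unmatched.items():
            events.append(("up", tid, px, py))  # vanished touch: release
        self.touches = new_touches
        return events
```

A single-touch version of this, with ‘down’ mapped to a mouse press, is all a basic mouse-emulation layer needs.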

The Libyan Conflict

Using the large touch platform, I produced a multi-user interactive piece that leads users through the events that took place at the onset of international intervention in the 2011 Libyan civil war. Users could work together as heads of government, ordering in airstrikes and signing UN Security Council resolutions.

On every user touch, live feeds from Twitter concerning the conflict join the table. The users may also feel a sense of guilt over their actions, as each time they touch the surface they bring photographs of new war victims to the table.

The launch

The platform was displayed in a gallery space in Bristol, UK, and attracted many groups of users to get involved and have a play. People were excited by the possibilities of the technology, and had many questions for me to answer. Producing this project was extremely hard work, and I had to overcome many difficulties along the way. Needless to say, though, it was a very rewarding process, during which I learned a huge amount as I pushed myself beyond my own technical limitations. I’m very keen to collaborate with others in the future, and pursue more creative technology experiments like this.



Joe Allison