Build Your Own Multitouch Surface Computer

Software


So now we’ve completed the hardware section. The process of building a multi-touch computer is far from over, though: we still need to get the software installed and configured so that we can actually use the thing.

The central software that powers our rig is Touchlib, an open source library that takes the visual data received by the camera and parses it into touch events, which other programs can use to provide multi-touch control. Some programs implement this library directly, allowing for standalone multi-touch apps, while others, such as those written in ActionScript, require an extra software layer before they can receive touch input. In this section, we’ll explain how we got both up and running.
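To make that architecture concrete, here's a rough sketch of what a native Touchlib app looks like, paraphrased from the demo apps that ship with the library. Exact header and method names may vary between Touchlib revisions, so treat this as illustrative rather than copy-paste-ready:

    // Sketch of a native Touchlib client, modeled on the bundled demo apps.
    // Touchlib delivers touches through callbacks on an ITouchListener.
    #include <cstdio>
    #include "TouchScreenDevice.h"
    using namespace touchlib;

    class MyApp : public ITouchListener
    {
    public:
        // Touchlib calls these as blobs appear, move, and disappear.
        virtual void fingerDown(TouchData data)
            { printf("down: id=%d at (%f, %f)\n", data.ID, data.X, data.Y); }
        virtual void fingerUpdate(TouchData data)
            { /* a touch moved -- update drags, gestures, etc. */ }
        virtual void fingerUp(TouchData data)
            { printf("up: id=%d\n", data.ID); }
    };

    int main()
    {
        ITouchScreen *screen = TouchScreenDevice::getTouchScreen();
        screen->loadConfig("config.xml");   // the file ConfigApp.exe writes

        MyApp app;
        screen->registerListener(&app);

        screen->beginProcessing();          // start the camera/filter chain
        screen->beginTracking();            // start blob tracking

        for (;;)
            screen->getEvents();            // pump the finger callbacks

        return 0;
    }

Standalone apps like the smoke demo we'll run later are built on this pattern; Flash apps can't link against the library directly, which is why they need the extra layer we cover below.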

1. First, we downloaded the PS3 Eye DirectShow drivers, written by Alex Popovich, here.

2. We ran the installer, then plugged the PS3 Eye into a USB port.

3. To confirm that the camera itself worked, we opened the test application installed with the drivers and checked that it displayed live video from the camera.

4. Now that we know the camera works, we need to check that the DirectShow filter works, which is what allows other programs to access the PS3 Eye. Another program called AmCap is installed with the drivers. Run it, and if the preview window shows what the camera is seeing, we’re golden. If it doesn’t, try unplugging the PS3 Eye and plugging it back in, as well as rebooting your computer. For reasons unknown, we got stuck at this stage the first time we tried; running the uninstall program and then installing the drivers again fixed the problem.
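If you'd rather check programmatically whether the filter registered, the sketch below enumerates DirectShow video capture devices and prints their names, which is essentially what AmCap does before opening its preview. It uses only stock COM/DirectShow calls; the PS3 Eye should show up under its driver's friendly name:

    // Sketch: list all DirectShow video capture devices. If the PS3 Eye's
    // filter registered correctly, it will appear in the output.
    #include <dshow.h>
    #include <cstdio>
    #pragma comment(lib, "strmiids.lib")
    #pragma comment(lib, "ole32.lib")
    #pragma comment(lib, "oleaut32.lib")

    int main()
    {
        CoInitialize(NULL);

        ICreateDevEnum *devEnum = NULL;
        CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
                         IID_ICreateDevEnum, (void**)&devEnum);

        IEnumMoniker *enumMoniker = NULL;
        if (devEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
                                           &enumMoniker, 0) == S_OK)
        {
            IMoniker *moniker = NULL;
            while (enumMoniker->Next(1, &moniker, NULL) == S_OK)
            {
                IPropertyBag *props = NULL;
                moniker->BindToStorage(0, 0, IID_IPropertyBag, (void**)&props);

                VARIANT name;
                VariantInit(&name);
                if (SUCCEEDED(props->Read(L"FriendlyName", &name, 0)))
                    wprintf(L"Found capture device: %s\n", name.bstrVal);

                VariantClear(&name);
                props->Release();
                moniker->Release();
            }
            enumMoniker->Release();
        }
        devEnum->Release();
        CoUninitialize();
        return 0;
    }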

Now we had our PS3 Eye up and running. Next, we had to get Touchlib set up to handle our touch detection. Touchlib can be found here, and doesn’t need to be installed; we simply extracted the files to C:/Multitouch.

We had to replace certain Touchlib files with ones specifically designed to work well with the PS3 Eye. We downloaded these files here, then extracted them to C:/Multitouch/Touchlib, overwriting when we were prompted to do so.

Now, at long last, we’ll get to see how our surface actually works. We ran ConfigApp.exe from C:/Multitouch/Touchlib. This program launches a total of eight windows, six of which show the video stream from the PS3 Eye at different stages of processing. It’s a little overwhelming the first time, but it’s actually not that hard to use these windows to get Touchlib properly configured to do touch recognition on our setup.

The most important window is the one in the bottom right, with the slider marked “Rectify.” This window displays the “blobs” that will get passed as touch events. When the touch surface is working properly, this window will be entirely black until you touch the screen, at which point a white blob will appear, hopefully without flickering. Starting with the window in the bottom left and moving right, we adjusted all the sliders until we got the clearest blobs when we touched the screen. Generally, we accomplished this by pushing each slider until we started to get background noise in the Rectify window, then scaling it back slightly. Once you’re happy with the sensitivity of your screen, it’s time to calibrate.
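Conceptually, the Rectify stage is just a brightness threshold applied after the earlier filters have removed the background: any pixel brighter than the slider's setting becomes part of a white blob, and everything else goes black. A simplified stand-in (not Touchlib's actual filter code) looks like this:

    // Simplified illustration of a rectify/threshold pass over an 8-bit
    // grayscale frame. Raising the threshold suppresses background noise
    // but also faint touches, which is why you tune each slider to just
    // below the point where noise appears.
    #include <vector>

    std::vector<unsigned char> rectify(const std::vector<unsigned char> &frame,
                                       unsigned char threshold)
    {
        std::vector<unsigned char> out(frame.size());
        for (size_t i = 0; i < frame.size(); ++i)
            out[i] = (frame[i] >= threshold) ? 255 : 0;  // blob vs. background
        return out;
    }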



Calibration is necessary to sync up the projected image and the touch surface. To calibrate, we first pressed the Enter key. This enables full-screen mode and displays a grid of green crosses. To begin calibration, we pressed the ‘c’ key. One of the crosses on the display turns red; by pressing on that cross, Touchlib is able to map that point in projector space to a point in camera space. The next cross then turns red, and this continues until all the crosses have been pressed. We learned the hard way that when you’re done calibrating Touchlib, you must press the Escape key. If you close the program any other way, it won’t update the config file with your changes.
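What calibration buys you is a lookup table: the camera-space position recorded at each cross, paired with that cross's known projector-space position. A touch that lands between crosses gets mapped by interpolating within its grid cell. Touchlib's exact math may differ, but a bilinear blend like the sketch below captures the idea:

    // Bilinear interpolation within one calibration grid cell.
    // p00, p10, p01, p11 are the projector-space positions of the four
    // crosses bounding the touch; u and v (each 0..1) say where the touch
    // falls within that cell in camera space.
    struct Point { float x, y; };

    Point mapToProjector(Point p00, Point p10, Point p01, Point p11,
                         float u, float v)
    {
        Point top    = { p00.x + (p10.x - p00.x) * u,
                         p00.y + (p10.y - p00.y) * u };
        Point bottom = { p01.x + (p11.x - p01.x) * u,
                         p01.y + (p11.y - p01.y) * u };
        Point result = { top.x + (bottom.x - top.x) * v,
                         top.y + (bottom.y - top.y) * v };
        return result;
    }

This is also why moving the camera or projector invalidates the calibration: the recorded camera-space points no longer line up with the crosses.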

Now our surface had been properly configured (at least until we moved the camera or projector, or the lighting conditions changed significantly). To test it out, we ran the smoke.exe app in the Touchlib folder. With everything calibrated properly, colored “smoke” particle effects appeared on our surface everywhere we touched.

However, we didn’t let ourselves celebrate for too long, because many apps written for the multi-touch surface are coded in the ActionScript language used by Adobe Flash. These programs aren’t set up to natively use multi-touch data, so we had to use a software layer to allow the Flash-based apps to work properly. This is a three-step process:

1. Change Flash’s global security settings to allow the Flash apps access to the touch data. Go to this page and click the “Edit locations…” dropdown box, then select “Add Location,” then “Browse for folders.” Browse to the folder containing the Flash programs you want to run, which by default is C:/Multitouch/Clients. This only has to be done once.

2. Run osc.exe from C:/Multitouch/Touchlib. This program takes the touch events Touchlib detects and broadcasts them as OSC data. It has to be run every time you use multi-touch Flash apps.

3. Run FlashOSCv2.jar (requires the Java Runtime Environment) from C:/Multitouch/Clients/flosc, and press the “start” button on the window that pops up. This program simply allows Flash programs to access the OSC data. This also has to be run every time you run multi-touch Flash apps.
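If the Flash apps stay blank, it helps to confirm that touch data is actually arriving before blaming flosc. The sketch below is our own debugging aid, not part of the Touchlib kit: it listens on UDP port 3333, the default TUIO/OSC port, and prints a line per packet. Run it in place of flosc, since only one program can bind the port at a time:

    // Minimal Winsock sketch: listen on UDP 3333 (the default TUIO port)
    // and report incoming OSC packets to verify the touch data flow.
    #include <winsock2.h>
    #include <cstdio>
    #pragma comment(lib, "ws2_32.lib")

    int main()
    {
        WSADATA wsa;
        WSAStartup(MAKEWORD(2, 2), &wsa);

        SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(3333);               // default TUIO port
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        bind(s, (sockaddr *)&addr, sizeof(addr));

        char buf[1536];
        for (;;)
        {
            int len = recvfrom(s, buf, sizeof(buf) - 1, 0, NULL, NULL);
            if (len <= 0)
                break;
            buf[len] = '\0';
            // OSC packets begin with an address pattern like "/tuio/2Dcur",
            // or the literal "#bundle" when messages are bundled together.
            printf("%d bytes: %s\n", len, buf);
        }
        closesocket(s);
        WSACleanup();
        return 0;
    }

If packets appear here but the Flash apps still don't respond, the problem is on the flosc/security-settings side rather than with Touchlib.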


With that done, we were finally ready to try out the whole array of apps that have been written for DIY multi-touch tables. There aren’t a ton of apps available right now, but we were able to find enough to have a good time with the table. Some apps come with Touchlib, in the C:/Multitouch/Clients folder, and we also recommend the AudioTouch apps, available at Seth Sandler's blog, and the Multitouch Media App by Laurence Muller.