WebRTC + Canvas + JavaScript = air-finger web browsing. You can view this demo in almost any modern browser; by default a pre-recorded video demo is shown, but to test the project with your own camera you need a WebRTC-enabled Chrome build with the MediaStream flag enabled in chrome://flags.

For video tracking I have developed a tool called videotracker.js. It is based on color detection and blend difference: every frame is captured from the computer's camera using WebRTC and copied to a Canvas object so it can be processed in JavaScript. In this experiment you can pan Google Maps using an object of the selected color (skin color by default). The workflow is very simple: if WebRTC is enabled, your camera starts recording and you can see yourself in the small preview screen. Then move your finger across the screen to start scrolling the map.
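The capture step described above can be sketched roughly as follows. This is a minimal illustration, not videotracker.js itself: the element lookups and the use of the modern `navigator.mediaDevices.getUserMedia` API are assumptions (the original demo, given the chrome://flags note, likely used the prefixed `webkitGetUserMedia`).

```javascript
// RGBA pixel offset into the flat ImageData array (4 bytes per pixel).
function frameIndex(x, y, width) {
  return (y * width + x) * 4;
}

// Browser-only glue; guarded so the pure helper above is usable anywhere.
if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  const video = document.querySelector("video");   // assumed elements
  const canvas = document.querySelector("canvas");
  const ctx = canvas.getContext("2d");

  navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
    video.srcObject = stream;
    video.play();
    (function loop() {
      // Copy the current camera frame onto the canvas...
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      // ...then read the raw pixels back for processing in JavaScript.
      const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
      // e.g. red channel of pixel (x, y): frame.data[frameIndex(x, y, canvas.width)]
      requestAnimationFrame(loop);
    })();
  });
}
```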

The scroll speed is proportional to the distance of the finger from the center of the small preview screen. To detect your finger, a skin-color range has been defined, but if the tracker does not work well with your finger you can choose another color to track; for example, I made a small cover for my finger from a red gum wrapper, and it works quite well.
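The distance-to-speed mapping could look something like this (a sketch; the function name and the linear scaling are my assumptions, not the library's actual code):

```javascript
// Map a tracked finger position to a scroll velocity: zero at the center
// of the preview, growing linearly up to maxSpeed at the edges.
function scrollSpeed(x, y, width, height, maxSpeed) {
  const dx = ((x - width / 2) / (width / 2)) * maxSpeed;
  const dy = ((y - height / 2) / (height / 2)) * maxSpeed;
  return { dx, dy };
}

// A finger dead center produces no movement; a finger at a corner
// scrolls at full speed in both axes.
scrollSpeed(160, 120, 320, 240, 10); // { dx: 0, dy: 0 }
scrollSpeed(320, 240, 320, 240, 10); // { dx: 10, dy: 10 }
```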

The color-detection algorithm works in the HSV color space to extract areas of the selected color from the picture, combined with a difference blend mode to keep only the objects that are moving; this combination makes real-time results feasible.
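The two tests can be sketched as below: a standard RGB-to-HSV conversion, a hue/saturation window, and a per-channel frame difference. The specific skin-color thresholds and the function names here are illustrative assumptions; the library's actual range is configurable.

```javascript
// Standard RGB (0-255) to HSV conversion: h in [0, 360), s and v in [0, 1].
function rgbToHsv(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  const d = max - min;
  let h = 0;
  if (d !== 0) {
    if (max === r) h = ((g - b) / d) % 6;
    else if (max === g) h = (b - r) / d + 2;
    else h = (r - g) / d + 4;
    h *= 60;
    if (h < 0) h += 360;
  }
  return { h, s: max === 0 ? 0 : d / max, v: max };
}

// Assumed skin-tone window: low hues, moderate saturation, not too dark.
function inSkinRange(hsv) {
  return hsv.h <= 50 && hsv.s >= 0.2 && hsv.s <= 0.7 && hsv.v >= 0.3;
}

// Difference-blend motion cue: did this byte change enough between frames?
function moved(curr, prev, i, threshold) {
  return Math.abs(curr[i] - prev[i]) > threshold;
}
```

A pixel is kept only when both tests pass (right color *and* moving), which is what lets a stationary skin-colored background drop out of the mask.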

Although this method is not very stable, it is an efficient way to detect objects with a browser's JavaScript engine. The surrounding environment has a significant influence on the accuracy of the results, but with a carefully configured color range and a controlled environment it can work well for object tracking.

Just as with Google Maps, videotracker.js can be used to control any DOM component: textareas, sliders, video players, and so on.
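For example, the tracked position could drive a range slider. This is a hypothetical sketch: the `tracker.onMove` callback name is an assumption about the API, and only the position-to-value mapping is shown in full.

```javascript
// Map a horizontal finger position (0..width) onto a slider's value range,
// clamping positions that fall outside the preview.
function positionToSliderValue(x, width, min, max) {
  const t = Math.min(Math.max(x / width, 0), 1);
  return min + t * (max - min);
}

// Browser-only wiring; tracker.onMove is a hypothetical callback.
if (typeof document !== "undefined") {
  const slider = document.querySelector("input[type=range]");
  // tracker.onMove((x, y) => {
  //   slider.value = positionToSliderValue(x, 320, +slider.min, +slider.max);
  // });
}
```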