April 21, 2010

Augmented Reality on iPhone: Marker Tracking Demo

I've been working on an augmented reality game app for iPhone recently. The player will be able to interact with the game by placing and moving specially designed marker cards in front of the iPhone's camera. The app analyzes video frames in real time and updates the scene accordingly. Here is a demo of what I've got so far - marker tracking on an iPhone 3GS.


The marker tracking is handled by a slightly modified version of NyARToolkitCPP. Analyzing each frame takes under 55 ms on average, and the OpenGL ES rendering achieves 30 frames per second. The current algorithm relies on thresholding, so it is very sensitive to changing lighting conditions. Also, template-matching-based identification requires matching each detected marker against the whole pattern library. I'm looking for better algorithms that use edge detection and direct ID recognition. StbTracker, the successor of ARToolkitPlus, looks ideal for devices such as the iPhone, but unfortunately it's not open source.

The most discussed topic among iPhone AR application developers today is real-time video frame grabbing. There isn't an elegant way to do it yet. For now, I'm using private APIs to repeatedly fetch the camera preview view. The good news is that we are going to have full access to still and video camera data in iPhone OS 4; a peek at the related APIs in the iPhone SDK 4 beta 2 further confirms that.

The OpenGL ES overlay content is displayed in a second UIWindow above the one containing the camera preview. The only point of doing so is to get a clean preview image without the overlay content. These ugly hacks will all be gone once we have iPhone OS 4.