Our past Augmented Reality projects (such as Gilt City) have made use of the location-based features of Layar. Your phone identifies its location via GPS; we provide the imagery and information about nearby points; and the Layar app displays them overlaid on the real world.
The faded items are additional steps in the process: Layar provides the directory listing through which people find your content, and the phone's internal compass is used to determine your orientation.
There have been a number of successful projects, but their designs have always needed to accept a degree of inaccuracy in placing imagery, which varies depending on environmental conditions.
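On the server side, "points nearby" reduces to a great-circle distance check against the user's reported position. A minimal sketch of that step (the POI data and helper names here are hypothetical, not Layar's or our actual code):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def points_nearby(user_lat, user_lon, points, radius_m=500):
    """Return the points within radius_m of the user, nearest first."""
    scored = [(haversine_m(user_lat, user_lon, p["lat"], p["lon"]), p)
              for p in points]
    return [p for d, p in sorted(scored, key=lambda t: t[0]) if d <= radius_m]
```

The compass orientation mentioned above is then handled on the phone itself: the app only needs these positions to decide what falls within the camera's current view.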
Image Tracking AR
An alternative to location-based AR is, rather than using GPS and orientation sensors, to look for recognisable features in the video stream and use those as reference points. Until relatively recently, limitations in computing power restricted the features that could be recognised to bold, high-contrast images. These "markers" are still some of the most recognisable aspects of AR.
Now it is possible to use "marker-less", or "natural-feature", technologies, such as that shown in the image below. Your chosen images are processed in advance and a mathematical description of their features generated. For augmented reality, each frame (on the order of 20 per second) is individually scanned for features, which are then compared against the prepared library.
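In outline, that comparison usually means matching compact binary descriptors (the "mathematical description") by Hamming distance, frame features against each library image's features. A toy sketch with made-up descriptor values, standing in for what real feature-matching libraries do each frame:

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def best_match(frame_descs, library, max_dist=10):
    """Return the name of the library image whose descriptors best fit
    the frame's descriptors, or None if nothing is close enough."""
    best_name, best_score = None, float("inf")
    for name, descs in library.items():
        # score: mean distance from each frame descriptor to its
        # nearest neighbour among this image's descriptors
        total = sum(min(hamming(f, d) for d in descs) for f in frame_descs)
        score = total / len(frame_descs)
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score <= max_dist else None
```

Real systems use hundreds of multi-byte descriptors per image and faster index structures, but the shape of the problem is the same: a nearest-neighbour search with a rejection threshold, so that frames containing none of your images match nothing.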
Layar's Natural-Feature Tracking
Layar recently added natural-feature tracking to their mobile app, and it is this technology that is used in the Joseph Wright AR project. The starting point is the combined painting/ceramic image, as it is intended to appear on the mobile device. It is then split: the image of the painting is sent to Layar for feature detection, and the augmentation image is stored on our own servers.
On the mobile device the camera feed is scanned for matching patterns, and if one is detected the original image is rebuilt using the augmentation fetched from the server.
Using that structure, the sample below shows the minimum required to pair an augmentation with a reference image. Essentially this is returned to the Layar app in the same way that a webpage is returned to a browser.