New Augmented Reality

If you search for augmented reality on Google, you will find many references to home-grown developers making various innovations. Is it worthwhile for companies to make the jump into this technology?

While companies and marketing agencies are still experimenting with and evaluating how this technology can bring a new level of branding experience, Acrossair is already at the forefront of developing augmented reality apps for marketing. Their apps have caught the attention of many media outlets (Macworld, BBC, BusinessWeek).


I had an opportunity to ask Dana Farbo, president of Acrossair/Imano, a few questions about the future of augmented reality and how his agency is taking advantage of it. This is part one of a two-part series.

Q: Tell me something about acrossair.

Dana Farbo: Acrossair is a mobile application development company currently focused on iPhone apps. We are part of the Imano global group, which also contains Imano, a digital creative agency; CommerceNow, an eCommerce SaaS platform; and a digital publishing group. Our primary development is in mobile augmented reality, and we develop our own stand-alone apps as well as apps for major brands.

Q: Can you list a few Augmented Reality projects acrossair created?

Dana Farbo:  Mass transit finders such as the New York Subway, Nearest Tube and others for cities around the world including Tokyo, Madrid, Paris, San Francisco and more represent some of our early work and are very useful apps. We also have NearestWiki for finding local points of interest, TweetsAR, Nearest Places and we helped create the popular WorkSnug application. We also created the Stella Artois Le Bar Guide and have included it in our recently launched acrossair browser.


Q: What is the proudest AR project acrossair has launched to date?

Dana Farbo: The acrossair augmented reality browser is the app I am most proud of to date. This is a phenomenal leap forward in location-based search using AR. Features such as the Car Finder and My Places make it a daily go-to app for me. The Google search function alone is worth it, as it allows you to cut to the chase of finding what you are looking for, in a visual sense, faster than anything I have used. In addition to its use on the 3GS with the compass, we also made it compatible with the 3G phone so that more people would get a chance to use the AR functions. And we allowed horizontal scrolling so that you don’t have to spin yourself in a circle to see what is around you. There are really too many functions to list, and I find it more useful every day.

Q: How did your team come up with the idea of using AR for metro systems?

Dana Farbo: It was really natural, as London is our home office and we are here in New York. We use the Tube and subway systems a lot, and these apps allowed us to easily find what we needed. Plus, the data feed was readily available and not subject to constant change, so we could put it in a closed app with minimal upkeep. This also allowed us to work on the usability and the way we portrayed the content. I think they turned out pretty well and should be around for a long time.


Q: Where do you see the future of augmented reality?

Dana Farbo: Augmented reality is going to be a part of all of our daily lives at some point within the next decade. We are such a mobile world these days, and we want our coffee and our information immediately. There are incredible opportunities in assisted learning, where we can be in a place or doing something and receive feeds and data overlays that enhance the environment. For ISMAR this past year, the team created a spec app that portrayed 3D dinosaurs to represent a visual account of what type of dinosaur inhabited the area where you were physically standing. Plus, we could segment by period, so I could see what was here 100 million years ago, 250 million years ago, etc. Very cool! [Author inserted: Check out the response from ISMAR on Acrossair’s presentation]

There is also a huge social component that will evolve as well and this will probably further lead us into protecting our identities but getting closer to those in our inner circle. I will know where my linked friends are and I will have categories of access to who can see what I am doing. But we will also have the ability to transmit my friends into my space virtually so that they may help me with a task or have a chat. Looking through another’s eyes takes on a whole new meaning.

These are just a few examples. There are so many smart people working in the area now, and in the future everyone will use one form of AR or another, so it will evolve like any embedded element of our lives.

Interview with Developer of Sun Seeker: 3D Augmented Reality Viewer

Sun Seeker: 3D Augmented Reality Viewer is a hot application currently available in the App Store, and we were lucky enough to snag an interview with the developer. First, let’s discuss our hands-on time with the application.

The app’s description says that it ‘Provides both a flat view compass and an augmented reality camera 3-D VIEW showing the solar path, its hour intervals, its winter and summer solstice paths, rise and set times and more.’

When you first launch the app you are presented with a top-down compass view showing the sun’s path throughout the hours of the day. At a glance you can see sunrise and sunset, length of day and night, and where in the sky the sun will be. You can also choose to view the sun’s information for any date you select.

Pressing the camera button brings up the augmented reality mode. This starts the phone’s camera and nicely overlays the sun’s path over a ‘grid-like’ pattern. In addition to the sun’s path you can also see the elevation, azimuth, sun’s position throughout the day, and the direction of the sun’s current position.
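The sun-path overlay is driven by standard positional astronomy rather than anything AR-specific. As an illustration (a generic sketch, not Sun Seeker’s actual code), the well-known low-precision algorithm computes the sun’s elevation and azimuth from a location and a UTC time, good to roughly a degree:

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth in degrees
    (low-precision algorithm, accurate to about a degree)."""
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC)
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    d = (when_utc - j2000).total_seconds() / 86400.0
    # Mean longitude and mean anomaly of the Sun (degrees)
    L = (280.460 + 0.9856474 * d) % 360
    g = math.radians((357.528 + 0.9856003 * d) % 360)
    # Ecliptic longitude of the Sun and obliquity of the ecliptic
    lam = math.radians(L + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g))
    eps = math.radians(23.439 - 0.0000004 * d)
    # Right ascension and declination
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    dec = math.asin(math.sin(eps) * math.sin(lam))
    # Local hour angle via Greenwich mean sidereal time
    gmst = (18.697374558 + 24.06570982441908 * d) % 24
    ha = math.radians((gmst * 15 + lon_deg) % 360) - ra
    lat = math.radians(lat_deg)
    # Convert to horizontal coordinates
    elev = math.asin(math.sin(lat) * math.sin(dec) +
                     math.cos(lat) * math.cos(dec) * math.cos(ha))
    az = math.atan2(-math.sin(ha),
                    math.cos(lat) * math.tan(dec) - math.sin(lat) * math.cos(ha))
    return math.degrees(elev), math.degrees(az) % 360
```

Evaluating this at, say, London around noon on the summer solstice gives an elevation near 62° and an azimuth near due south, which is what the app’s grid overlay is effectively drawing for every hour of the day.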

We really like this application and feel it offers all of the solar information you would need in a beautiful package.

App Store Link: Sun Seeker: 3D Augmented Reality Viewer $2.99


Now for our interview with the developer of Sun Seeker, Graham Dawson.

1) How did you come to your decision of developing an ‘augmented reality’ application?

I started developing for the iPhone when the 3G device first came out, and my first app was a weather app, a subject I had studied at university. That app proved very successful, and I then decided to build a portfolio of apps focusing on enhancing user awareness in various ways. Later, at around the time that Apple released the 3GS device (the first device with a built-in digital compass), I was looking to buy an apartment. One of my concerns was that there should be good light available, although as a property viewer you don’t get to see where the sunlight comes in at different times of day or year. As I have some background in astronomy, I immediately realised that it would be possible to create an app for the 3GS which showed me exactly the information I wanted – hence the Sun Seeker app was born. So in a sense it was serendipitous – although I am sure that had it not been this particular app, I would likely have found some other augmented reality concept to pursue.


2) What is your opinion of the capability of the iPhone’s hardware in terms of handling augmented reality applications?


Although the 3GS sensors form an exciting first step for AR, the accuracy limitations of various components are far from ideal. In particular, the GPS positional accuracy is not always adequate even to decide whether a given point of interest is in front of you or behind you, and of course GPS may not be available at all in indoor locations. Fortunately this particular limitation is irrelevant to the Sun Seeker app, although it is very relevant to the various geographical POI-related augmented reality apps and platforms.
The other main inaccuracy is in the digital compass heading, and this one does affect Sun Seeker. This is typically reported by the device as being accurate only to within +/- 25 degrees, although in practice (in non-magnetically-polluted spaces) it is often good to within several degrees, in which case it is near enough for almost all practical applications.
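One common way to tame a noisy heading (a generic sketch, not necessarily what Sun Seeker does) is to average several recent readings. A plain arithmetic mean fails across the 359°/1° wrap-around, so the usual trick is a circular mean over unit vectors:

```python
import math

def smooth_heading(readings_deg):
    """Average noisy compass headings (degrees, 0-360).
    Averaging the unit vectors rather than the raw angles
    handles the wrap-around at north correctly."""
    x = sum(math.cos(math.radians(h)) for h in readings_deg)
    y = sum(math.sin(math.radians(h)) for h in readings_deg)
    return math.degrees(math.atan2(y, x)) % 360
```

For example, `smooth_heading([359, 1, 3])` comes out close to 1°, whereas a naive mean of those numbers would wrongly point the display south at 121°.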
The other major sensor component involved is the tri-axis accelerometer – but that is generally of good accuracy, and can be further calibrated for even better results, so this one needs little further future refinement.
Although only required for certain types of AR applications, a key missing factor in AR on the iPhone is the ability to do real-time video camera analysis. It would theoretically be possible to use video analysis to supplement GPS and positional data (e.g. by recognising nearby buildings, landmarks or even people) and thereby sometimes get a much better idea of your exact position.


3) Are you optimistic about the future of augmented reality on mobile devices and can you predict any possible future applications for this technology?

I’m very optimistic here, at least in the longer term. I suspect that the most exciting new developments will come through video analysis, i.e. using image recognition and overlaying the images with enhanced information about what you are seeing. This could take us far beyond the current types of AR apps, which are largely restricted to presenting information about static, non-realtime POIs. An obvious application here is facial recognition, allowing you to see additional data about people nearby.
I also suspect that we will eventually start to see dedicated AR devices which will allow us to see AR information without having to look at our mobiles – for example in eyeglasses or contact lenses. In that case perhaps no-one need even know that you are using it, and it could be ubiquitous and fully accepted that people would use it. Perhaps one of Apple’s future models will be the “eyePhone”?!

4) Do you plan to develop another augmented reality application?

Yes. I am working on another idea which I don’t wish to disclose yet, and it will require some trialling to determine whether or not it works well enough in practical situations. It’s certainly an exciting area to work in, although a little risky in terms of the time and effort spent prototyping your ideas in an area which is still quite novel and “bleeding edge”.

We thank Graham for his insightful answers and look forward to his future projects.

LocFinder App

LocFinder is an augmented reality app that offers all of the navigation features you will need, and we’re thankful that developer Thomas Seifert took some time to answer our questions regarding augmented reality technology on the iPhone. First, let’s take a look at LocFinder.

The app’s home screen shows you a compass arrow pointing to your current destination, a rotating globe with a pin at your destination, and other information including the latitude and longitude, and distance. You can choose a destination from your own ‘personal list’ or from the list of famous places all around the world. Pressing on the compass engages augmented reality mode. In AR mode you see a really cool compass that seems to encircle you and your destination is highlighted in red. You can also bring up an arrow to point the way.
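Pointing an arrow at a distant destination reduces to the standard initial great-circle bearing between two coordinates. A minimal sketch (an illustration of the underlying formula, not LocFinder’s actual code):

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360
```

To draw the on-screen arrow, an app would rotate it by this bearing minus the device’s current compass heading, so the arrow keeps pointing at the destination as you turn.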

Other features from the description:

- offers path recording and following (forwards and backwards)
- allows listening to your music while recording or following a path
- shows you the path metrics: altitude profile, average speed, elapsed time, total distance and altitude difference
- works with Google Earth – you can include and exclude paths or locations to/from Google Earth
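The path metrics above can all be derived from a sequence of timestamped GPS fixes. A sketch using the haversine formula (the data layout here is an assumption for illustration, not LocFinder’s implementation):

```python
import math

def path_metrics(track):
    """Summarise a recorded GPS path.
    track: list of (lat_deg, lon_deg, altitude_m, unix_time_s) fixes."""
    R = 6371000.0  # mean Earth radius, metres

    def haversine(p, q):
        # Great-circle distance between two fixes, metres
        phi1, phi2 = math.radians(p[0]), math.radians(q[0])
        dphi = phi2 - phi1
        dlam = math.radians(q[1] - p[1])
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    total = sum(haversine(track[i], track[i + 1])
                for i in range(len(track) - 1))
    elapsed = track[-1][3] - track[0][3]
    return {
        "total_distance_m": total,
        "elapsed_s": elapsed,
        "avg_speed_ms": total / elapsed if elapsed else 0.0,
        "altitude_diff_m": track[-1][2] - track[0][2],
    }
```

An altitude profile is then just the sequence of altitude values plotted against cumulative distance, which is presumably how the app draws it.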

We found that this app not only worked great, but looked great too. There is a high level of polish on everything from the home screen to augmented reality mode. Selecting a new destination is easy, and you always have one-click access to more information about a position via Wikipedia. LocFinder is currently $1.99 in the App Store, and there is a free lite version for you to try.

What’s AR?

In a nutshell Augmented Reality (AR) is adding information to what you see and, one assumes, eventually to what you hear, touch and taste.

By far the best-known AR application is the HUD, or Head-Up Display, which has been in use since the Second World War. Initially it was only an optical replacement for gun sights, so the pilot’s peripheral view was not obscured by a huge metal ring with cross hairs in the middle, and his head stayed up with his eyesight focussed at a distance – on the target. However, as radar and targeting systems were added, these displays became increasingly sophisticated, adding target acquisition markers (range, bearing and relative height) for attack, plus missile lock warnings and status indicators etc. for defence and system warnings.

Now this technology has percolated down to the street, allowing us to use a hand-held device to locate the nearest tube station even if it isn’t visible through crowds or buildings, but that is only part of the story. We have the following broad capabilities:

Instrumentation is much as I described for the early HUD displays. It simply makes visible in your field of vision something which would otherwise divert your attention from the view ahead. Many cars are now fitted with a HUD, and there are several GPS speedometer apps for the iPhone which will be reviewed elsewhere on AugmentedReality-iPhone. These usually allow you to mirror the display so that, when the device is placed on the dashboard, your speed, direction and even navigational information are reflected on the car’s windscreen the right way up.

Augmentation applications add to or change the appearance of reality.  This can be anything from adding a little alien to a photograph taken with your iPhone (see Magicam), to seeing how you will look in a new piece of clothing, to overlaying the path of a ball or puck onto a television broadcast in real time.  There are other exciting possibilities here, like expanding human vision.  Mercedes have a working infrared system on the S-Class which currently displays in a monitor near the instruments – it doesn’t take a big leap to see that if they can track eye and head movement accurately they could expand that image to apparently fill the whole screen, blending with reality to add detail.  There is an interesting crossover here into Virtual Reality (VR), where everything you see is synthesised and the real environment is hidden – this would, it seems, be simpler to achieve but more hazardous in the event of an inevitable hardware or software problem!

Object Recognition in real time.  This involves the system seeing real-world objects, identifying them and adding information to them.  This is the least exploited area at the time of writing.  The initial developments are likely to be barcode-like markers designed to stand out from background clutter.  These will be used as triggers for additional information about the object to which they are attached.  Eventually, as the sophistication of computer vision systems increases, these techniques will no longer be necessary, and if you want the Terminator-like ability to see people’s clothes sizes you could have it…

Interaction.  Another area still in its infancy.  Imagine your iPhone interacting with your AR glasses to project an AR keyboard in the air and allow you to actually type on it to send a text message reply… and you thought using a Bluetooth headset made you look weird!  Current interactions on hand-held devices are limited to you interacting with the virtual items on the screen – touching them to activate additional information or, in the case of games, to shoot at the virtual targets on the screen.
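All of the hand-held overlays described above share one core calculation: mapping the compass bearing of a virtual item to a horizontal screen position, given the device’s current heading and the camera’s field of view. A simplified sketch (a linear mapping, which is an adequate approximation for narrow fields of view; the parameter values are illustrative, not taken from any particular app):

```python
def overlay_x(poi_bearing_deg, device_heading_deg, fov_deg=60.0, screen_w=320):
    """Horizontal screen position (pixels) at which to draw a point
    of interest, or None if it is outside the camera's field of view."""
    # Signed angle between the POI and the screen centre, in [-180, 180)
    delta = (poi_bearing_deg - device_heading_deg + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None  # off screen; an app might draw an edge arrow instead
    # Map the angular offset linearly across the screen width
    return screen_w / 2 + (delta / fov_deg) * screen_w
```

A POI dead ahead lands at the centre of the screen, one 15° to the right lands three-quarters of the way across, and one behind you returns None, which is why apps like the acrossair browser add horizontal scrolling so you need not physically turn to see it.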