Introduction

During my recent agile sprints, I completed the minimum requirements for two key user stories. Firstly, a player can sign into the app using Facebook, which makes sign-in quick and easy provided they have a Facebook account. Because authentication is handled through Firebase, I can add further social sign-in providers in the future while keeping all user data and identity centralised in the Firebase user database. Secondly, I have integrated Vuforia into my project and set up two distinct methods for adding virtual lovelocks to the world. In this article I will demonstrate the app's functionality so far and discuss the challenges associated with the next stage of development.
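For context, the sketch below shows roughly how the Facebook sign-in hands off to Firebase in Unity. It assumes the Facebook SDK for Unity and the Firebase Auth SDK are installed and that FB.Init() has already completed; the class and method names are illustrative rather than lifted directly from my project.

```csharp
using Facebook.Unity;
using Firebase.Auth;
using UnityEngine;

public class FacebookSignIn : MonoBehaviour
{
    // Illustrative entry point, e.g. wired to the SignIn scene's button.
    public void SignInWithFacebook()
    {
        // Ask the Facebook SDK for a login with basic read permissions.
        FB.LogInWithReadPermissions(new[] { "public_profile", "email" }, result =>
        {
            if (!FB.IsLoggedIn) return;

            // Exchange the Facebook access token for a Firebase credential.
            string token = AccessToken.CurrentAccessToken.TokenString;
            Credential credential = FacebookAuthProvider.GetCredential(token);

            FirebaseAuth.DefaultInstance.SignInWithCredentialAsync(credential)
                .ContinueWith(task =>
                {
                    if (!task.IsFaulted && !task.IsCanceled)
                    {
                        // The Firebase user ID becomes the centralised identity,
                        // regardless of which social provider was used to sign in.
                        Debug.Log("Signed in as " + task.Result.UserId);
                    }
                });
        });
    }
}
```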

User Interface (UI)

The app is currently set out over several scenes: a SignIn scene, a Home (main menu) scene and a scene for each of the app's augmented reality modes – surface plane, to place a lovelock on a surface, and air position, to place a lovelock at a point in the air. Currently the UI is barebones: a few buttons to help navigate between scenes and sign out. I have configured the app, through Unity, to have a responsive layout. I start with a base resolution – I am currently testing on a Google Pixel 3, so I have set the reference resolution to 1080×2160 to match its screen. The buttons and other UI elements are then positioned within the layout using anchoring, so if I anchor a button to the top middle of the layout it will always remain in that position, regardless of the size of the actual device. I find it important to implement responsive design earlier rather than later in the prototype build, as the UI will become more complex with conditional buttons and modals appearing in response to user input. Different resolutions can be simulated within Unity by simply changing the Game view size.
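As a rough illustration, the snippet below mirrors the Canvas Scaler and anchoring settings described above, done from code rather than the Inspector (where I actually configure them). The "SignOutButton" reference and the offsets are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ResponsiveUiSetup : MonoBehaviour
{
    // Attached to the Canvas. Mirrors the Inspector settings described above.
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1080, 2160); // Pixel 3, portrait
        scaler.matchWidthOrHeight = 0.5f; // blend width and height scaling

        // Anchor a button to the top middle of the canvas so it stays there
        // on any screen size. "SignOutButton" is an illustrative child object.
        var signOutButton = transform.Find("SignOutButton") as RectTransform;
        if (signOutButton != null)
        {
            signOutButton.anchorMin = new Vector2(0.5f, 1f);
            signOutButton.anchorMax = new Vector2(0.5f, 1f);
            signOutButton.pivot = new Vector2(0.5f, 1f);
            signOutButton.anchoredPosition = new Vector2(0f, -40f); // just below the top edge
        }
    }
}
```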

UI Positioning within Unity
Setting UI Scale Mode and Reference Resolution in Unity

Demo

Here is my first demo. This early development version of the app includes Facebook/Firebase sign-in and authentication, basic navigation, and scenes configured for Vuforia's air and plane positioning modes. The AR mode includes the placement of a 3D lock with an animated particle stream.

Demo of the app's early stages
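The surface placement shown in the demo follows Vuforia's standard Ground Plane workflow: a PlaneFinderBehaviour raises a hit-test event when the player taps a detected surface, and a ContentPositioningBehaviour drops the lock prefab (with its particle stream) at that point. The sketch below is a minimal version of that wiring; the field and class names are my own illustration.

```csharp
using UnityEngine;
using Vuforia;

public class LovelockPlacer : MonoBehaviour
{
    // The ContentPositioningBehaviour in the scene; its Anchor Stage holds
    // the lovelock prefab with its animated particle stream.
    public ContentPositioningBehaviour contentPositioning;

    // Hooked up to the PlaneFinderBehaviour's OnInteractiveHitTest event in
    // the Inspector, so a tap on a detected surface places the lock there.
    public void OnInteractiveHitTest(HitTestResult result)
    {
        if (result == null)
        {
            Debug.Log("No surface detected at the tap position.");
            return;
        }
        contentPositioning.PositionContentAtPlaneAnchor(result);
    }
}
```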

Challenges

The most significant challenge I face in progressing to the next stage of the app is how to persist the virtual lovelocks in their positions between sessions. Currently the lovelock only remains in place while the AR mode is selected; once the app is closed the lock vanishes.

Within the current version of the app, I am logging the device's geo coordinates in latitude/longitude format. In my next sprint I shall save this data to the cloud database (Firebase) along with other details about the lovelock, such as the user's unique ID and a timestamp. With the geolocation data I can then query the database based on a player's current position and request, say, the closest 100 objects to that location. However, placing those objects correctly on the player's screen, so that they reappear in the same real-world position where they were originally placed, presents a significant challenge.
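A minimal sketch of what that write might look like with the Firebase Unity SDK is below. The LovelockRecord fields and the "lovelocks" database node are my own illustrative schema rather than a finalised design, and it assumes location services have been started and the user is already signed in.

```csharp
using Firebase.Auth;
using Firebase.Database;
using UnityEngine;

// Illustrative record of a placed lovelock; not a finalised schema.
[System.Serializable]
public class LovelockRecord
{
    public string userId;
    public double latitude;
    public double longitude;
    public long timestamp;
}

public class LovelockSaver : MonoBehaviour
{
    // Assumes Input.location has been started (with permission granted)
    // and the player is signed in through Firebase Auth.
    public void SaveCurrentLovelock()
    {
        var record = new LovelockRecord
        {
            userId = FirebaseAuth.DefaultInstance.CurrentUser.UserId,
            latitude = Input.location.lastData.latitude,
            longitude = Input.location.lastData.longitude,
            timestamp = System.DateTimeOffset.UtcNow.ToUnixTimeMilliseconds()
        };

        // Push() creates a unique key for each lovelock under the "lovelocks" node.
        var reference = FirebaseDatabase.DefaultInstance.RootReference
            .Child("lovelocks").Push();
        reference.SetRawJsonValueAsync(JsonUtility.ToJson(record));
    }
}
```

Worth noting: the Realtime Database does not offer radius queries out of the box, so the "closest 100" lookup would likely need some form of geohash-style index built on top of the stored coordinates.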

The position of an augmented reality object is calculated using three sets of data: the three-dimensional position on the screen, the rotation of the object itself, and the scale of the object depending on how near or far it is from the camera – together this is also called an anchor. This information can be used to place an existing virtual lovelock on a user's screen, overlaying the device's camera view of the real world. However, it will only be accurate if and when the device's camera is in exactly the same position as it was when the lovelock was initially created and placed. Internally, Vuforia uses its own spatial coordinate system and extended device tracking to create a dataset which it calls pose information; however, I have been unsuccessful in finding a way to persist this pose information over multiple sessions.

One potential solution I did find is Google's ARCore Cloud Anchors. Google offers this service so that multiple users can interact with a shared virtual object in AR at the same time. However, it is designed for a somewhat different use case than my app's requirements. Cloud Anchors is a great feature for an app in which multiple players should see and interact with shared objects in private groups, but Google only stores the anchor points for 24 hours, meaning it is not a suitable solution for my use case, which requires long-term persistence.
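To make the missing piece concrete, the sketch below shows roughly the anchor data I would have to capture per lovelock if I rolled my own persistence. It is an illustration of the data involved, not Vuforia's internal pose format, which as far as I can tell is not exposed for saving and reloading.

```csharp
using UnityEngine;

// A rough sketch of the anchor data a lovelock would need if persisted manually.
// Replaying these values only lines up with the real world if the camera starts
// from the same position and orientation as in the original session.
[System.Serializable]
public class LovelockAnchorData
{
    public Vector3 position;    // position of the lock in the AR scene's world space
    public Quaternion rotation; // orientation of the lock
    public Vector3 scale;       // apparent size relative to camera distance

    public static LovelockAnchorData FromTransform(Transform lockTransform)
    {
        return new LovelockAnchorData
        {
            position = lockTransform.position,
            rotation = lockTransform.rotation,
            scale = lockTransform.localScale
        };
    }
}
```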

Augmented reality is still an emerging technology, and the commonly available features, such as those found within the Vuforia Engine, are developed for quite specific, common usage scenarios, whereas my app requires a bespoke solution. I am confident that what I am aiming to do is achievable; however, I must now either consider building customised functionality to fulfil my user story of placing virtual lovelocks in the real world which remain in place, or consider modifying my user story so that the technical requirements are less demanding.

Building a custom solution to address this challenge will be time-consuming, but it would add greater value to my app. I must also consider that a suitable solution may become available in the future. In the meantime I may need a short-term solution which satisfies at least a large part of my requirements without significantly modifying the user story.

Next Steps

If I am unable to find a suitable technical solution for persisting and showing existing virtual lovelocks through an AR engine such as Vuforia, I have an alternative approach. I will store the geographical coordinates of the device at the moment a virtual lovelock is added. I will also take a snapshot of the camera view at that moment and store the position, scale and rotation of the lovelock within the scene. With this information I will be able to recreate the exact augmented reality image from the time the lovelock was added and show this snapshot augmented with an animated virtual lovelock. This means that a user will still create their lovelocks in AR mode; however, in future sessions they will only be able to see them as an augmented snapshot. This will allow me to satisfy a large part of my user story without the need for complex, customised development of new AR functionality. When a suitable technological solution for persisting AR objects over multiple sessions becomes available within an existing AR engine like Vuforia, I will be able to assess and integrate it into the app as a new feature.
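A rough sketch of the snapshot capture is below, assuming Unity's ScreenCapture API and reusing the illustrative LovelockAnchorData class from earlier. Uploading the image and metadata to Firebase (e.g. Storage for the image, the database for the anchor data) is left out of the sketch.

```csharp
using System.Collections;
using UnityEngine;

public class LovelockSnapshot : MonoBehaviour
{
    // Captures the current camera view as a texture when a lovelock is placed,
    // alongside the lock's transform, so the augmented image can be rebuilt
    // later without live AR tracking.
    public IEnumerator CaptureSnapshot(Transform lockTransform)
    {
        // Wait until rendering has finished so the capture includes the AR view.
        // In practice the lock and UI could be hidden just before capture so only
        // the camera background is stored, leaving the animated lock to be
        // overlaid later from the stored anchor data.
        yield return new WaitForEndOfFrame();

        Texture2D snapshot = ScreenCapture.CaptureScreenshotAsTexture();
        byte[] png = snapshot.EncodeToPNG();

        // Record where the lock sat in the scene at the moment of capture.
        LovelockAnchorData anchor = LovelockAnchorData.FromTransform(lockTransform);

        // The PNG bytes and anchor data would then be uploaded to Firebase.
        Debug.Log("Captured snapshot of " + png.Length + " bytes at " + anchor.position);
        Object.Destroy(snapshot); // avoid leaking the temporary texture
    }
}
```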
