An Embedded Approach to AR App Design

Last summer, I was working on a prototype for an AR app that would allow users to create virtual objects in real world spaces. I carried out the work using the Unity3D game engine. My last commit to the repository was five months ago.

Since those last commits I’ve been busy working on a few Flutter projects (see previous posts). I now have a new opportunity to return to the AR project, so I have been reflecting on how I might combine these two technologies, Unity and Flutter.

Why? Good question!


I previously wrote about issues with building consistent mobile UI within Unity. The Unity team is working on a UI asset called UIWidgets which looks very promising, but it is still very much under development and not yet properly documented. The maintainers freely admit the asset is more or less derived from Flutter. So, in terms of UI, it would make sense for me to use Flutter, which is extremely well documented and includes pixel-perfect, ready-made sets of both Material and Cupertino UI widgets.


Flutter is built to be fast. According to Flutter’s homepage, “Flutter code is compiled to native ARM machine code using Dart’s native compilers. Thus Flutter gives you full native performance on both iOS and Android.” Performance advantages of Flutter over Unity include generally smaller app sizes and faster app loading.

Tools For Jobs

I believe that, where possible, it’s best to use the appropriate tool for the job. Unity is essentially a game engine, and it does a great job of allowing developers to build 2D, 3D and XR games and experiences. However, I want to build a mobile app with a conventional user interface that also contains a game-like component – the 3D AR experience.


I will eventually want my UI available in a number of international languages. Within Unity, this is possible using paid third-party assets. I have not tested this yet, but I wouldn’t expect it to be as straightforward as with Flutter, which provides built-in internationalisation support through the flutter_localizations package and Dart’s intl library.
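As a rough sketch of why I expect this to be straightforward – this is just the standard boilerplate from Flutter’s internationalisation docs, and the two locales are only examples, not my app’s actual language list – wiring up the built-in localisation delegates looks something like this:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_localizations/flutter_localizations.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      // Delegates that supply translated strings for the built-in
      // Material, Cupertino and general widgets.
      localizationsDelegates: [
        GlobalMaterialLocalizations.delegate,
        GlobalWidgetsLocalizations.delegate,
        GlobalCupertinoLocalizations.delegate,
      ],
      // Example locales only – the real app would list its own.
      supportedLocales: [
        Locale('en'), // English
        Locale('de'), // German
      ],
      home: Scaffold(body: Center(child: Text('Hello'))),
    );
  }
}
```

App-specific strings would then go through the intl package’s message lookup on top of this.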


Flutter is open source, there are no costs involved, and I can contribute to open source widgets and packages if I need to. Unity has a free tier but involves costs for white-labelling (removing the Unity logo from the app’s loading screen). These costs, in fairness, would be incurred whether I use Unity alone or combined with Flutter.

ARCore Compatibility

ARCore support for Flutter is currently limited to a few community-driven projects on GitHub. ARCore_Flutter_Plugin looks very promising but is still at the developer-preview stage. The types of objects that can be placed in the AR scene are limited to spheres and cubes, and there isn’t an API for cloud anchors as yet.
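To give a flavour of where that plugin is today, placing one of those sphere primitives looks roughly like the following. This is a hedged sketch based on my reading of the plugin’s examples, so the exact constructor shapes may shift as it develops:

```dart
import 'package:arcore_flutter_plugin/arcore_flutter_plugin.dart';
import 'package:flutter/material.dart';
import 'package:vector_math/vector_math_64.dart' as vector;

class SphereDemo extends StatelessWidget {
  void _onArCoreViewCreated(ArCoreController controller) {
    // A red sphere, 10 cm radius, one metre in front of the camera.
    final node = ArCoreNode(
      shape: ArCoreSphere(
        radius: 0.1,
        materials: [ArCoreMaterial(color: Colors.red)],
      ),
      position: vector.Vector3(0, 0, -1),
    );
    controller.addArCoreNode(node);
  }

  @override
  Widget build(BuildContext context) {
    return ArCoreView(onArCoreViewCreated: _onArCoreViewCreated);
  }
}
```

Fine for a demo, but a long way from the shared, cloud-anchored experience I’m after.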

There is also a Flutter plugin for ARKit, Apple’s own AR framework. This is at a similar stage of development. It’s exciting but not quite ready for what I need. Besides, ARKit doesn’t support cross-platform development – it is only compatible with iOS devices.

Unity, on the other hand, includes full support for ARCore, as there is a dedicated SDK for it created by Google. Unity is cross-platform, just like Flutter.


Unity + Flutter

I want to take advantage of Flutter’s rapid development, performance and pixel-perfect, ready-to-go UI widgets. However, I need a screen within the app to act as a window on the world, through which users will be able to position virtual objects within the real-world space. Users should also be able to interact with each other’s objects at the same time. This shared AR interactivity is achieved through ARCore’s Hosted Cloud Anchor feature, which is available within the Google ARCore Extensions for Unity.

One option for my use case would be simply to create two distinct apps: the main app built with Flutter, and a further one built with Unity which serves purely as the AR component. The two apps would work in tandem. The user would manage general account features, notifications, social interactions, etc. within the main Flutter app, and the Unity app would be launched when the user goes into AR mode. This approach would have some serious usability implications, though. It would require the user to install two apps and keep them up to date with one another. It would also provide a much less streamlined experience than having the full feature set within a single app.

My preferred option is to embed the Unity-built AR component within a Flutter widget. While this isn’t something that’s available as part of the core Flutter framework, there is an open source, community-driven project to do just this. Flutter Unity View Widget is an “Embeddable unity game engine view for Flutter”. The project is hosted on an active GitHub repository with multiple contributors and openly accepts pull requests. So, if I find any issues or need to expand on the feature set, I have the possibility of building those features myself and requesting that they be pulled into the project. The project is documented and even has a dedicated Gitter chat channel. From what I have read in the docs, this widget should do exactly what I need. There are even a few examples of it being used with AR Foundation. I will need to test with Cloud Anchors and confirm that the communication between the Unity component, which will be written in C#, and Flutter (Dart) is consistent and efficient.
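From the project’s docs, the Flutter side of that embedding and message-passing would look something like the sketch below. To be clear about what’s assumed: the `UnityWidget`, `onUnityMessage` and `postMessage` API is taken from the widget’s README (and may differ between versions), while `AnchorManager` and `PlaceAnchor` are hypothetical names for a GameObject and C# method I would still have to write on the Unity side:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_unity_widget/flutter_unity_widget.dart';

class ArScreen extends StatefulWidget {
  @override
  _ArScreenState createState() => _ArScreenState();
}

class _ArScreenState extends State<ArScreen> {
  UnityWidgetController _unityController;

  void _onUnityCreated(controller) {
    _unityController = controller;
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      // The embedded Unity view fills the screen body.
      body: UnityWidget(
        onUnityCreated: _onUnityCreated,
        // Strings sent from the C# side arrive here.
        onUnityMessage: (message) => debugPrint('From Unity: $message'),
      ),
      floatingActionButton: FloatingActionButton(
        child: Icon(Icons.add_location),
        // Invokes PlaceAnchor() on a GameObject named AnchorManager
        // in the Unity scene – both names are hypothetical.
        onPressed: () =>
            _unityController?.postMessage('AnchorManager', 'PlaceAnchor', ''),
      ),
    );
  }
}
```

Since everything crosses that bridge as strings, the Dart–C# protocol (probably JSON payloads) is exactly the part I’ll need to test for consistency and efficiency.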

Next Steps

My next step is to build a Unity Cloud Anchor test app by following the Google Codelab ARCore Extensions for AR Foundation: Cloud Reference Points. I will then set up a basic Flutter app to test the Unity AR component with the Flutter Unity View Widget.
