Unity MARS and Flutter for AR

Using Unity MARS and Flutter to Author Cross-Platform AR Experiences


I previously experimented with combining the Unity game engine and Flutter (Google’s UI toolkit for natively compiled cross-platform apps) back in late 2019 and early 2020, with mixed outcomes. At the time, I was working on an AR prototype using Vuforia, a framework for which a Flutter plugin wasn’t (and still isn’t) available. In any case, Flutter is essentially a 2D platform, while Unity is one of the most popular realtime 3D authoring tools. On the flip side, Unity has poor support for native-feeling mobile interfaces. So, the combination of these two technologies should be a great match.

Furthermore, a Flutter component to embed Unity-built apps had already been published to the Flutter plugin directory. Nonetheless, back then, I was unable to validate a stable integration between the two technologies with either Vuforia or ARCore, Google’s native AR toolkit. The approach was validated with ARKit, Apple’s AR software kit, but I was looking for a cross-platform solution, which ARKit does not provide. So, I had to put the Unity-Flutter AR approach to one side.

Since then, I have worked with ARCore and Flutter for non-animated AR object placement, and this works nicely using remote GLB format files. More recently, though, I have a new opportunity to work with a more complex AR use case, one which will require the physics and animation capabilities of Unity. So, a year and a half after my first Unity-Flutter experimentation, I have a new opportunity to retest the approach.

Unity MARS

I followed the beta releases of Unity MARS with great interest over the course of 2020. MARS is what Unity describes as a professional-grade workflow for AR development. It enables AR developers to author complex, data-oriented apps, test within the editor without waiting for builds after code edits, and deliver apps with “runtime logic that adapts responsively to the real world”.

Mainstream Use Case

The MARS tool was adopted by the creative team behind the recent, highly acclaimed Wallace & Gromit AR interactive storytelling app, The Big Fix Up. Furthermore, the team behind the project, a creative consortium of three British design/software companies called Fictioneers, chose Flutter for the app’s user interface, thus pursuing the embedded Unity-Flutter approach with seemingly great results. Beth, a product manager at Fictioneers, described the goal of the project:

“A storytelling experience where the goal was to create an innovative app that delivered meaningful content to your smartphone”

Beth, Product Manager @ Fictioneers

Flutter Unity Widget

I had experienced numerous challenges when I first experimented with the Flutter Unity Widget, including performance problems, Unity communication issues and general Unity version compatibility issues. However, that was before the major 3.0.0 release in November 2020, which addressed many of these issues, and the subsequent 4.0.0 release in early March 2021, which included a number of critical fixes and iOS performance improvements, among other things.

Initial Tests

For my initial test I decided to use the example game provided within the Unity MARS package. There are several example scenes, including a complex game featuring an animated robot that can move about on surfaces and even leap from one surface to another when there is no clear ground-level passage available. The robot collects floating gems which are also generated in the AR space. This seemed like a nice, relatively complex test case to experiment with.

I worked with the latest 2020 LTS version of Unity for the tests and kept it updated as new minor versions were released; as of writing I am using 2020.3.11f1.

I am using Flutter 2.1 and testing with a Google Pixel 3XL physical device.

There were a few custom tweaks needed to my Gradle build files and local properties based on local specifics. Most of these are mentioned on the widget’s documentation page. I have also noted them further down in this article.

On the whole, it was a smooth process. Having set up the complex game scene and installed the Flutter Unity Widget package, I was able to build an APK and set up a simple Flutter UI with a couple of Material AppBar widgets to make it clear that this is a Flutter app with an embedded Unity-built dimension. And it worked!
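For reference, the Flutter side of such a setup can be sketched roughly as follows. This is a minimal sketch based on the flutter_unity_widget 4.x API; the widget and callback names should be checked against the plugin’s own documentation for the version in use.

```dart
// Minimal sketch of embedding a Unity build in a Flutter screen,
// assuming the flutter_unity_widget 4.x package.
import 'package:flutter/material.dart';
import 'package:flutter_unity_widget/flutter_unity_widget.dart';

class ArScreen extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      // Material AppBar to make the Flutter shell visible around Unity.
      appBar: AppBar(title: const Text('Flutter + Unity MARS')),
      body: UnityWidget(
        // The controller can later be used to post messages to Unity.
        onUnityCreated: (controller) {},
      ),
    );
  }
}
```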

Adding More Physics and Interaction

For the purpose of a current project prototype I want to place floating virtual objects above the real-world ground.

These objects should react to physical collisions; for example, when the user of the AR experience walks into an object, it should be knocked off-kilter, rocking back and forth before returning to its initial pose and state.

The objects are ‘fed’ into the experience based on the user’s position as they move about, so they are dynamically loaded at runtime rather than preconfigured, since they are driven by data held outside the Unity app in a cloud-based database.

A good practice with MARS, I found, is to lay spawn points to maximise fluidity. Spawn points sit on surface planes, or other interest points (in my case, the ground). A spawn point is a position on the desired plane upon which a virtual object will be placed. An important feature of MARS is the carefully considered semantics it uses for these concepts.

For example, a feature in the real world matching a preconfigured condition is called a Proxy. The MARS documentation describes a proxy as a virtual representation of a physical real-world object or space with which an app can interact, for instance by placing digital content (a Prefab) upon it. The Unity manual goes on to say: think of proxies as placeholders that connect the real world with digital content in an app; for example, a face, a table, a wall, or a floor. Current conditions to apply when identifying viable proxies include size, height, padding and so forth. However, the long-term goal for Unity is to support more abstract conditions to match against, such as the time of day or the current weather (think rain!).

A spawn point is a position in the real world, represented by a proxy, where a GameObject, such as a character, will appear (spawn). For my experiments, taking inspiration from the MARS example complex game, I used a MARS replicator to create an unlimited number of spawn points on the targeted surface, each one having a synthetic object attached to assign a semantic tag. I can then create further proxies with my desired GameObject (a character, etc.) at runtime, which extend from the spawn points through a semantic tag condition.

To summarise, generalised spawn points are created in real time as soon as suitable surfaces are detected. They are general because they have only some basic conditions: in my case, horizontal and vertical planes with over one square metre available. I can then create my characters and other AR objects on these pre-laid spawn points with much finer control; for example, I might want them to appear only after a duration of time, or in a certain order, or, in the future, only when the spawn point is a certain colour or texture!
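As a rough illustration of the idea, creating a character proxy at runtime that extends a pre-laid spawn point via a semantic tag might look something like the sketch below. This is not the exact MARS API: the condition component, its setter and the "spawnPoint" tag name are assumptions to be checked against the MARS scripting reference; only Proxy and SyncModifications() are named in MARS itself.

```csharp
using UnityEngine;
using Unity.MARS; // MARS runtime types; exact namespaces vary by version

public class CharacterSpawner : MonoBehaviour
{
    public GameObject characterPrefab;

    public void CreateCharacterProxy()
    {
        // A new proxy that will match real-world features at runtime.
        var proxyObject = new GameObject("CharacterProxy");
        var proxy = proxyObject.AddComponent<Proxy>();

        // Hypothetical semantic tag condition: match only surfaces that a
        // replicator has already tagged as spawn points.
        var tagCondition = proxyObject.AddComponent<SemanticTagCondition>();
        tagCondition.SetTraitName("spawnPoint"); // assumed setter and tag name

        // The visible content is attached as a child of the proxy.
        Instantiate(characterPrefab, proxyObject.transform);

        // Notify the MARS database of the runtime change.
        proxy.SyncModifications();
    }
}
```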


There are a few things to look out for when using MARS and Unity/Flutter together. I noted them along the way, and I hope they may be useful for anyone running into these small but sometimes blocking gotchas.

Physics – I would recommend using Unity’s Rigidbody approach, which allows the use of the built-in physics for a more realistic feel with less boilerplate. Here is a great video I found which showcases this in action in a popular non-AR game; the technique can be adapted to AR. (https://youtu.be/qdskE8PJy6Q)
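As an illustration of this Rigidbody approach, a floating object that can be knocked about by collisions and then springs back to its initial pose might be sketched as follows. This is a minimal example using standard Unity physics; the force values are arbitrary and would need tuning.

```csharp
using UnityEngine;

// Sketch: a floating object that reacts to collisions and drifts
// back to its initial pose using the built-in physics.
[RequireComponent(typeof(Rigidbody))]
public class KnockableFloater : MonoBehaviour
{
    public float returnStrength = 2f; // spring force pulling back home

    Rigidbody body;
    Vector3 homePosition;
    Quaternion homeRotation;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        body.useGravity = false;          // the object floats
        homePosition = transform.position;
        homeRotation = transform.rotation;
    }

    void FixedUpdate()
    {
        // Gentle spring back toward the initial pose after any knock;
        // collisions themselves are handled by the physics engine.
        body.AddForce((homePosition - transform.position) * returnStrength);
        body.MoveRotation(Quaternion.Slerp(body.rotation, homeRotation,
                                           Time.fixedDeltaTime));
    }
}
```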

Runtime proxies – when creating new proxies at runtime, for example as and when external events from a database trigger new AR objects, it is essential to synchronise the new state with the main MARS system using the SyncModifications() method of Proxy.

This is needed because the MARS database keeps its own copies of the condition data, stored in a more optimal format and running on a background thread for performance. When everything, including all proxies, is pre-configured in the inspector, the syncing is handled automatically. When making changes at runtime, however, it is necessary to notify the system of the new or modified proxy. Many thanks to Thomas Key of Unity Technologies, who helped me with this and explained: “Calling SyncModifications() informs the query db of a change and schedules a “modification” change action. A modification change can take a few frames to get picked up, as the db query scheduling thread is synced intermittently with the main thread via an actions buffer.”
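In practice, this means any runtime mutation of a proxy’s conditions should be followed by a sync call. A minimal sketch, where the condition type and field are hypothetical and only SyncModifications() itself comes from the MARS API:

```csharp
// After modifying a proxy's conditions at runtime...
var condition = proxy.GetComponent<HeightCondition>(); // hypothetical type
condition.minimumHeight = 0.5f;                        // hypothetical field

// ...schedule a "modification" change action so the background
// query database picks up the new state (may take a few frames).
proxy.SyncModifications();
```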

Links to the relevant classes:


Shadows – I am testing with a physical Android device, so ARCore is the component doing the heavy lifting. I am not sure why, but GameObjects do not automatically cast shadows on planes when using MARS; this has to be set up manually. I was surprised, as this works naturally with native ARCore and non-animated GLB objects in my Flutter/ARCore experimentation and development. With Unity, I had to set this up manually for my animated objects, by adding a Shader as well as a Rigidbody component to my GameObject. The Shader I used can be found here: https://github.com/dilmerv/ARFoundationOcclusion/tree/feature/URPOcclusion/Assets/Shaders

This is applied to a clone of the main Plane Visualizer plane, which is then assigned to the main Plane Visualizer component.

Plane Visualizer configuration

Flutter – a few tweaks were required for my use case; these may or may not be needed depending on your setup.



add to android/app/build.gradle:

lintOptions {
    disable 'InvalidPackage'
    checkReleaseBuilds false
}


add to android/gradle.properties:

unityStreamingAssets=.unity3d, google-services-desktop.json, google-services.json, GoogleService-Info.plist



def getNdkDir() {
    Properties local = new Properties()
    local.load(new FileInputStream("${rootDir}/local.properties"))
    return local.getProperty('ndk.dir')
}

add to local.properties:

ndk.dir=C\:/Program Files/Unity/Hub/Editor/2020.3.8f1/Editor/Data/PlaybackEngines/AndroidPlayer/NDK

Barebones Demo

Here is a link to a demo of my experimentation using primitive 3D objects. These objects are created dynamically at runtime and placed on existing spawn points. The objects interact with one another physically and cast shadows onto detected planes and each other. The functionality includes a floating effect, as this is a requirement for my project, and a feature which allows the user to bring one of the objects toward them and return it to its original spot, colliding with anything in its way with a realistic physics effect. I am also using MARS geolocation conditions and have tested and validated these working over a distance of 100 metres in a single session. The AR experience is embedded within a barebones Flutter UI to demonstrate the two technologies working together.

Conclusion & Next Steps

Unity MARS and the Flutter Unity Widget have each come a long way over the past 12 months, since I previously experimented with the approach. I found the process to be relatively straightforward and stable using Unity 2020 and Flutter 2.1. The MARS workflow really speeds up the early phase of development thanks to its simulation environment, which removes the need for new builds between code edits. Using MARS allows the developer to lay down the framework of a fully functional AR experience, knowing that new AR features and conditions, such as weather detection or object recognition, can be accommodated naturally into the workflow in the near future.

My next steps involve benchmark testing communication between Flutter and Unity. Specifically, I want to understand the difference in performance between connecting Unity to Firebase directly and using Flutter’s Firebase integration combined with the Flutter Unity Widget. Currently, my philosophy is to use Unity strictly for the 3D/AR functionality and prioritise Flutter for all further business logic and the presentation layer. I will post my findings and decision making on this side of things over the coming weeks.
