Platforms & Tools


As a creative app developer, I can think of few prospects more exciting than exploring a new technological platform with a view to getting under the hood and coding something up!

In today’s connected world, apps have become a ubiquitous feature of our daily lives. At night while I sleep, my smart watch monitors my movement and records my pulse rate, among other factors, to provide algorithms with data about the quality of my night’s rest. If I owned a stand-alone sleep-tracking unit, I could also have my bedroom’s heating, lighting and air quality automatically controlled by algorithms, and be gently awakened by an app-driven sunrise simulation.

Having got out of bed, I weigh myself on smart scales, which tell me my weight, body mass, the air quality and even what the weather is going to be like today. After reading the news headlines and my friends’ updates via apps on my phone, I go for a run. My route is tracked using GPS; other apps tell me how many calories I’ve burned, my heart rate and plenty more. In my house, the central heating is controlled by a touch-screen app built into the main unit. I work from home, but if I had to drive to my job, I’d be interacting with apps at petrol stations, car parks and autoroute tolls, as well as my in-car apps for GPS, air con, entertainment, parking assistance, charging/petrol station location and more!

Apps are everywhere, integrated into an ever-increasing variety of form factors, ranging from smartphones, games consoles, wearables and robotics to immersive augmented and virtual reality platforms.

Under The Hood

As the variety of forms in which we encounter apps has grown, the technologies driving them have become increasingly diverse and, in some ways, disparate. A developer who has mastered Apple’s Swift programming language can write code for macOS, iOS, watchOS and other Apple platforms on a Mac using Apple’s Xcode development environment. However, that same developer would likely not be comfortable working in Android Studio on Microsoft Windows, using the Java or Kotlin programming languages to code up an app for Android OS, Chromecast or Android wearables.

The vast array of apps we encounter in our daily lives are built using a variety of different programming languages, architectures, frameworks, operating systems, development environments, and hardware – processors, etc.

The Developer’s Dilemma

A developer who wishes to build native apps for both Apple and Android must study and get to grips with both paradigms. This means learning several distinct programming languages, frameworks and development/testing environments, as well as maintaining an ongoing familiarity with the latest associated products, software updates and new releases of the software kits.

Furthermore, Android and Apple are not the only platforms available, and phones and desktops are by no means the only form factors. In an ideal world, a creative app developer would be able to use the same set of tools and languages to make apps for all kinds of devices, as well as augmented and virtual reality headsets, gaming consoles, wearables and the rest of the Internet of Things.

So, the question I would like to contemplate is this: what is the best approach for a creative app developer who wishes to cater for a wide range of target platforms and form factors?

General Purpose Apps

HTML5/Hybrid Mobile Web Apps

For general purpose apps, by which I mean the most common types of app for phone, tablet or desktop using a standardised user interface design (forms, lists, etc.), some potential solutions exist. These solutions tend to come in the form of toolkits with which a developer can compile native code for different target platforms from a single codebase.

One of the best known toolkits for this is Apache Cordova – “a diverse ecosystem of command line tools, JavaScript frameworks and cloud services” (source). With Cordova, developers work with web-based technologies: HTML, JavaScript and CSS. A project is then compiled into native packages for multiple platforms, including iOS and Android as well as several others. Cordova is great for web developers, who can reuse their existing web skills to make apps for devices rather than web browsers.
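The hybrid idea can be sketched in a few lines of plain JavaScript. This is illustrative, not real Cordova code: in an actual Cordova app the shared logic runs inside a webview after the `deviceready` event, with `navigator.geolocation` bridged to the native GPS by a plugin. Here the bridge is stubbed so the sketch is self-contained:

```javascript
// Shared, platform-independent logic: format a GPS fix for display.
// In a hybrid app this same code runs unchanged on iOS, Android and the web.
function formatPosition(coords) {
  return `lat ${coords.latitude.toFixed(4)}, lon ${coords.longitude.toFixed(4)}`;
}

// Thin platform layer: the geolocation source is injected, so it can be the
// browser API, a Cordova plugin, or (as here) a stub.
function startApp(geolocation, render) {
  geolocation.getCurrentPosition(pos => render(formatPosition(pos.coords)));
}

// Stubbed native bridge so the sketch runs anywhere.
const fakeGeo = {
  getCurrentPosition: cb => cb({ coords: { latitude: 48.8566, longitude: 2.3522 } }),
};
startApp(fakeGeo, text => console.log(text)); // prints "lat 48.8566, lon 2.3522"
```

The point is that only the injected bridge differs per platform; the application logic itself is written once.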

Cordova and other similar solutions do have their downsides. The compiled apps run inside a webview, which means the UI is only native-like. This can give an app a less consistent feel and level of performance, especially across different platforms and OS versions (requires reference). Otherwise, with the help of plugins and some knowledge of mobile app architecture – push notifications, OS APIs (GPS, camera, etc.) – mobile-web apps, as they are sometimes known, provide everything a developer needs for most general purpose apps. But a mobile-web app isn’t the same as a native app!

Compiled Native Apps

In recent years, several frameworks have emerged that give developers the opportunity to create native apps for multiple platforms from a single codebase. One such example is React Native, developed by Facebook. According to the framework’s homepage, apps built with React Native “aren’t mobile web apps because they use the same fundamental UI building blocks as regular iOS and Android apps”. To develop with React Native, a developer must learn JavaScript. While JavaScript is a web technology, and React Native is based on the web-oriented React JS framework, the learning curve for a web developer is a little steeper than with Cordova, because the framework is built around native components rather than web components. Facebook uses the technology in its main Facebook and Instagram apps, and other high-profile companies use it too: Skype, Tesla, Uber and Walmart (source).
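React Native’s core idea, UI as a plain function of state that the framework maps onto real native widgets, can be illustrated without any framework at all. The sketch below invents its own tiny tree format and uses no actual React Native APIs:

```javascript
// A 'component' is just a function from state to a description of the UI.
// React Native would map 'View' and 'Text' onto real native widgets
// (UIView/UILabel on iOS, android.view.View/TextView on Android);
// here the tree is only built and inspected.
function Counter(state) {
  return {
    type: 'View',
    children: [
      { type: 'Text', props: { value: `Count: ${state.count}` } },
      { type: 'Button', props: { title: '+1', onPress: 'increment' } },
    ],
  };
}

// The framework re-runs the function on each state change and diffs the
// trees, patching only the native widgets that actually changed.
const before = Counter({ count: 0 });
const after = Counter({ count: 1 });
console.log(before.children[0].props.value); // "Count: 0"
console.log(after.children[0].props.value);  // "Count: 1"
```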

React Native isn’t the only framework targeting fully native apps from a single codebase. Apple’s Swift toolchain currently supports only Apple platforms and Linux, although there is a community-based effort working to port it to other platforms too (source). Google, on the other hand, offers Flutter, which it describes as “a portable UI toolkit for building beautiful, natively-compiled applications for mobile, web and desktop from a single codebase” (source). Flutter has not been around for as long as Facebook’s React Native, yet, at the time of writing, it has already been ‘starred’ over 65k times on GitHub, compared to React Native’s nearly 80k stars.

Flutter uses the Dart programming language, also developed by Google. Although this likely means most developers will have to learn a new language, Dart can also be used for building web apps with AngularDart, and Flutter for web is currently available as a technical preview. This leaves React Native and Flutter closely matched in terms of attractiveness. The big question, therefore, is whether to go with Facebook or Google!

Specialised Apps


Flutter and React Native give developers the opportunity to make general purpose apps for multiple mobile platforms from a single codebase. However, more and more of today’s app requirements are not ‘general purpose’: apps are increasingly developed for a specific form factor, such as wearables, interactive things, VR and AR. Of the frameworks and toolkits described above, only Flutter comes with some support for augmented reality, via plugins that interact with Google’s ARCore platform. But, at the time of writing, this support is quite limited and still in an experimental phase. Furthermore, the Flutter/ARCore plugins can only compile for Android devices.

Focusing on VR, as this is a field I am personally studying, it is evident that a situation similar to the one discussed earlier for general purpose apps exists – at least in terms of the hardware I currently own: an Oculus Go and a Google Daydream headset. Oculus is a Facebook company and Daydream is Google’s VR project, and each platform has its own SDK. Daydream is not limited to Android: Google offers SDKs for Android (Java) and iOS (Swift), as well as an NDK (Native Development Kit) for C++. That’s a lot of languages and kits if I want to build VR apps for multiple platforms!

Cross Platform

Of course, it’s technically possible to build interactive VR worlds with pure code. But given the sheer complexity of visual 3D/VR animation, as well as the underlying layers of simulated physics needed to ensure that visual objects behave consistently, coding up a VR world in Notepad just isn’t feasible for an individual or even a small group of developers. Fortunately, we have cross-platform game engines.

Unity and Unreal are two massively popular game engines with multi-platform support for virtual, augmented and mixed reality apps. Each provides a visual editor that accelerates learning and development: the developer can create a world and edit the creative and physical properties of the objects and elements within it, leaving their programming skills free to focus on specific behaviours, interactions, web-server API calls, UI and so on. While Unity uses the C# programming language, Unreal uses C++. Both are reasonably complex, technical languages, but both are very much transferable skills: C# was created by Microsoft and can be used to build apps for Windows as well as web servers, while C++ is a general purpose, multi-paradigm language often used for embedded apps.
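The scripting model the two engines share can itself be sketched (in JavaScript, for consistency with the earlier examples; the class and method names are invented, not Unity or Unreal API): the engine owns the frame loop and calls an update method on every scripted object, passing the elapsed time, and the developer writes only the behaviour.

```javascript
// Behaviour script: rotate an object at 90 degrees per second.
// A game engine would call update() once per rendered frame with the
// time elapsed since the previous frame.
class Spinner {
  constructor() {
    this.angle = 0; // degrees
  }
  update(dtSeconds) {
    this.angle = (this.angle + 90 * dtSeconds) % 360;
  }
}

// Stand-in for the engine's frame loop: simulate one second at 60 fps.
const spinner = new Spinner();
for (let frame = 0; frame < 60; frame++) {
  spinner.update(1 / 60);
}
console.log(spinner.angle.toFixed(1)); // "90.0"
```

In Unity, the equivalent is a C# MonoBehaviour with an Update() method reading Time.deltaTime; in Unreal, a C++ actor overriding Tick().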


Conclusions

For general purpose app development, including mobile apps for business, social media and 2D games, it is well worth considering a modern toolkit like Flutter or React Native. Either will allow a developer to build apps in native format for multiple platforms from a single codebase. I am using Flutter with Dart, as I have already used Dart with AngularDart and I’m familiar with Android Studio.

Specialty app development, such as for wearables, requires further research; while Android and Linux can be embedded for the Internet of Things, that is beyond the scope of this article. For my own case, virtual reality, each of the game engines discussed provides all the tools necessary to get up and running. I have decided to work with Unity for my VR projects. Once more, this is motivated by my existing experience, in this case with C# in .NET web/Windows projects over the years; otherwise, I couldn’t find any decisive advantage either way between Unity and Unreal.
