Tag Archives: unity

ARKit – Portal

After the massive excitement around Apple’s new ARKit, there have been some interesting demos and prototypes of AR apps built on this core technology.

I decided to make my own after seeing some examples online, and created a room portal within Unity 3D.

Initially, it appears as a virtual door within the environment, leading into another world. The user can walk through it into the room, and look back out at the outside (real) world.

This is achieved using a mix of depth masks and buffers. As expected, the six-degrees-of-freedom (6DOF) tracking is pretty much spot on, making the experience very smooth and unrestrained. (Any lag seen in the video is from the screen recorder, not the app.)
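The depth-mask half of this trick can be sketched as a minimal Unity shader. This is the common general-purpose depth-mask pattern, not necessarily the exact shader used in the project, and the shader name and queue offset are illustrative:

```shaderlab
Shader "Custom/DepthMask" {
    SubShader {
        // Render just before regular geometry so the mask fills the
        // depth buffer first and occludes anything drawn behind it
        Tags { "Queue" = "Geometry-10" }
        ColorMask 0   // write no colour, only depth
        ZWrite On
        Pass {}
    }
}
```

Applied to quads surrounding the doorway, the mask hides the virtual room from the outside, so the room’s interior is only ever visible through the portal opening itself.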

Expect more interesting examples, and demos to come!

Award Winning Unity3D Application

I’m very proud to announce that the mobile application I have been working on for quite some time has won an award!

The MMA Smarties are the world’s top global mobile marketing awards, celebrating innovation, creativity, and success. Working alongside Engine Creative and BIC, we won the award in the Mobile App category, competing against other blue-chip marketing companies including Jaguar, Heineken and McDonald’s.



The application uses Unity3D to its full potential, along with Vuforia’s powerful Augmented Reality tracking technology, which pushes the immersive and educational experience to new heights.


The app has now been released in six European countries (including France, the UK, Spain and Germany), and further releases are planned for Latin America.

Microsoft Kinect & Unity3D

Recently, I’ve been messing around with Microsoft’s Kinect sensor (v2) within Unity3D to see what interesting results I can achieve. I’d only really done large-scale projects with the previous version of their sensor, but I must say, this new sensor is a huge improvement, and its tracking is on point!


For such a complex device, their developer API is crystal clear and well supported. The only downside is that the SDK is Windows-only (so I couldn’t develop on my Mac), which isn’t the end of the world, but is an inconvenience.

Depth & Infrared Point Cloud

I wanted to experiment with the depth and infrared sensors and get a point cloud representation of my room within Unity3D.


I achieved this by accessing both the depth and infrared sensors and creating a reader for each. A reader is what actually contains the data from a sensor; think of it as a ‘feed’ of data.
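Opening the two readers looks roughly like this with the Kinect v2 Unity plugin (the `Windows.Kinect` namespace). The class name and structure here are my own sketch, not the project’s actual code:

```csharp
using UnityEngine;
using Windows.Kinect;  // Kinect for Windows SDK 2.0 Unity plugin

public class KinectFrameSource : MonoBehaviour
{
    private KinectSensor _sensor;
    private DepthFrameReader _depthReader;
    private InfraredFrameReader _irReader;
    private ushort[] _depthData;   // distance per pixel, in millimetres
    private ushort[] _irData;      // 16-bit infrared intensity per pixel

    void Start()
    {
        _sensor = KinectSensor.GetDefault();
        if (_sensor == null) return;

        // Each reader is the 'feed' of frames from its sensor
        _depthReader = _sensor.DepthFrameSource.OpenReader();
        _irReader = _sensor.InfraredFrameSource.OpenReader();

        var desc = _sensor.DepthFrameSource.FrameDescription;
        _depthData = new ushort[desc.LengthInPixels];
        _irData = new ushort[desc.LengthInPixels];

        if (!_sensor.IsOpen) _sensor.Open();
    }

    void Update()
    {
        // AcquireLatestFrame returns null when no new frame is ready
        using (var frame = _depthReader.AcquireLatestFrame())
            if (frame != null) frame.CopyFrameDataToArray(_depthData);
        using (var frame = _irReader.AcquireLatestFrame())
            if (frame != null) frame.CopyFrameDataToArray(_irData);
    }
}
```

Copying each frame into a plain `ushort[]` every `Update` keeps the sensor data in a form that the mesh-building code can consume directly.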

Once I had the data coming into Unity, I needed to actually do something with it. I dynamically built a 65,000-vertex Mesh with a MeshTopology of Points, giving me the point cloud mesh. Each point represents a pixel from the depth sensor, where each pixel’s value is its distance in millimetres. This gave me a very accurate 3D representation of my room as a point cloud.

I took this one step further by colouring each point with the pixel data from the IR sensor.

The Mesh is then recalculated every frame (in Update) to give a real-time scan of the room I was sitting in.
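The mesh-building and per-frame refresh described above can be sketched as follows. The resolution, scaling factors and method names are illustrative assumptions; the 65,000-vertex cap reflects Unity’s classic 16-bit index limit per mesh, which is why the 512×424 depth image is downsampled here:

```csharp
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class PointCloudMesh : MonoBehaviour
{
    // Downsampled so Width * Height stays under Unity's 65,535-vertex limit
    const int Width = 256, Height = 212;

    Mesh _mesh;
    Vector3[] _vertices;
    Color[] _colors;

    void Start()
    {
        int count = Width * Height;
        _vertices = new Vector3[count];
        _colors = new Color[count];
        var indices = new int[count];
        for (int i = 0; i < count; i++) indices[i] = i;

        _mesh = new Mesh();
        _mesh.vertices = _vertices;
        _mesh.colors = _colors;
        // Points topology: each index is drawn as a single point
        _mesh.SetIndices(indices, MeshTopology.Points, 0);
        GetComponent<MeshFilter>().mesh = _mesh;
    }

    // Called once per frame with the latest depth and IR arrays
    public void UpdateCloud(ushort[] depthMm, ushort[] ir)
    {
        for (int y = 0; y < Height; y++)
        for (int x = 0; x < Width; x++)
        {
            int i = y * Width + x;
            // Depth is in millimetres; convert to metres for the Z axis
            _vertices[i] = new Vector3(x * 0.01f, y * 0.01f,
                                       depthMm[i] * 0.001f);
            // IR is a 16-bit intensity; map it to a greyscale colour
            float v = ir[i] / 65535f;
            _colors[i] = new Color(v, v, v);
        }
        _mesh.vertices = _vertices;
        _mesh.colors = _colors;
        _mesh.RecalculateBounds();
    }
}
```

Note that Unity’s standard materials ignore vertex colours, so displaying the IR shading needs a simple unlit vertex-colour shader on the mesh’s material.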

The Result

In case you can’t make out what’s going on: what you’re seeing is my room, 3D-scanned into Unity in real time. You might be able to make out me sitting on my chair in the middle. And as you can see, each point has its own colour value from the IR sensor.

This is a front isometric view. It almost looks like an image, but it’s not! It’s just all the points lining up perfectly to give the impression of a flat 2D image.