Experiments with AR - Part 1!

1-2 min read

May 18, 2019

Introduction

I’ve been working with Virtual Reality for close to two years now, and all that time I was intrigued by Augmented Reality. When I started with VR, Pokemon Go was the talk of the town. I always wanted to get started with AR, but I don’t really know why I never did.

But recently, I finally got started with AR. Last year Google released their AR SDK, ARCore, for AR development on Android and iOS mobile devices. According to Google, around 4 million devices currently support ARCore, and this number is expected to increase rapidly in the coming year. Another very popular AR SDK is Vuforia; it works in a similar way and has been around far longer, but it’s not open source. There’s also ARKit for iOS devices, but since I have neither a Mac nor an iPhone, I decided to start developing with ARCore.

Basics of ARCore

Fundamentally, ARCore does two things: it tracks the position of the mobile device as it moves, and it builds its own understanding of the real world. It uses three features to do this (there’s a code sketch after the list showing how they surface in the SDK):

  1. Motion tracking allows the phone to understand and track its position relative to the world.
  2. Environmental understanding allows the phone to detect the size and location of all types of surfaces: horizontal, vertical and angled surfaces like the ground, a coffee table or walls.
  3. Light estimation allows the phone to estimate the environment’s current lighting conditions.
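
To get a feel for what these three look like in code, here’s a minimal sketch assuming the ARCore SDK for Unity (the GoogleARCore namespace); the class name and the logging are just for illustration, and the script assumes an ARCore session is already running:

```csharp
using System.Collections.Generic;
using UnityEngine;
using GoogleARCore;

public class ARCoreFeaturesDemo : MonoBehaviour
{
    private List<DetectedPlane> _newPlanes = new List<DetectedPlane>();

    void Update()
    {
        // 1. Motion tracking: the device's pose relative to where the session started.
        Pose devicePose = Frame.Pose;
        Debug.Log($"Device position: {devicePose.position}");

        // 2. Environmental understanding: surfaces detected since the last frame.
        Session.GetTrackables<DetectedPlane>(_newPlanes, TrackableQueryFilter.New);
        foreach (var plane in _newPlanes)
        {
            Debug.Log($"New plane detected, center at {plane.CenterPose.position}");
        }

        // 3. Light estimation: average pixel intensity of the current camera image.
        LightEstimate light = Frame.LightEstimate;
        if (light.State == LightEstimateState.Valid)
        {
            Debug.Log($"Estimated light intensity: {light.PixelIntensity}");
        }
    }
}
```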

AR Foundations

Last month Unity came out with AR Foundations, their cross-platform package for making AR apps for both Android and iOS. Under the hood, it uses ARCore on Android and ARKit on iOS.

So, I got started with AR Foundations by setting up the environment and building my first basic app: one that did plane recognition.

[Video: plane detection with AR Foundations]
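
Here’s a rough sketch of what the core of such a script might look like with AR Foundations; it assumes the scene already has an AR session set up with an ARPlaneManager component, and the class name is my own:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARPlaneManager))]
public class PlaneDetectionLogger : MonoBehaviour
{
    private ARPlaneManager _planeManager;

    void OnEnable()
    {
        _planeManager = GetComponent<ARPlaneManager>();
        // ARPlaneManager raises this event whenever planes are added, updated or removed.
        _planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        _planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (ARPlane plane in args.added)
        {
            Debug.Log($"Plane detected at {plane.transform.position}");
        }
    }
}
```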

ARCore

AR Foundations was still very new and limited, and there wasn’t much support for it yet. So, I switched to developing with ARCore itself.

The first app I made using ARCore detected a plane, and once the plane was detected, it let the user place a cool alien spaceship on it.
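
The tap-to-place logic is close to the HelloAR sample that ships with the SDK. Here’s a hedged sketch (SpaceshipPrefab is a hypothetical prefab reference you’d assign in the inspector):

```csharp
using UnityEngine;
using GoogleARCore;

public class SpaceshipPlacer : MonoBehaviour
{
    public GameObject SpaceshipPrefab;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch point against detected planes.
        TrackableHit hit;
        if (Frame.Raycast(touch.position.x, touch.position.y,
                          TrackableHitFlags.PlaneWithinPolygon, out hit))
        {
            // Create an anchor so ARCore keeps the object locked to the plane
            // as its understanding of the world improves.
            Anchor anchor = hit.Trackable.CreateAnchor(hit.Pose);
            GameObject ship = Instantiate(SpaceshipPrefab, hit.Pose.position,
                                          hit.Pose.rotation);
            ship.transform.parent = anchor.transform;
        }
    }
}
```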

[Video: placing alien spaceships on a detected plane]

As you can see, the spaceships hold their positions when you move towards them, away from them, around them, or even when they are off screen. It almost feels like they are really there. So how does that happen? When the application starts, the phone’s position at that moment is marked as the origin of a 3D coordinate system, and virtual objects are placed relative to that origin. As the user moves around, data from the phone’s sensors is used to estimate the distance and direction of the movement, and based on this, each virtual object’s projection (size, angle, shape) is updated. This is really a simplification of how things happen, and there’s a lot more going on under the hood.
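
To make that concrete, here’s a toy Unity sketch (plain Unity API, nothing ARCore-specific) showing the idea: the object’s world position stays fixed, and only its projection through the moving, tracked camera changes:

```csharp
using UnityEngine;

public class ProjectionDemo : MonoBehaviour
{
    void Update()
    {
        // 'transform.position' is fixed in the session's world coordinate system
        // (the origin is roughly where the phone was when the session began).
        Vector3 worldPos = transform.position;

        // The AR camera's pose is updated every frame from motion tracking,
        // so the same world point lands on a different screen point as you move.
        Vector3 screenPos = Camera.main.WorldToScreenPoint(worldPos);
        float distance = Vector3.Distance(Camera.main.transform.position, worldPos);
        Debug.Log($"On screen at {screenPos}, {distance:F2} m away");
    }
}
```

In a real ARCore app the anchor’s pose also gets corrected as tracking improves, which is why objects stay glued to surfaces instead of drifting.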

