Augmented Reality February 27, 2019

The hardware that will change AR

Lucas Selfslagh

3D Engineer

Thomas Smolders

Resident Writer

When it comes to augmented reality, there are a lot of quotes describing the potential impact of the technology.

‘It is as big as the smartphone. It is huge,’ Apple CEO Tim Cook said in an interview with The Independent in 2017. It’s probably also what he thought when he appointed Frank Casanova as Senior Director for Augmented Reality Product Marketing. You know what Phil Schiller is to the App Store? Well, Frank is going to be that for augmented reality.

‘It’s the new communication platform,’ claims Mark Zuckerberg, CEO of Facebook. He means business too: three years ago Facebook reportedly tried to buy Unity, in what would have been an acquisition in the same league as Disney buying Pixar.

No one denies the potential of AR, but there is still a lot of debate about the hardware that will run the experiences of tomorrow. Some say we have to wait for the breakthrough of smart glasses or dedicated wearables, while others claim that today’s mobile consumer devices are enough to kickstart an entirely new way of interacting with digital content.

At In The Pocket, we believe the camera of your smartphone is the new playground, a first step into this brave new world. Naturally, we’re looking forward to the technological innovations that will shape the future of our devices, and with them, AR. Here are some of the breakthroughs that will change the playing field in the coming years:


Bluetooth 5.1


Bluetooth 5.1 was presented on 21 January 2019 by the Bluetooth Special Interest Group. The most interesting features for us are Angle of Arrival (AoA) and Angle of Departure (AoD). They add direction finding to location services: it becomes possible to detect which direction a Bluetooth signal is coming from, which in turn allows positioning by means of signal triangulation, much like we already do every day on a larger scale with Wi-Fi signals and cell towers.

With Bluetooth direction finding, developers will be able to bring products to market that understand device direction and achieve sub-meter location accuracy, both indoors and outdoors. That’s why we believe Bluetooth 5.1 will change the way we do wayfinding, asset tracking, proximity marketing and more.
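To make the direction-finding idea concrete, here is a minimal Python sketch of how an angle of arrival can be estimated from the phase difference a receiver measures between two antennas. The antenna spacing, phase value and function name are illustrative assumptions, not anything prescribed by the Bluetooth 5.1 specification.

```python
import math

def angle_of_arrival(phase_diff_rad: float, antenna_spacing_m: float,
                     wavelength_m: float = 0.125) -> float:
    """Estimate the angle of arrival (in radians) of an incoming radio wave
    from the phase difference measured between two antennas.

    Two-element array relation: delta_phi = 2 * pi * d * sin(theta) / lambda.
    Bluetooth operates around 2.4 GHz, which is roughly a 0.125 m wavelength.
    """
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    # Clamp so measurement noise can't push the value outside asin's domain.
    sin_theta = max(-1.0, min(1.0, sin_theta))
    return math.asin(sin_theta)

# Illustrative numbers: roughly quarter-wavelength spacing, 0.8 rad phase shift.
theta = angle_of_arrival(phase_diff_rad=0.8, antenna_spacing_m=0.031)
print(f"Estimated direction: {math.degrees(theta):.1f} degrees off boresight")
```

Combine bearings like this from two or more fixed beacons and you can triangulate a position, which is where the sub-meter accuracy claim comes from.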


Depth sensors

Depth sensors are a relatively new feature on smartphones. The best-known example is probably FaceID, which was first featured on the iPhone X in 2017. FaceID uses a tiny IR system, the TrueDepth camera, which projects more than 30,000 invisible dots (friggin laser beams, that’s right) onto your face and builds an understanding of depth from the reflected pattern.
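Apple doesn’t publish the internals of the TrueDepth pipeline, but the general structured-light principle is simple: each projected dot shifts sideways in the IR camera image depending on how far away the surface is, and that shift (disparity) converts to depth by triangulation. A hedged sketch, with made-up numbers:

```python
def depth_from_disparity(disparity_px: float, focal_length_px: float,
                         baseline_m: float) -> float:
    """Convert the observed shift of a projected dot (disparity, in pixels)
    into a depth estimate using the standard triangulation relation
    z = f * b / d shared by stereo and structured-light systems."""
    if disparity_px <= 0:
        raise ValueError("Dot was not matched, so no depth is available.")
    return focal_length_px * baseline_m / disparity_px

# Illustrative values only: a 580 px focal length, a 2.5 cm projector-camera
# baseline and a dot that shifted 29 px put the surface at roughly 0.5 m.
print(depth_from_disparity(29.0, 580.0, 0.025))  # ~0.5
```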

We wouldn’t be serious about AR if we hadn’t been dreaming about a rear-facing TrueDepth camera since day one. And it turns out our prayers will be answered sooner rather than later: persistent rumors from various credible sources claim that the 2019 (or 2020) iPhone will sport a rear-facing TrueDepth camera.

Sensing depth is great for unlocking your phone without entering a passcode, but it’s absolutely stellar when it comes to AR! Today’s AR tracks visual feature points across successive camera frames to triangulate their 3D position, which limits us in featureless environments or low-light conditions.

Tomorrow’s AR could theoretically abolish the need for recognizable patterns, ugly markers or even the presence of light itself. Maybe the coolest part for you, the user: you effectively get night vision on your phone.
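Why does a rear depth sensor matter so much for AR? Because every depth pixel can be lifted straight into a 3D point using only the camera intrinsics, with no feature matching and no need for texture or light. A minimal sketch (the intrinsics and array shapes are illustrative assumptions):

```python
import numpy as np

def unproject_depth_map(depth_m: np.ndarray, fx: float, fy: float,
                        cx: float, cy: float) -> np.ndarray:
    """Lift an H x W depth map (in meters) into an (H*W, 3) point cloud in the
    camera frame using the pinhole model: x = (u - cx) * z / fx, and so on.
    Unlike visual feature tracking, this works on blank walls and in the dark."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Illustrative call with a flat 4 x 4 depth map and made-up intrinsics.
points = unproject_depth_map(np.full((4, 4), 0.5), fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(points.shape)  # (16, 3)
```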


HoloLens 2

Three years after the original HoloLens launched, Microsoft announced its successor last weekend. The HoloLens has played a big role in the short history of consumer-grade augmented reality, but according to what’s currently known, Microsoft will primarily focus on enterprise customers with the HoloLens 2.

We could talk for hours about the improved software in the new HoloLens, but the hardware got an upgrade too. The combination of its lasers, mirrors and waveguides creates a brighter display with a wider field of view, one that doesn’t have to be aimed precisely into your eyes to work.

The cameras that scan the room have improved as well, giving the device a semantic understanding of its environment. The new HoloLens knows what a table is, what’s behind the table and which gestures you’re using to do what you want to do. This combination of AR and AI is something we’re really looking forward to, so make sure to check out the Shift podcast in which our AI Lead and AR Lead discuss the possibilities of this evolution!

Discover our Roadmap to AR

Find out how augmented reality can improve your business with the help of our Roadmap to AR

Learn more