News September 13, 2017

AR is taking off: our experts’ thoughts on the Apple Keynote

Natalie Hendrickx

Content Marketeer

With the usual pomp and circumstance, Apple unveiled its new generation of products in the brand-new Steve Jobs Theater last night. What especially piqued our interest are the expanded possibilities of augmented reality (AR) and machine learning (ML). We’re already working on AR products for several clients, so we’re excited about the new possibilities. This is what our experts conclude from last night’s unveilings.

Kenny Deriemaeker, AR Competence Lead

“Last night’s announcements show that AR is a long game for Apple. We were wondering whether Apple would add sensors that could scan and recognize the environment around you, like Google Tango can. This would open up a whole new class of location-aware AR applications. But Apple has clearly chosen not to do that (yet), which is a strong statement about their AR strategy in the coming years.

Apple is confidently taking a big step towards the ultimate AR vision

Rather than going all-in with new hardware and sensors on the iPhone to expand the capabilities of ARKit, Apple is making serious investments to get everything it can out of the cameras and motion sensors we already have. In addition, it is counting on software innovations like machine learning to enable really interesting AR applications in the coming years.

The ultimate AR vision (where we get a helpful, context-aware digital layer seamlessly projected on top of the world anywhere and anytime) is still several product cycles away, though. It might not even come in the form of a smartphone. But with these new iPhones and iOS 11, Apple is confidently taking a big step towards that future.”

Jan Deruyck, Director of Sales & Marketing

“Up until now, AR has not been much more than a gimmick. But today the power of this technology can finally be fully leveraged. Why? Because it is built into an everyday device like the iPhone, together with powerful software, state-of-the-art cameras and machine learning algorithms.

Augmented Reality will finally go mainstream

The list of possibilities for consumers and companies is endless:

  • Redecorating? Map out your house on your iPhone. You’ll immediately see how much paint you need for the kitchen.

  • Going to a big DIY store for that paint? They might have a handy indoor navigation app, because those have a decent ROI now.

  • Are you a paint manufacturer and does your machinery need maintenance? Intelligent AR manuals could help your engineers. Heck, you might not even need an engineer to find and fix problems anymore.  

In The Pocket is already working hard with retail, fashion and enterprise clients to ship amazing AR products. The first ones will be revealed around the time the iPhone 8 and X are available.”

Quentin Braet, Solution Architect

“The real change for the iPhone ecosystem is in the ARKit and Core ML frameworks (already available on existing iPhones). They enable developers to easily integrate AR experiences or use state-of-the-art ML algorithms without having to worry about how to implement these complex pieces of software. The focus of an app developer can remain on the application features.

Integrating AR experiences and using machine learning will be easier and quicker

But is this unique to the iPhone? Certainly not. This time, Google isn’t miles behind like it was in 2010, and that is actually a good thing for the evolution of this field. I’m sure we will see a lot of new use cases for these frameworks that we’re not yet aware of. Face ID is just one example: you can have your doubts about its usefulness, but there will soon be cases where those doubts vanish.”

Thomas Mons, Director of Engineering

“Forget Animojis, Face ID and the OLED display. For me, yesterday’s event was all about the new hardware and the possibilities it introduces into the ecosystem. In June, Apple already presented iOS 11, with ARKit and Core ML as its most significant features. The new hardware announced yesterday enables new capabilities for both frameworks and, perhaps more importantly, confirms Apple’s vision that a smart device should be “smart” on its own and not rely on a “smart cloud”.

New hardware will make your smart device less dependent on the cloud

That philosophy has recently been gaining popularity as “fog computing”, a metaphor for processing that happens “closer to the ground”. The advantages of this approach include faster response times, less network traffic, improved security and better privacy. But in order for this to work, a device needs processing power, and lots of it. And guess what Apple just announced? Exactly: a whole new generation of CPU/GPU (the A11 Bionic) that Apple designed itself, and that, if early benchmarks are to be believed, might completely blow away the competition.”

AR Discovery Sprint

AR for your company? A short and effective AR Discovery Sprint will inspire you and help you determine what's viable. 

Discover VR