Innovation August 27, 2018

AI will make UI so smart it will eventually disappear

Sebastiaan Van den Branden

Data Scientist

Today artificial intelligence (AI) is already capable of improving user experience (UX) in many different ways. Technologies like computer vision, natural language processing and recommendation engines are used by companies like Google, Netflix and Spotify to predict and nudge user behaviour, make translations, provide traffic guidance, streamline shopping, etc. Tomorrow, it will streamline our lives in nearly every way.


No need to heat (manually)

These examples are just a snapshot in a developing technology, but they are all trying to deliver the same vision: anticipating user needs, facilitating the steps to solve them, and ultimately even eliminating those steps or actions altogether. The ultimate user experience is one where the user doesn’t have to do anything, and everything happens automagically.

An early example of a product that nailed this vision was Nest. Before Nest, heating your house meant controlling the thermostat manually or relying on a pre-programmed schedule. Manual control is wasteful: when you turn the thermostat on, it takes a while to heat your living space, and when you turn it off and leave, the remaining heat in the room is lost. Google Nest combines predictive models with sensors that detect your presence in the room, giving the device the ability to predict your personal heating needs. Your original intent of turning on the heating is met before you even feel the need to express it, because the house is already at the right temperature by the time you get home. An added benefit is that it can save you on heating costs. It’s an experience where the user doesn’t have to do anything, albeit for a very focused use case, but more on that later.
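As a rough illustration (this is not Nest’s actual algorithm, just a naive sketch of the idea), a preheating decision could boil down to predicting your arrival time from past behaviour and starting the heater one heat-up window ahead of it:

```python
from statistics import mean

def should_preheat(now_minutes, arrival_history, heatup_minutes=30):
    """Decide whether to start heating now so the room is warm on arrival.

    now_minutes:     current time, in minutes after midnight
    arrival_history: past arrival times, in minutes after midnight
    heatup_minutes:  how long the room needs to reach the target temperature
    """
    # Naive "model": predict the next arrival as the average of past arrivals.
    predicted_arrival = mean(arrival_history)
    # Start heating once we are inside the heat-up window before that arrival.
    return 0 <= predicted_arrival - now_minutes <= heatup_minutes

# Occupant usually gets home around 18:00 (1080 minutes after midnight).
history = [1075, 1080, 1085, 1082]
```

A real product would of course use richer features (day of week, location, weather) and a learned model rather than a plain average, but the shape of the problem is the same: turn sensor history into a prediction, then act before the user has to.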


We can project this mindset onto other existing products. Spotify, YouTube and Netflix are putting lots of resources into predicting your taste and behaviour. Spotify builds playlists of music you like for different moods, plus a personal Discover Weekly playlist. YouTube deploys advanced recommendation engines to surface related videos as accurately as possible and keep you on the platform as long as it can. Netflix gives me a row of ‘best choices for Sebastiaan’, hoping its predictions hit the mark. Still, this is just one step in the right direction. The real reason people open Spotify is that they want to hear the right music at a certain moment. People open YouTube and Netflix because they want to be entertained by movies and video at the right time in the right place.
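Production recommendation engines are far more sophisticated, but the core idea behind many of them can be sketched in a few lines: represent each user’s taste as a vector of ratings and measure how similar two users are, so that one user’s favourites can be recommended to a lookalike. A minimal cosine-similarity sketch (the user names and ratings are made up for illustration):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Similarity between two rating vectors: 1.0 = identical taste."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

# Ratings (0-5) that two hypothetical users gave to the same three videos.
alice = [5, 3, 0]
bob   = [4, 3, 1]

# A high similarity suggests videos Alice loved are good candidates for Bob.
similarity = cosine_similarity(alice, bob)
```

Real systems add implicit signals (watch time, skips), matrix factorisation or deep models, and ranking layers on top, but they all start from this question: whose behaviour does yours resemble?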

In short, whenever we express the need for entertainment, these platforms do everything they can to eliminate choice and the stress that comes with it. They want to cater to our needs. In the future, they will go further and solve our needs before we have even expressed them.

We can imagine a future where you enter your living room and the TV turns on at just the right moment, queued up with the show you didn’t even know you were excited to start watching. The same goes for Spotify: you put on your headphones and the right music for your mood starts playing, just like Nest does the work for you.

This doesn’t just apply to entertainment, but to any vertical, really. Let’s look at shopping. When we shop on Amazon, its cutting-edge recommendations already make it very easy to find what we need, but we still have to search for it ourselves. Wouldn’t it be much easier if retailers anticipated our needs and predicted when we will need certain stock-up items? Our weekly grocery shopping, for one, could become a thing of the past. You can bet Amazon and others are working on making this happen.
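Predicting when a household will run out of a stock-up item is, at its simplest, a matter of learning the purchase rhythm from order history. A deliberately naive sketch (real retailers blend in seasonality, household size and many other signals):

```python
from datetime import date, timedelta
from statistics import mean

def predict_next_purchase(purchase_dates):
    """Estimate the next purchase date of a stock-up item.

    purchase_dates: past purchase dates in chronological order.
    Naive model: project the average interval forward from the last purchase.
    """
    intervals = [(b - a).days for a, b in zip(purchase_dates, purchase_dates[1:])]
    return purchase_dates[-1] + timedelta(days=round(mean(intervals)))

# A customer who buys coffee roughly every week.
orders = [date(2018, 8, 1), date(2018, 8, 8), date(2018, 8, 15)]
```

With a prediction like this, the retailer can prompt (or even ship) the item just before it runs out, removing the search step entirely.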

Using AI to create the ultimate UX

By collecting more (user) data, machine learning models will become more accurate, and there will be plenty of data. We can imagine a future where we capture even more biometric data than wearables can today, to predict our moods and intents. Connected homes and smart cities with a growing number of sensors will bring a wealth of additional data points. With that, models will be able to tackle complex prediction problems, beyond ‘simple’ ones like ‘should the heater be on or off’.

More data and better models will finally allow us to deliver on the ultimate vision: solving needs without users having to navigate apps and interfaces to meet their intent.

Why aren’t we at this point yet? Complex, multilayered problems are harder to solve than simple, concrete ones. Predicting whether a thermostat should be on or off is far more straightforward than accurately predicting someone’s mood or preferences, for instance. Solving complex prediction problems requires more data and computational resources, as well as skilled developers. Still, new off-the-shelf machine learning models, pre-processing techniques and research papers pop up every day, so the technology is growing rapidly alongside the fast evolution of available computing power.


At In The Pocket we incorporate AI into our products to empower users with great new features, but also to eliminate tedious steps along the way. We don’t apply AI for the sake of AI, because technology is a tool, not the reason. By eliminating or facilitating certain steps in a process, we can improve ease of use, accuracy and speed to bridge the gap between intent and result.

Scanning a bank card in the Bancontact app instead of manually typing the card number, advanced recommendation engines that predict your shopping list for your next supermarket visit, and support apps built with computer vision are just some of the innovations we can build into consumer products. We see the same in professional contexts, like quality control: imagine using computer vision to automatically inspect every shipped item with very high accuracy, instead of manually inspecting samples and risking that faulty products slip through. In healthcare, we’re facilitating the documentation of medical cases through automatic recognition of body parts.

It is important to keep searching for obstacles in UX, making small improvements, and eliminating steps by incorporating artificial intelligence into your products. But we can also think bigger: how could we eliminate these steps altogether and meet the original intent of the user?

Eager to unlock the potential of AI for your company?

In just a few weeks’ time, we pinpoint the most viable AI opportunity for your business, build a working prototype, and test and validate it.

Learn more