Finding your way is not always easy, whether in a new city or in an airport you have never visited before. One of the tools that can help you with this is augmented reality.
We started experimenting with AR Wayfinding a year ago, when we realized augmented reality is the perfect tool to guide people in and around buildings. It’s more intuitive than reading maps and offers more possibilities when combined with a Building Information Model (BIM) and other data.
After we developed a proof of concept of our AR Wayfinding tool, we integrated it into an online platform for Telenet and an internal application.
This is how we do it
This might get a bit technical, but we always love to explain how we tackle projects like these. The key to building an AR Wayfinding tool is three reference points that exist in both the real world and the digital world, so that the two worlds can be aligned on top of each other. The main reference point acts as our marker; the second reference point is the floor directly underneath it. In the most basic setup, the main reference point is a QR marker, and by measuring how high that marker hangs above the floor, we know the second reference point. The last point is the user’s device. Since the device is used to scan the QR marker, it’s easy to figure out where the device is relative to the other two points.
We then recreate these points in the digital world. We import the building’s blueprints into Unity, find the spot where the physical marker hangs, and place a digital marker there. Just as in the real world, the second reference point is the floor: we move the digital marker along the Y-axis by the same number of units as the height we measured in real life. Last but not least, the device is represented by a Unity Camera.
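To make the setup concrete, here is a minimal sketch of the three reference points in a Y-up coordinate system like Unity’s. This is not code from our actual tool; the marker height, positions and variable names are made-up examples.

```python
import numpy as np

# Real-world measurement: how high the QR marker hangs above the floor
# (hypothetical value, in metres).
marker_height = 1.4

# Reference point 1: the digital marker, placed where the blueprint
# says the physical marker hangs (x, y, z in metres).
digital_marker = np.array([12.0, marker_height, 3.5])

# Reference point 2: the floor, directly underneath the marker,
# obtained by moving down along the Y-axis by the measured height.
digital_floor = digital_marker - np.array([0.0, marker_height, 0.0])

# Reference point 3: the device (the Unity Camera). Its real pose is
# recovered when it scans the QR marker; here it gets a placeholder.
device = np.array([11.0, 1.6, 5.0])

print(digital_floor)  # shares x and z with the marker, y is 0
```

The floor point carries no new measurement of its own: it is fully determined by the marker’s position plus the measured height, which is why only one real-world measurement is needed.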
Here comes the mathematical magic. When the user scans the QR marker, the app knows that during that frame it can align the real and digital worlds. The scan gives us the camera intrinsics of the device, which we use to build a conversion matrix that adjusts the position and rotation of our digital reference points to match the real ones. After that, both worlds are aligned!
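As a rough illustration of that alignment step, the sketch below builds a conversion matrix from two 4x4 pose matrices: the digital marker’s pose in the Unity world, and the real marker’s pose as seen by the device during the scan. Because both matrices describe the same physical marker, chaining one with the inverse of the other maps digital coordinates into the real frame. All poses and numbers here are invented for the example; in the real app the second pose is derived from the camera intrinsics.

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Pose of the digital marker in the digital (Unity) world.
digital_marker_pose = pose_matrix(np.eye(3), [12.0, 1.4, 3.5])

# Pose of the real marker as seen by the device when scanning
# (a made-up example; really computed from the camera intrinsics).
real_marker_pose = pose_matrix(np.eye(3), [0.1, -0.2, 1.0])

# Conversion matrix: maps digital coordinates into the real frame,
# because both poses describe the same physical marker.
conversion = real_marker_pose @ np.linalg.inv(digital_marker_pose)

# Sanity check: the digital marker should land exactly on the real marker.
digital_point = np.append(digital_marker_pose[:3, 3], 1.0)
aligned = conversion @ digital_point
```

Applying `conversion` to every digital reference point (or, equivalently, to the Unity world root) is what puts both worlds on top of each other for that frame.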
There will still be some offset, because the intrinsics we get from the camera aren’t always perfect: they are 3D calculations derived from a flat image. To keep things as accurate as possible, we apply one more adjustment while the user is navigating with the AR Wayfinding tool: we continuously align the Y-axis of the digital world with the gravity sensor in the device.
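That gravity correction can be sketched as a small corrective rotation that turns the digital world’s slightly drifted up-axis back onto the up direction reported by the gravity sensor. Everything below, from the vectors to the helper function, is an illustrative assumption rather than our production code.

```python
import numpy as np

def rotation_between(a, b):
    """Rotation matrix turning unit vector a onto unit vector b
    (Rodrigues' formula; assumes a and b are not exactly opposite)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)
    c = np.dot(a, b)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

# "Up" according to the slightly misaligned digital world...
digital_up = np.array([0.05, 0.99, -0.02])
# ...and "up" according to the gravity sensor (negated gravity vector).
sensor_up = np.array([0.0, 1.0, 0.0])

# Small corrective rotation applied to the digital world.
correction = rotation_between(digital_up, sensor_up)
corrected_up = correction @ (digital_up / np.linalg.norm(digital_up))
```

Repeating this while the user walks keeps the vertical axis of the digital world locked to real gravity, so small errors from the scan don’t accumulate into a visibly tilted overlay.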
Using augmented reality to develop wayfinding tools has numerous advantages.