Our takeaways from WWDC 2023
Hate it or love it, you can’t escape the buzz. Last night, Apple’s highly anticipated WWDC keynote took place, and we took the liberty of recapping the most important highlights of the event. From iOS 17 tweaks to the integration of AI, and finally, the unveiling of the Apple Vision Pro. Join us as we separate the gimmicks from the good stuff.
First things first: iOS 17
Let’s start off with the least exciting stuff. Apple platforms, including iOS, iPadOS, and macOS, have matured into a maintenance phase, mirroring the trajectory of the iPhone. This year's software update focuses on refining existing features rather than introducing flashy new ones.
iOS 17 brings enhancements such as contact posters for calls, live voicemails, a check-in safety feature, audio message transcripts, StandBy (a kind of smart clock display when your phone is charging), interactive widgets, AirDrop improvements, a journalling app, enhanced autocorrect and text prediction, and a new content safety framework for child protection on iPhones. Not bad, but not thrilling.
More notable: widgets now assume a more prominent role by becoming interactive and appearing on the iPad lock screen, in the new StandBy mode, and even on the Mac desktop, providing relevant information throughout the day. If you're an app developer, now is the time to incorporate widget support, seamlessly integrating your app experience into the device.
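For developers who haven't touched widgets yet, the entry point is WidgetKit: you describe a timeline of entries and a SwiftUI view, and the system renders and refreshes it for you. Here's a minimal sketch; the `CheckInWidget` name, the entry type, and the strings are hypothetical placeholders, not anything Apple shipped.

```swift
import WidgetKit
import SwiftUI

// Hypothetical model for a single widget snapshot.
struct CheckInEntry: TimelineEntry {
    let date: Date
    let status: String
}

// Supplies placeholder, snapshot, and timeline data to WidgetKit.
struct CheckInProvider: TimelineProvider {
    func placeholder(in context: Context) -> CheckInEntry {
        CheckInEntry(date: .now, status: "On my way")
    }
    func getSnapshot(in context: Context, completion: @escaping (CheckInEntry) -> Void) {
        completion(placeholder(in: context))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<CheckInEntry>) -> Void) {
        let entry = CheckInEntry(date: .now, status: "On my way")
        // Ask for a refresh in about an hour; WidgetKit decides the exact schedule.
        completion(Timeline(entries: [entry], policy: .after(.now.addingTimeInterval(3600))))
    }
}

struct CheckInWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "CheckInWidget", provider: CheckInProvider()) { entry in
            Text(entry.status)
        }
        .configurationDisplayName("Check-In")
        .supportedFamilies([.systemSmall])
    }
}
```

With iOS 17's interactivity, the view body can additionally embed buttons and toggles backed by App Intents, so taps perform actions without launching the app.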
AI at work, behind the scenes
Unlike other tech company presentations, Apple didn't hype AI as a feature but rather integrated it seamlessly into new and improved functions. It was refreshing to see AI actually improve the user experience instead of just creating the next ChatGPT or Bard-powered chatbot.
Examples include contact posters and Live Stickers that use foreground object extraction, text transcription for voice messages and voicemails, an article reader in Safari, AR effects based on hand gestures, improved autocorrect with sentence suggestions, Sensitive Content Warnings for secure viewing, adaptive listening mode for AirPods, and grocery sorting in the Reminders app.
But, in our opinion, three more remarkable examples stood out:
- Personal Voice: An accessibility feature that recreates your voice for calls by typing messages. This will be extremely helpful for users with a speech disability, for example.
- Auto-detection and auto-fill of verification codes: Seamlessly identifies and fills in verification codes from text messages and emails, enhancing convenience and security. Also, the option to automatically delete messages containing verification codes further safeguards your login information.
- Visual Lookup: Apple's alternative to Google Lens now detects recipes from food photos and allows object identification in videos, expanding your culinary exploration and knowledge. Additionally, the Photos app creates automatic albums for pets, extending its automatic album creation beyond human subjects.
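The verification-code autofill, by the way, already has a developer-side hook: tagging a text field with the `oneTimeCode` content type tells the system it may offer a detected code above the keyboard. A minimal UIKit sketch (the field itself is just an example):

```swift
import UIKit

// Marking a field as a one-time-code target lets the system
// suggest verification codes it detects in Messages (and now Mail).
let codeField = UITextField()
codeField.textContentType = .oneTimeCode
codeField.keyboardType = .numberPad
```

If your app asks users to paste codes manually, this one-line change is usually all it takes to opt into the autofill flow.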
These AI-powered features aim to improve the user experience without slapping a big AI label on top of them. They should be invisible, yet very helpful, while using your iPhone throughout the day.
The elephant in the room: VisionOS
Now onto the biggest announcement of the night: the brand-new Apple Vision Pro headset, powered by visionOS, the first operating system designed specifically for spatial computing. This headset combines various technologies developed by Apple, such as ARKit, spatial audio, cross-platform apps, SwiftUI, and their Vision framework for hand and gesture detection.
The Apple Vision Pro promises a seamless blend of virtual and augmented reality experiences, easily switchable with a simple twist of a dial. Apple refers to this as spatial computing, avoiding traditional AR, VR, or MR terminology.
Also important: Apple positions the Vision Pro as a new computer that has the potential to replace your current iPhone, iPad, or Mac. They don’t want to make it just a game or entertainment device, but rather a new way of computing, which might be where Apple’s future is heading.
What sets Apple apart from competitors is its focus not only on the wearer's experience but also on how others can engage with them. Through its EyeSight display, the Vision Pro shows the wearer's eyes to people nearby, so bystanders can tell when the wearer sees them, making interaction feel less isolating.
Another remarkable (or familiar?) feature is the creation of an authentic digital representation, known as your Persona, which stands in for the user's appearance during video conferences. This lifelike representation mimics facial expressions and hand gestures, delivering a more engaging communication experience.
While the Apple Vision Pro will initially be available in the US starting early 2024, with more countries to follow later in the year, developers can begin their development journey in the coming months, as Apple will provide developer kits.
Conclusion
In summary, Apple's WWDC 2023 showcased a mix of incremental updates with one groundbreaking innovation. iOS 17 focused on refining existing features, while AI integration improved user experience. The centrepiece was the Apple Vision Pro headset. It aims to redefine computing, but it’s way too soon to give it credit for that. But one thing is certain: developers, prepare yourselves for what’s coming!