Apple wants to hire someone to help it deliver “the next paradigm of user interfaces and entirely new interaction models.” What might such user interfaces do, and why will we use them?
When you think of 3D user interfaces (UIs), it’s hard not to think of Minority Report and that famous scene in which Tom Cruise flips through hundreds of holographic images using touch and gesture to manipulate them.
(Crosscom founder Don Shin recently created a proof-of-concept solution that does something similar).
In light of Apple’s job ad, the years of work it has been doing on the development of AR glasses, and the evolution of ARKit, it feels like a good time to consider how these technologies (augmented reality, virtual reality and machine intelligence) could enter daily life.
It's important not to underestimate the mission. Apple’s own history suggests that when it comes to developing a 3D UI in AR, the company will want to develop something that defines the category as effectively as its Mac, iPhone and iPad platforms have done.
To achieve that, it must figure out how to create a platform that can evolve rapidly over time, unleash waves of third-party innovation, and enable the creation and consumption of user experiences that provide tangible benefits to both consumer and enterprise users.
When it comes to AR (and VR), attention always seems focused on games. Gaming is a highly competitive space, and while gaming experiences can be truly amazing, game platforms tend to have short lives. Apple wants a platform that can evolve and grow over time. So, it won't be playing games.
What sort of benefits can 3D UI in AR provide?
As an example of an interesting AR/3D UI solution, I often return to the demo video showing a product called the WayRay Navion in action.
This AR navigation solution overlays GPS navigation instructions onto the road ahead, and it also gathers and shares driver statistics to help you stay in control on the road. It's not hard to imagine it translating street signs in the future.
So, how might you use something like this in daily life? I recently wrote about an app called Magnus. This app tries to recognize any piece of art it is pointed at to provide artist and pricing information. That kind of concept – using machine intelligence to learn more about stuff – should be familiar to anyone who uses Shazam music recognition. You’ll also find it in apps like Sky View, Night Sky and many more.
The principle that you can access more information about anything you happen to point your camera at unlocks all kinds of educational opportunities.
Want to know the correct way to care for a house plant? Someone will build an app for that. Are you a field technician staring at a broken machine and need the tech support manual? They are already building apps for that.
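The "point your camera at a thing, learn about it" pattern behind Magnus, Shazam-style recognition and these hypothetical apps boils down to on-device image classification feeding a domain database. Here's a minimal sketch of that idea using Apple's Vision framework; the confidence threshold and function name are illustrative, not taken from any shipping app:

```swift
import Vision
import CoreGraphics

/// Classify whatever the camera (or a photo) is pointed at and return
/// the top few labels with their confidence scores. A real app would
/// feed each label into its own database (artworks, house plants,
/// repair manuals) to fetch the details the user actually wants.
func topLabels(for image: CGImage, count: Int = 3) -> [(label: String, confidence: Float)] {
    // Built-in Vision classifier (iOS 13 / macOS 10.15 and later).
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
    } catch {
        return []
    }
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.2 }   // drop low-confidence guesses
        .prefix(count)
        .map { ($0.identifier, $0.confidence) }
}
```

Swap the built-in classifier for a custom Core ML model and the same loop becomes the plant-care or tech-support recognizer described above.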
Towards a humanistic AI
Do you use Photos? Have you gone through your collection and named your People so Photos can find images of them? Have you used Face ID? These solutions quite clearly show that Apple’s technology can already recognize you and people close to you.
Siri co-founder Tom Gruber talked about how “humanistic AI” can enhance our lives during his 2017 TED Talk. He asked: “What if you could have a memory that was as good as computer memory and was about everybody you ever met?”
“Augmented reality will take some time to get right, but I do think that it's profound,” Apple CEO Tim Cook said in 2016. “We might ... have a more productive conversation, if both of us have an AR experience standing here, right?”
If you can point a camera at a piece of art to find out everything about it from an app’s database, what happens if you point your camera at someone you’ve met only once as it accesses your own private and personal database?
It’s not impossible to imagine a 3D UI providing an overlay that tells you that person’s name, when you met, and even what you last discussed. Think of it as a WayRay Navion for your daily life. (And, of course, it could go way beyond that; Chinese law enforcement already uses facial recognition sunglasses.)
Now take a look at Apple’s job ad:
“Apple’s UI frameworks define the look and feel of our software and products. You will have the opportunity to build software that directly impacts the way both developers and customers interact with our products. You will work with some of Apple's most advanced technologies including the Augmented Reality (AR) and Virtual Reality (VR) support offered in ARKit and Metal 2. Work closely with human interface designers and internal clients to define and deliver the foundation for the next paradigm of user interfaces and entirely new interaction models.” (Italics mine).
Apple isn’t in the business of playing games, it’s in the business of building platforms. ARKit will be an essential component of future platforms, not a feature. Now, where did I put those spectacles?