In iOS 16, apps can trigger real-world actions hands-free – TechCrunch

New features in iOS 16 will allow apps to trigger real-world actions hands-free. That means users will be able to do things like start music playing just by walking into a room, or turn on an e-bike for a workout just by stepping onto it. Apple told developers today in a session hosted during the company’s Worldwide Developer Conference (WWDC) that these hands-free actions can be triggered even if the iOS user isn’t actively using the app at the time.

The update, which relies on Apple’s Nearby Interaction framework, could lead to some interesting use cases where the iPhone becomes a way to interact with objects in the real world, if developers and accessory makers decide to adopt the technology.

During the session, Apple explained how apps can already connect and exchange data with Bluetooth LE accessories, even when they’re running in the background. Building on that, in iOS 16 apps can also start a Nearby Interaction session with a Bluetooth LE accessory that also supports Ultra Wideband while in the background.
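As a rough illustration of how that flow might look in code, here is a minimal sketch of an accessory session that can keep ranging in the background. It assumes the accessory has already delivered its UWB configuration blob over a Bluetooth LE characteristic, as in Apple’s accessory specification; the iOS 16 initializer that ties the session to a Bluetooth peer identifier is written here as described in the session, so treat the exact names as assumptions against the shipping SDK.

```swift
import CoreBluetooth
import NearbyInteraction

// Sketch of a background-capable accessory session. Assumes the accessory
// has already delivered its UWB configuration blob over a Bluetooth LE
// characteristic, per Apple's accessory specification.
final class AccessorySessionManager: NSObject, NISessionDelegate {
    private var niSession: NISession?

    // Call once the accessory's configuration data has arrived over BLE.
    func startSession(accessoryConfigData: Data, peripheral: CBPeripheral) {
        do {
            // The iOS 16 initializer ties the session to the Bluetooth peer,
            // which is what allows ranging to continue in the background.
            let config = try NINearbyAccessoryConfiguration(
                accessoryData: accessoryConfigData,
                bluetoothPeerIdentifier: peripheral.identifier
            )
            let session = NISession()
            session.delegate = self
            session.run(config)
            niSession = session
        } catch {
            print("Could not create accessory configuration: \(error)")
        }
    }

    // The phone generates its own shareable configuration, which the app
    // must send back to the accessory over the same BLE link (omitted here).
    func session(_ session: NISession,
                 didGenerateShareableConfigurationData data: Data,
                 for object: NINearbyObject) {
        // Placeholder: write `data` to the accessory's BLE characteristic.
    }

    // Distance updates keep arriving while the app is in the background,
    // which is what enables "walk up to it and it turns on" experiences.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        if distance < 1.0 {
            // Trigger the real-world action, e.g. unlock the e-bike or start playback.
        }
    }

    func session(_ session: NISession, didInvalidateWith error: Error) {
        niSession = nil
    }
}
```

For this to work outside the foreground, a real app would also declare the `bluetooth-central` background mode and a Nearby Interaction usage description in its Info.plist; the CoreBluetooth code that discovers the accessory and reads `accessoryConfigData` is omitted above.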

In this regard, Apple has updated the specification for accessory manufacturers to support these new background sessions.

This paves the way for a future where the line between apps and the physical world is blurring, but it remains to be seen if third-party app and device makers decide to embrace the functionality.

The new feature is part of a broader update to Apple’s Nearby Interaction framework that was the focus of the developer session.

Introduced at WWDC 2020 with iOS 14, this framework allows third-party app developers to tap into the U1, or Ultra Wideband (UWB), chip on iPhone 11 and later devices, Apple Watch, and other third-party accessories. It’s what today powers the Precision Finding capabilities offered by Apple’s AirTag, which let iPhone users open the Find My app and be guided to the exact location of their AirTag using on-screen directional arrows, along with other cues that tell them how far away they are from the AirTag or whether it may be on a different floor.

With iOS 16, third-party developers will be able to create apps that do almost the same thing, thanks to a new feature that allows them to integrate ARKit – Apple’s augmented reality developer toolkit – with the Nearby Interaction framework.

This allows developers to take advantage of the device’s trajectory calculated by ARKit, so their apps can also intelligently guide a user to a misplaced item or other object that a user might want to interact with, depending on the functionality of the app. By using ARKit, developers get more consistent range and direction information than if they were just using Nearby Interaction.
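To make the ARKit integration concrete, below is a minimal sketch of how a peer-to-peer session might opt into camera assistance and read an object’s position in ARKit world coordinates. The `isCameraAssistanceEnabled` flag and `worldTransform(for:)` call reflect the iOS 16 additions described in the session, so the exact names should be treated as assumptions against the shipping SDK.

```swift
import NearbyInteraction
import simd

// Minimal sketch of camera assistance in a peer-to-peer session; the same
// flag is also available on the accessory configuration shown earlier.
final class FinderSession: NSObject, NISessionDelegate {
    private let niSession = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        let config = NINearbyPeerConfiguration(peerToken: peerToken)
        // New in iOS 16: let ARKit's device trajectory feed the UWB ranging
        // solution for steadier distance and direction estimates.
        config.isCameraAssistanceEnabled = true
        niSession.delegate = self
        niSession.run(config)
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        // With camera assistance, the session can express the object's
        // position in ARKit world coordinates, ready for AR guidance.
        if let transform = session.worldTransform(for: object) {
            let position = simd_make_float3(transform.columns.3)
            print("Nearby object at world position \(position)")
        }
    }
}
```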

However, the functionality doesn’t have to be exclusive to AirTag-like accessories made by third parties. Apple demonstrated another use case where, for example, a museum could use ultra-wideband accessories to guide visitors through its exhibits.

Additionally, this feature can be used to overlay directional arrows or other AR objects over the camera’s view of the real world to guide users to the ultra-wideband object or accessory. Continuing the demo, Apple briefly showed how red AR bubbles could appear on the app’s screen over the camera view to show the way.
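As a rough illustration of that overlay, the RealityKit snippet below drops a simple red marker at the world transform obtained above, so it renders over the camera feed much like the bubbles in Apple’s demo. The `ARView` instance, the `placeMarker` helper name, and the transform handoff are assumptions for the sake of the example; Apple’s demo presumably did something more elaborate.

```swift
import RealityKit
import UIKit
import simd

// Drops a simple red marker at the accessory's estimated world position so
// it shows up over the camera feed. The ARView and the transform (from
// worldTransform(for:) above) are assumed to come from elsewhere in the app.
func placeMarker(in arView: ARView, at transform: simd_float4x4) {
    let anchor = AnchorEntity(world: transform)
    let bubble = ModelEntity(
        mesh: .generateSphere(radius: 0.05),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    anchor.addChild(bubble)
    arView.scene.addAnchor(anchor)
}
```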

Longer term, this functionality lays the groundwork for Apple’s rumored mixed reality smart glasses, where presumably AR-based apps would be at the core of the experience.

The updated functionality is rolling out to beta testers with the iOS 16 software update, which will be released to the general public later this year.
