Apple is currently advancing at least three hardware products with the potential to disrupt the industry, aiming to reshape the human-computer interaction ecosystem through visual intelligence and holographic display technologies.

It is reported that the spatial iPhone, designated "H1" or "MH1", is expected to be released around 2030. Its core highlight is an AMOLED screen with an integrated nano-structured holographic layer; combined with eye tracking and diffractive beam-steering technology, it promises a 360-degree immersive experience without the need for glasses. On a nearer-term horizon, Apple's AI hardware strategy has entered an advanced testing phase.

One product, positioned as an AI keychain roughly the size of an AirTag, could be released as early as next year. Designed as an iPhone accessory, it omits a physical display and instead extends Siri's voice and visual interaction through an "always-on" camera and microphone. Meanwhile, the hardware design of the AI-powered AirPods Pro is nearly complete; their camera is intended mainly to capture visual data for AI processing rather than to record video. Although the release has been delayed by compatibility issues with Apple Intelligence, the product is still expected to go on sale by the end of this year, following the September update to Siri's visual intelligence features.
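To make the "visual data for AI, not video" idea concrete, here is a minimal, purely hypothetical sketch of how a single camera frame could be reduced on-device to a few text labels before being handed to an assistant layer. It uses Apple's public Vision framework; the `describeFrame` helper, the confidence cutoff, and the label count are illustrative assumptions, not details of any announced Apple product.

```swift
import Vision
import CoreVideo

// Hypothetical sketch: turn one camera frame into a few text labels
// on-device so an assistant can use them as lightweight visual context.
// The helper name, 0.3 cutoff, and 5-label cap are assumptions for
// illustration, not part of any Apple design.
func describeFrame(_ pixelBuffer: CVPixelBuffer) -> [String] {
    let request = VNClassifyImageRequest()                // built-in image classifier
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    do {
        try handler.perform([request])                    // runs entirely on-device
    } catch {
        return []                                         // on failure, pass no context
    }
    // Keep only the most confident labels; raw pixels never leave this stage.
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .prefix(5)
        .map { $0.identifier }
}
```

The point of such a pipeline is that only compact, semantic context (labels rather than footage) leaves the vision stage, which is consistent with the report that the camera is not meant for video recording.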

This series of moves reflects Apple's transition from traditional screen-based interaction to "seamless visual perception." By embedding AI vision modules into earphones and keychains, Apple aims to build a closed-loop system spanning spatial computing and wearable intelligence. This not only gives the Apple Intelligence ecosystem a hardware complement but also signals that the consumer electronics industry is accelerating into a new era of real-time multimodal interaction.