Meta has unveiled its Aria Gen 2 smart glasses, a major step forward in AI-powered wearable technology. The glasses are aimed at advancing research in spatial computing, robotics, and machine learning. Though not intended for commercial sale, they give researchers a real-world testbed for developing next-generation interaction systems.
Aria Gen 2 features four computer vision cameras built on global-shutter sensors with 120 dB of HDR capability, alongside a 12MP RGB camera. These power real-time eye and hand tracking, blink detection, and pupil-diameter sensing. Meta has also widened the stereo overlap between the cameras from 35 degrees to 80 degrees, improving depth perception and spatial awareness.
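To picture what that eye-tracking data enables, here is a minimal, purely illustrative Python sketch. It assumes a hypothetical stream of per-frame eye samples; the field names and the openness threshold are invented for this example and are not part of any Aria SDK. The idea is simply that blink detection reduces to finding intervals where the eye stays closed.

```python
# Hypothetical sketch: flagging blinks from a stream of eye-openness samples.
# The EyeSample fields and the threshold are assumptions for illustration,
# not the actual Aria Gen 2 interface.
from dataclasses import dataclass
from typing import Iterable, List, Optional, Tuple


@dataclass
class EyeSample:
    timestamp_ns: int         # device capture time in nanoseconds
    eye_openness: float       # 0.0 = fully closed, 1.0 = fully open
    pupil_diameter_mm: float  # estimated pupil diameter


def detect_blinks(samples: Iterable[EyeSample],
                  closed_threshold: float = 0.2) -> List[Tuple[int, int]]:
    """Return (start_ns, end_ns) intervals where eye openness stays below
    the threshold, i.e. candidate blink events."""
    blinks: List[Tuple[int, int]] = []
    start: Optional[int] = None
    for s in samples:
        if s.eye_openness < closed_threshold:
            if start is None:
                start = s.timestamp_ns
        elif start is not None:
            blinks.append((start, s.timestamp_ns))
            start = None
    return blinks
```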
The glasses are lightweight at 74–76 g and come in eight face-fit sizes. Onboard sensors include an accelerometer, a GNSS receiver, a barometer, and an ambient light sensor, while an improved contact microphone and a PPG (photoplethysmography) heart-rate sensor sit in the nose pad.
At the core is Meta’s custom coprocessor, designed to run machine perception algorithms on-device. This reduces latency, improves security, and removes the dependence on the cloud. The glasses use visual-inertial odometry (VIO) for six-degrees-of-freedom (6DoF) tracking, supporting detailed environmental mapping.
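To make "6DoF tracking" concrete, the sketch below shows how successive pose estimates, each a 3D rotation plus a 3D translation, can be chained into a device trajectory of the kind a mapping pipeline builds. The helper function and the numbers are illustrative assumptions, not output from Aria's VIO system.

```python
# Minimal sketch of what six-degrees-of-freedom (6DoF) tracking produces:
# each update is a rigid-body pose (3D rotation + 3D translation), here
# represented as a 4x4 homogeneous transform. Values are illustrative only.
import numpy as np


def make_pose(yaw_rad: float, translation_xyz) -> np.ndarray:
    """Build a 4x4 pose from a yaw rotation (about the z-axis) and a translation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    pose = np.eye(4)
    pose[:3, :3] = np.array([[c, -s, 0.0],
                             [s,  c, 0.0],
                             [0.0, 0.0, 1.0]])
    pose[:3, 3] = translation_xyz
    return pose


# Chain incremental device poses into a trajectory, the way successive
# VIO estimates would be accumulated during environmental mapping.
increments = [make_pose(0.05, [0.10, 0.0, 0.0]) for _ in range(5)]
world_from_device = np.eye(4)
trajectory = []
for delta in increments:
    world_from_device = world_from_device @ delta
    trajectory.append(world_from_device[:3, 3].copy())

print(np.round(np.array(trajectory), 3))  # positions advance as the device moves
```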
Despite the technical leap, Aria Gen 2 will not be sold to the public. Instead, Meta is deploying it with research partners such as BMW, IIIT Hyderabad, and Carnegie Mellon to advance AI, robotics, and human-computer interaction (HCI) research. With Aria Gen 2, Meta is laying the foundation for intelligent, responsive spatial computing, a critical step toward smarter, more intuitive human-tech interaction.