The new AR Spectacles glasses incorporate a spatial engine that enables hand tracking

Snap has presented the fifth generation of Spectacles, its augmented reality (AR) glasses, redesigned to integrate four cameras that feed the Snap spatial engine, which tracks users' hands and their interaction with the environment, all powered by the new Snap OS operating system.

The technology company announced its latest developments at the Snap Partner Conference, where it stressed its intention to improve AR to make it easier to "experience the world together with friends in completely new ways."

Along these lines, Snap has presented its new fifth-generation AR Spectacles, which it describes as hardware that "breaks the limits of screens and brings people closer to the real world." The new model has been redesigned with lightness in mind: at 226 grams, the company says it weighs less than half as much as a typical virtual reality (VR) headset.

In addition, the glasses have been equipped with four cameras that feed the Snap spatial engine. With this technology, they can track the movements of users' hands "perfectly," allowing them to interact with the real world through gestures and AR.

As the company explained in a statement on its website, another new feature of the Spectacles is the optical engine, which has been built from scratch using Snap's patented technology to enable a transparent AR display.

To display images, the glasses integrate liquid crystal on silicon (LCoS) microprojectors. These microprojectors are "incredibly small" but, according to the company, powerful enough to create "sharp and vivid" images.

The microprojectors are paired with waveguides, which allow users to view the images created by the LCoS projectors "without the need for lengthy calibrations or custom adjustments." This is because each waveguide contains "billions of nanostructures" that direct light into the user's field of vision, combining AR images with the real world.

As a result, Snap has detailed that the optical engine offers a 46-degree diagonal field of view with a resolution of 37 pixels per degree, which it compares to viewing a 100-inch screen from just 3 meters away.
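
As a rough check of that comparison (assuming the 100-inch figure refers to the screen diagonal, i.e. about 2.54 meters), the diagonal angle subtended by such a screen viewed from 3 meters is $2\arctan\!\left(\frac{2.54/2}{3}\right) \approx 2 \times 22.9^\circ \approx 46^\circ$, which matches the stated 46-degree diagonal field of view; at 37 pixels per degree, that works out to roughly 1,700 pixels along the diagonal.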

Snap has also highlighted that the glasses' lenses tint automatically according to the ambient lighting. In bright conditions, even direct sunlight, the lenses darken to keep the projected images vibrant, making the glasses easy to use in any space.

The new Spectacles are powered by a dual system-on-a-chip architecture: they integrate two Qualcomm Snapdragon processors that divide the computational workload between them.

With this, Snap highlighted that the model enables "more immersive" experiences while reducing power consumption, and its titanium vapor chamber improves heat dissipation. As for battery life, the glasses offer up to 45 minutes of continuous standalone runtime.

NEW SNAP OS OPERATING SYSTEM

The new Spectacles have been presented together with the new Snap OS operating system, which offers an intuitive interface designed to reflect the naturalness with which people interact with the world.

As the tech company detailed, this has materialized in an operating system that lets users easily navigate AR options using hand gestures and voice; in fact, as it explained, "the main menu is always in the palm of your hand."

To achieve this, Snap's spatial engine uses the images captured by the four cameras to understand the space around users. Combined with hand-movement tracking, this ensures the glasses operate realistically in three dimensions.

The company also pointed out that the glasses are "designed to be shared." Accordingly, Snap OS makes it easier for developers to create shared experiences that friends and family can enjoy together, unifying the real world with AR.

The new glasses can also work in conjunction with a smartphone via the new Spectacles app, which allows mirroring the mobile device's screen onto the glasses and using the phone as a game controller. In addition, with Spectator mode, other users can see what is being viewed and played through the AR glasses.

In addition, Snap has announced that the Spectacles will offer support for OpenAI chatbots, so users can interact with AI assistants on the glasses using voice commands.

For now, the company has indicated that the new Spectacles are available to developers subscribed to the Snap Developer Program in the United States, so they can test and continue developing their products for the new AR glasses. Participating in the program, however, requires an annual subscription of $99 (80.80 euros at the current exchange rate).

By Editor