Mon. Jan 17th, 2022

As Snap’s creators begin experimenting with the company’s augmented reality Spectacles hardware, the company is diving deeper into the capabilities of its Lens Studio, building augmented reality filters that are more connected, more realistic and more futuristic. At its annual Lens Fest event, Snap debuted a number of changes to its lens creation suite, ranging from efforts to integrate external media and data to more AR-centric features designed with a Spectacles future in mind.

On the media side, Snap will launch a new audio library that will allow creators to add audio clips from the millions of songs in Snapchat’s library of licensed music directly into their Lenses. Snap is also making an effort to bring real-time data into Lenses via an API library, with integrations such as weather information from AccuWeather or cryptocurrency prices from FTX. Another major feature update will allow creators to embed links in Lenses that send users to various web pages.
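To make the idea concrete, here is a minimal sketch of what pulling live API data into a Lens script could look like, written in Lens Studio-style JavaScript. The remote service module input, the “get_current_weather” endpoint name and the response fields are all assumptions for illustration, not confirmed details of any specific API library partner.

```javascript
// Minimal sketch: pulling real-time data into a Lens via an API module.
// The module input, endpoint id and response shape below are assumptions
// for illustration, not confirmed details of a specific integration.
// @input Asset.RemoteServiceModule remoteServiceModule
// @input Component.Text weatherText

var req = RemoteApiRequest.create();
req.endpoint = "get_current_weather"; // hypothetical endpoint id

script.remoteServiceModule.performApiRequest(req, function (response) {
    if (response.statusCode !== 1) { // assumed success convention
        print("Weather request failed: " + response.statusCode);
        return;
    }
    var data = JSON.parse(response.body);
    // Surface the live value inside the Lens, e.g. on a text component.
    script.weatherText.text = data.temperature + "°";
});
```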

Image credit: Snap

Snap’s once-silly selfie filters continue to be a great growth opportunity for the company, which has long had augmented reality in its sights. Snap says more than 2.5 million Lenses have been built by more than a quarter of a million creators, and that those Lenses have been viewed a total of 3.5 trillion times. The company is also building an internal “AR innovation lab” called Ghost, which will bankroll Lens designers who want to push the boundaries of what is possible by handing out grants of up to $150,000 to individual projects.

As the company seeks to make Lenses smarter, it is also working to make them more technically proficient.

In addition to integrating new data types, Snap is also looking at the underlying AR technology to help creators build enjoyable Lenses for users with lower-end phones. Its World Mesh feature has let users with more advanced phones view Lenses that integrate real-world geometry data, giving digital objects physical surroundings to interact with. Now Snap is enabling this feature on more basic phones as well.
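As a rough illustration, a world-mesh-aware Lens script might probe the reconstructed surface under a tap and snap an object onto it, as in the hedged sketch below. The hitTestWorldMesh call and its result fields are stand-ins for whatever surface-query API the tracking component actually exposes.

```javascript
// Sketch: placing a digital object onto reconstructed real-world geometry.
// "hitTestWorldMesh" and its result fields are hypothetical stand-ins for
// the real surface-query API; the placement logic is the point here.
// @input Component.DeviceTracking deviceTracking
// @input SceneObject objectToPlace

script.createEvent("TapEvent").bind(function (eventData) {
    var tapPos = eventData.getTapPosition(); // screen-space tap coordinates
    var hit = script.deviceTracking.hitTestWorldMesh(tapPos); // hypothetical
    if (hit) {
        // Move the object to the real-world surface and align it to the
        // surface normal so it appears to rest naturally on the geometry.
        var t = script.objectToPlace.getTransform();
        t.setWorldPosition(hit.position);
        t.setWorldRotation(quat.lookAt(hit.normal, vec3.up()));
    }
});
```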

Image credit: Snap

Similarly, Snap is rolling out tools to make digital objects respond more realistically to each other, debuting an in-Lens physics engine that allows for more dynamic Lenses which can not only interact more deeply with the real world, but can also adapt to simultaneous user input.
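For a flavor of what an in-Lens physics engine enables, the hedged sketch below gives an object a dynamic body and kicks it upward when the user taps; gravity and collisions then take over. The “Physics.BodyComponent” type string and the addForce call are assumptions modeled on common physics-engine conventions rather than confirmed signatures.

```javascript
// Sketch: a digital object that responds dynamically via a physics body.
// The component type string and force call are modeled on typical
// physics-engine conventions, not confirmed Lens Studio signatures.
// @input SceneObject dynamicObject

var body = script.dynamicObject.createComponent("Physics.BodyComponent");
body.dynamic = true; // let gravity and collisions move the object

script.createEvent("TapEvent").bind(function () {
    // Pop the object upward on each tap; the engine then resolves its
    // collisions with other bodies and the surrounding world geometry.
    body.addForce(new vec3(0, 8, 0), Physics.ForceMode.Impulse);
});
```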

Snap’s efforts to create more sophisticated tools for building Lenses on mobile come as the company also builds more forward-looking support for the tools developers may need to design hands-free experiences for its new AR glasses. Creators have been experimenting with the new hardware for several months, and Snap has built new Lens functionality to address their concerns and create new opportunities.

Image credit: Snap

Ultimately, Snap’s glasses are still developer-only, and the company has not offered any timeline for when it might ship a consumer product with integrated AR features, so it theoretically has plenty of time to keep building in the background.

Some of the tools Snap has quietly built include Connected Lenses, which enable shared experiences, allowing multiple users to interact with the same content inside a Lens using AR Spectacles. In their developer iteration, the AR glasses do not have the longest battery life, which means Snap has had to get creative to ensure that Lenses are there when you need them without running persistently. The company’s Endurance mode allows a Lens to keep running in the background, off-display, while waiting for a specific trigger such as reaching a specific GPS position.
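A hedged sketch of the trigger logic such a mode implies is below: a lightweight periodic check compares the wearer’s position against a target and only wakes the full experience inside a set radius. The getLocationService call and its position callback are hypothetical stand-ins for the platform’s location API; the haversine distance math is the portable part.

```javascript
// Sketch: a GPS wake trigger of the kind Endurance mode implies.
// "getLocationService" and its callback are hypothetical stand-ins for
// the platform's location API; the distance check itself is generic.
var TARGET = { lat: 40.7411, lng: -73.9897 }; // example coordinates
var WAKE_RADIUS_M = 50;

function distanceMeters(a, b) {
    // Haversine great-circle distance between two lat/lng points.
    var R = 6371000, rad = Math.PI / 180;
    var dLat = (b.lat - a.lat) * rad, dLng = (b.lng - a.lng) * rad;
    var h = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
            Math.cos(a.lat * rad) * Math.cos(b.lat * rad) *
            Math.sin(dLng / 2) * Math.sin(dLng / 2);
    return 2 * R * Math.asin(Math.sqrt(h));
}

// Check cheaply every few seconds while off-display; wake the full
// Lens experience only once the wearer is inside the target radius.
var checkEvent = script.createEvent("DelayedCallbackEvent");
checkEvent.bind(function () {
    getLocationService().getCurrentPosition(function (pos) { // hypothetical
        if (distanceMeters(pos, TARGET) < WAKE_RADIUS_M) {
            print("Target location reached: waking Lens experience");
        } else {
            checkEvent.reset(5.0); // try again in a few seconds
        }
    });
});
checkEvent.reset(5.0);
```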
