With Lens Cloud, Snap offers back-end services for its developer platform

Snap isn’t just the company behind the popular social app Snapchat. It has also built a powerful augmented reality developer platform called Snap AR, which spreads beyond Snapchat thanks to Camera Kit, an SDK that lets developers integrate Snap’s camera capabilities into other apps.

In Snap jargon, Lenses are essentially augmented reality apps that you can access in Snapchat. And the company has been building a huge “app store” of Lenses over the past few years. Some of them are funny, some of them are useful, some of them help you communicate in a whole new way. It is becoming a rich augmented reality ecosystem.

“Already, more than 250,000 creators have built over 2.5 million Lenses that have been viewed over 5 trillion times,” Snap CEO Evan Spiegel said during the Snap Partner Summit keynote.

Today, the company is launching another component for its developer platform, Lens Cloud. As the name suggests, it’s a server-side component that will help developers build dynamic, multiplayer experiences.

There are three components to Lens Cloud. First, developers can take advantage of multi-user services, which let them create an instance for a group of friends so that everyone can interact at the same time within the same Lens.
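To picture how such a shared instance could work, here is a minimal TypeScript sketch of a session that broadcasts state changes to everyone in the same Lens. The SharedLensSession class and its methods are assumptions invented for this example, not Snap’s actual multi-user API.

```typescript
// Hypothetical sketch of a shared "multi-user" Lens session.
// None of these names come from Snap's published API; they only
// illustrate the idea of one instance shared by a group of friends.

type Listener = (key: string, value: unknown, from: string) => void;

class SharedLensSession {
  private state = new Map<string, unknown>();
  private listeners: Listener[] = [];

  constructor(public readonly sessionId: string) {}

  join(userId: string, onUpdate: Listener): void {
    this.listeners.push(onUpdate);
    console.log(`${userId} joined session ${this.sessionId}`);
  }

  // Every participant sees the same state change at (roughly) the same time.
  set(key: string, value: unknown, from: string): void {
    this.state.set(key, value);
    this.listeners.forEach((l) => l(key, value, from));
  }
}

// Two friends interacting inside the same Lens instance.
const session = new SharedLensSession("birthday-lens-42");
session.join("alice", (k, v, from) => console.log(`alice sees ${k}=${v} from ${from}`));
session.join("bob", (k, v, from) => console.log(`bob sees ${k}=${v} from ${from}`));
session.set("balloonColor", "red", "alice"); // both callbacks fire
```

In a real deployment the synchronization would run through Lens Cloud’s servers rather than an in-memory list, but the broadcast pattern is the same.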

Second, location-based services let developers anchor Lenses to places, starting with Central London. For instance, museums could leverage that to enable certain Lenses when you’re pointing your camera at a specific landmark.
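As an illustration of that kind of location gating, the sketch below only enables a Lens when the camera is within a set radius of a landmark. The Landmark type, the coordinates and the activation check are assumptions made for the example, not Snap’s location-based services API.

```typescript
// Hypothetical sketch of location-gated Lens activation.
// The landmark list and activation logic are illustrative only.

interface Landmark {
  name: string;
  lat: number;
  lon: number;
  radiusMeters: number; // how close the camera must be
}

// Example anchor in Central London (coordinates are approximate).
const bigBen: Landmark = { name: "Big Ben", lat: 51.5007, lon: -0.1246, radiusMeters: 150 };

// Haversine great-circle distance between two coordinates, in meters.
function distanceMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000;
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// A museum-style gate: only enable the Lens when the camera is at the landmark.
function shouldEnableLens(userLat: number, userLon: number, anchor: Landmark): boolean {
  return distanceMeters(userLat, userLon, anchor.lat, anchor.lon) <= anchor.radiusMeters;
}

console.log(shouldEnableLens(51.5008, -0.1245, bigBen)); // true: standing at the landmark
console.log(shouldEnableLens(51.5194, -0.1270, bigBen)); // false: roughly at the British Museum
```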


And finally, there are storage services. Developers can store assets on Snap’s servers and load them on demand. This storage can also act as a sort of memory card: users can leave a Lens and pick up where they left off later.

“Storage services make it possible for developers to expand beyond 8 megabytes. They do so by storing the assets that they’re going to load in the Lens in real time in our cloud,” Snap Head of AR Platform Partnerships Sophia Dominguez told TechCrunch’s Sarah Perez.

Storage services aren’t available just yet, but the company expects to launch them in the coming months. This collection of back-end services will be free for Snap AR developers.
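The sketch below illustrates the two roles described above, on-demand asset loading to keep the Lens package small and “memory card” style persistence. The RemoteAssetStore and UserStateStore classes are assumptions made for this example, not Snap’s storage services API.

```typescript
// Hypothetical sketch of the two jobs storage services would do:
// (1) load large assets on demand instead of bundling them in the
//     Lens package (keeping the package under the ~8 MB limit), and
// (2) persist per-user state so a session can be resumed later.

class RemoteAssetStore {
  private cache = new Map<string, ArrayBuffer>();

  // Fetch a heavy asset (e.g. a 3D model) from the cloud only when needed.
  async load(assetId: string, url: string): Promise<ArrayBuffer> {
    const cached = this.cache.get(assetId);
    if (cached) return cached;
    const bytes = await fetch(url).then((r) => r.arrayBuffer());
    this.cache.set(assetId, bytes);
    return bytes;
  }
}

class UserStateStore {
  private byUser = new Map<string, Record<string, unknown>>();

  // "Memory card" behavior: save progress when the user leaves the Lens...
  save(userId: string, state: Record<string, unknown>): void {
    this.byUser.set(userId, state);
  }

  // ...and restore it when they come back later.
  resume(userId: string): Record<string, unknown> {
    return this.byUser.get(userId) ?? {};
  }
}

const progress = new UserStateStore();
progress.save("alice", { level: 3, hatUnlocked: true });
console.log(progress.resume("alice")); // { level: 3, hatUnlocked: true }
```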

Creators who want to build a Lens for Snapchat start by downloading Lens Studio. They can import 2D and 3D assets, use 3D Face Mesh, add custom shaders, write scripts and take advantage of Snap’s machine learning models with SnapML. Today, Snap is releasing a new version of Lens Studio with some new features as well.
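For readers unfamiliar with that workflow, the sketch below mimics the general script-driven pattern: a script bound to a scene object reacts to per-frame updates. The SceneObject and LensScript types are stand-ins invented for this example, not the actual Lens Studio scripting API.

```typescript
// Hypothetical sketch of a script-driven Lens: a script attached to a
// scene object changes it a little on every rendered frame.

class SceneObject {
  constructor(public name: string, public rotationDeg = 0) {}
}

type UpdateCallback = (deltaSeconds: number) => void;

class LensScript {
  private onUpdate: UpdateCallback[] = [];

  bind(cb: UpdateCallback): void {
    this.onUpdate.push(cb);
  }

  // The engine would call this once per rendered frame.
  tick(deltaSeconds: number): void {
    this.onUpdate.forEach((cb) => cb(deltaSeconds));
  }
}

// Spin an imported 3D asset 90 degrees per second.
const hat = new SceneObject("party_hat_3d");
const script = new LensScript();
script.bind((dt) => {
  hat.rotationDeg = (hat.rotationDeg + 90 * dt) % 360;
});

// Simulate three frames at roughly 30 fps.
for (let i = 0; i < 3; i++) script.tick(1 / 30);
console.log(hat.rotationDeg.toFixed(1)); // "9.0"
```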

Lens Studio already lets you dynamically adjust your Lenses by leveraging APIs. For instance, you can change the appearance of a Lens if it’s raining. The company is adding new API partners to open up more possibilities. Thanks to AstrologyAPI and Sportradar, you can create content that adjusts depending on astrological or sports data.
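The general pattern looks something like the sketch below: fetch external data, then pick which Lens content to show. The endpoint and response shape are placeholders for the example, not the actual AstrologyAPI or Sportradar integrations.

```typescript
// Hypothetical sketch of an API-driven Lens: external data decides
// which variant of the Lens content to show. The URL and response
// fields below are placeholders, not a real partner API.

interface WeatherReport {
  condition: "rain" | "clear" | "snow";
}

async function fetchWeather(lat: number, lon: number): Promise<WeatherReport> {
  // Placeholder endpoint; a real Lens would call a partner API here.
  const res = await fetch(`https://example.com/weather?lat=${lat}&lon=${lon}`);
  return (await res.json()) as WeatherReport;
}

// Swap the Lens appearance based on live conditions.
function pickLensVariant(report: WeatherReport): string {
  switch (report.condition) {
    case "rain":
      return "umbrella_overlay";
    case "snow":
      return "snowflake_overlay";
    default:
      return "sunglasses_overlay";
  }
}

fetchWeather(51.5, -0.12)
  .then((report) => console.log(`Showing variant: ${pickLensVariant(report)}`))
  .catch(() => console.log("Falling back to default variant")); // offline-safe default
```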

The company is also working on ray tracing support, which should greatly improve reflections and surface rendering in general. Analytics have been improved as well with event insights, which should help with debugging in particular.
