The annual Coachella music festival in California offered a total of 20 AR experiences for its 750,000 visitors this year. With the Coachellaverse (what an uninspired name), visitors on site and at home could immerse themselves in a range of experiences via the festival’s app, Snapchat and Instagram.
Snapchat’s annual #SnapPartnerSummit confirmed and celebrated the power of AR as a medium and its impact on the future. What a great update to Snap’s AR platform and to AR in general. Proud to be a Snapchat partner.
Snapchat launches Lens Cloud Services, reinforcing my belief in the real-world Metaverse. Location-based AR will become so big in the coming years.
Through a 3D asset manager, brands can turn their 3D assets into a shoppable Lens with a single click. Another new feature extracts products from existing product photos for use in AR by removing the models and the background.
Snap is adding a brand-new section to the app called ‘Dress Up’, where all AR commerce is bundled together.
There were also many cool new technical features, such as ray tracing (better lighting effects), more APIs for adding data to Lenses, and Event Insights (which lets you measure engagement within a Lens, much like Google Analytics). And finally, CEO Evan Spiegel launched the Pixy, an autonomous camera drone. Wow! Alongside your own AR glasses, this drone will probably also be able to augment you and the world around you with virtual content in the future.
As in recent editions, and as the post above shows, there are many developments in location-based AR. We are seeing more and more AR that is tied to a location in the city. It can be a visual that comes to life or that you become part of, such as our Burger King Spicy Whopper case below. Try the Lens here. But it can also be entire monuments, such as the new collaboration between Snap and the LACMA; entire buildings in London, such as this cool experiment by Arthur Bouffard with the new City Templates; or this walk through NYC in which city council proposals are visualized on location with AR (super smart!). Niantic is also expected to launch the Lightship Visual Positioning System during its developer summit. In time, this should become a world-scale solution: a 3D map of the world that lets multiple devices share the same AR experience in a specific location, whether at the same time or at different times. It will most likely roll out in select cities first.
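To make the shared-location idea concrete, here is a minimal conceptual sketch of what a visual positioning system enables. Everything in it (the `WorldMap` class, `place_anchor`, `resolve_anchor`) is hypothetical and invented for illustration; it is not Niantic’s actual SDK. The core idea: anchors are stored in a shared world coordinate frame rather than relative to any one device, so a second device that localizes later sees the content in the same physical place.

```python
# Conceptual sketch of a shared-location AR anchor store.
# All names here are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class Pose:
    x: float  # position in the world frame, metres
    y: float
    z: float

@dataclass
class WorldMap:
    """Stands in for the cloud-hosted 3D map of one location."""
    anchors: dict = field(default_factory=dict)

    def place_anchor(self, name: str, device_pose: Pose, offset: Pose) -> None:
        # Store the anchor in WORLD coordinates, not device coordinates,
        # so any device that localizes here can resolve it later.
        self.anchors[name] = Pose(device_pose.x + offset.x,
                                  device_pose.y + offset.y,
                                  device_pose.z + offset.z)

    def resolve_anchor(self, name: str, device_pose: Pose) -> Pose:
        # Return the anchor's position relative to the querying device.
        a = self.anchors[name]
        return Pose(a.x - device_pose.x, a.y - device_pose.y, a.z - device_pose.z)

# Device A localizes against the map and drops virtual content 2 m ahead.
shared_map = WorldMap()
device_a = Pose(10.0, 0.0, 5.0)
shared_map.place_anchor("mural", device_a, Pose(0.0, 0.0, 2.0))

# Device B visits later, localizes at a different spot, and resolves the
# same content at the same physical place.
device_b = Pose(12.0, 0.0, 5.0)
print(shared_map.resolve_anchor("mural", device_b))
```

The point of the sketch is the design choice: persistence lives in the map, not the phone, which is what makes "same experience, different device, different time" possible.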
Meta’s Spark AR Studio comes with new features for developing better AR experiences. With the new audio engine, audio can be integrated into AR more easily, and there are new tools for music visualization. Enhanced occlusion capabilities blend the real world and the virtual world even better. Coolest of all, I think, is that they have now also embraced LiDAR to better detect depth and distance in AR. It scans the space so that the AR really blends with it. See everything in the video below.
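For readers curious how depth-based occlusion works under the hood, here is a toy sketch of the general technique (not Spark AR’s actual implementation, and the `composite` function is invented for illustration): per pixel, the virtual object is drawn only where it is closer to the camera than the real surface measured by the depth sensor.

```python
# Toy per-pixel occlusion test: hypothetical helper, illustration only.
def composite(real_depth, virtual_depth, real_pixel, virtual_pixel):
    """Compare two small depth maps (metres) pixel by pixel.

    The virtual pixel wins only where the virtual surface sits in
    front of the measured real-world surface; everywhere else the
    real scene shows through, so virtual objects disappear behind
    real ones.
    """
    out = []
    for rd_row, vd_row in zip(real_depth, virtual_depth):
        out.append([virtual_pixel if vd < rd else real_pixel
                    for rd, vd in zip(rd_row, vd_row)])
    return out

# A real wall 2 m away; a virtual cube half in front of it (1.5 m)
# and half behind it (3 m). The far half is hidden by the wall.
real = [[2.0, 2.0], [2.0, 2.0]]
virtual = [[1.5, 3.0], [1.5, 3.0]]
print(composite(real, virtual, "R", "V"))  # → [['V', 'R'], ['V', 'R']]
```

This is exactly why LiDAR matters here: without a measured depth map of the room, the renderer has no `real_depth` to test against and virtual content just floats on top of everything.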