Augmenting your inbox with new AR inspiration and insights. In this new edition: Coachella with 20 AR experiences, the Snap Partner Summit and Spark AR updates. Enjoy the magic! 🔥
The annual Coachella music festival in California offered a total of 20 AR experiences to its 750,000 visitors this year. With the Coachellaverse (what an uninspired name), visitors on location and at home could dive into a range of immersive experiences via an app, Snapchat and Instagram.
At the festival, you could use the app’s AR wayfinding to see where to go to discover one of the large AR installations. The app uses Niantic’s Lightship SDK to bring the physical installations to life in AR. The festival was also promoted via Instagram with AR installations in 15 different locations around the world, such as New York and London. The Meta AR effects could also be experienced at the festival as an AR tour, as you can see in this video. It’s nice to see the difference between the real AR and the teaser video.
The main stage featured huge LED screens driven by Unreal Engine for an immersive performance. Viewers at home saw even more, because an AR layer was added to the live stream, as in this performance by Flume. AR at Coachella turned out really well this year, and it certainly won’t be the last time: according to Innovation Lead Sam Schoonover, engagement was 20 times higher than the year before. And it won’t be the last festival either, because Snapchat recently announced an intensive collaboration with event organizer Live Nation during its annual Partner Summit. More Partner Summit updates in the next post.
Snapchat’s annual #SnapPartnerSummit confirmed and celebrated the power of AR as a medium and its impact on the future. What a great update to Snap’s AR platform and to AR in general. Proud to be a Snapchat partner!
Snapchat launches Lens Cloud Services, reinforcing my belief in the real-world Metaverse. Location-based AR will become so big in the coming years.
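To make that concrete: the promise of a service like this is that a Lens can store state in the cloud and tie it to a real-world place, so that different users at the same spot resolve the same content. Below is a minimal TypeScript sketch of that idea; the `LensCloudStore` class and everything else here are my own hypothetical stand-ins, not Snap’s actual Lens Cloud API.

```ts
// Hypothetical sketch of cloud-persisted, location-anchored AR content.
// All names are illustrative; this is not Snap's real Lens Cloud API.
interface GeoAnchor {
  latitude: number;
  longitude: number;
}

interface PersistedObject {
  id: string;
  anchor: GeoAnchor;
  payload: Record<string, unknown>; // e.g. model id, color, author
}

// In-memory stand-in for a cloud storage service.
class LensCloudStore {
  private objects = new Map<string, PersistedObject>();

  save(obj: PersistedObject): void {
    this.objects.set(obj.id, obj);
  }

  // Every object within radiusMeters of the user, so all devices at the
  // same spot resolve the same shared content.
  nearby(user: GeoAnchor, radiusMeters: number): PersistedObject[] {
    return [...this.objects.values()].filter(
      (o) => haversineMeters(user, o.anchor) <= radiusMeters
    );
  }
}

// Great-circle distance between two lat/lon points, in meters.
function haversineMeters(a: GeoAnchor, b: GeoAnchor): number {
  const R = 6371000; // mean Earth radius in meters
  const rad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = rad(b.latitude - a.latitude);
  const dLon = rad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a.latitude)) * Math.cos(rad(b.latitude)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// One user leaves a virtual object; another device nearby finds it later.
const store = new LensCloudStore();
store.save({
  id: "mural-1",
  anchor: { latitude: 52.3702, longitude: 4.8952 },
  payload: { model: "graffiti_quad", author: "alice" },
});
console.log(store.nearby({ latitude: 52.3703, longitude: 4.8951 }, 50));
```

The point is the lookup by place: the object outlives the session, and anyone who stands close enough gets the same thing back.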
Snap is coming with a feature where users take a few photos of their body and then overlay true-to-size clothing on those photos to see what it looks like. “Shoppers can go from ‘this looks good’ to ‘this looks good on ME!’”
Brands can turn their 3D assets into a shoppable Lens with a single click through a 3D asset manager. Another new feature extracts products for AR from existing product photos by removing the models and the background.
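As a rough illustration of what that extraction step boils down to (conceptually, not Snap’s implementation): an ML model produces a segmentation mask marking the product pixels, and everything outside the mask is made transparent. A toy TypeScript sketch, with the mask assumed as given:

```ts
// Toy sketch of cutting a product out of a photo, given a segmentation
// mask (1 = product pixel, 0 = model/background). In practice the mask
// comes from an ML segmentation model; this is not Snap's implementation.
function extractProduct(
  rgba: Uint8ClampedArray, // flat RGBA pixel data, 4 bytes per pixel
  mask: Uint8Array         // one byte per pixel: 1 keeps it, 0 removes it
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(rgba); // copy, don't mutate the input
  for (let p = 0; p < mask.length; p++) {
    if (mask[p] === 0) {
      out[p * 4 + 3] = 0; // zero the alpha channel: pixel becomes transparent
    }
  }
  return out;
}
```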
Snap is also adding a brand-new section to the app called ‘Dress Up’, where all AR commerce is bundled together.
There were also many cool new technical features, such as ray tracing (more realistic lighting effects), more APIs to feed data into Lenses, and Event Insights (which lets you measure engagement in a Lens, much like Google Analytics). And finally, CEO Evan Spiegel launched the Pixy, an autonomous camera drone. Wow! In addition to AR glasses, this drone will probably also be able to augment you and the world around you with virtual content in the future.
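To give a feel for what Lens-level analytics could look like, here is a hypothetical sketch in TypeScript. The event names and the tracker class are made up for illustration; they are not Snap’s actual Event Insights API.

```ts
// Hypothetical lens engagement tracking in the spirit of Event Insights.
// Names are illustrative, not Snap's real API.
type LensEvent = "lens_open" | "tap" | "try_on" | "share";

class EngagementTracker {
  private counts = new Map<LensEvent, number>();

  track(event: LensEvent): void {
    this.counts.set(event, (this.counts.get(event) ?? 0) + 1);
  }

  // Readout per interaction, e.g. how many opens led to a share.
  report(): Record<string, number> {
    return Object.fromEntries(this.counts);
  }
}

const tracker = new EngagementTracker();
tracker.track("lens_open");
tracker.track("try_on");
tracker.track("share");
console.log(tracker.report()); // { lens_open: 1, try_on: 1, share: 1 }
```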
In the latest editions, and in the post above, there have been many developments in the field of location-based AR. We see more and more AR that is tied to a location in the city. It can be a visual expression that comes to life or that you become part of, such as our Spicy Whopper case for Burger King below. Try the Lens here. But also complete monuments, such as the new collaboration between Snap and the LACMA, entire buildings in London, such as this cool experiment by Arthur Bouffard with the new City Templates, or this walk through NYC in which city council proposals are visualized on location in AR (super smart!). Niantic is also expected to launch the Lightship Visual Positioning System during its developer summit. In time, this should become a world-scale solution: a 3D map of the world that lets multiple devices share the same AR experience at a specific location, at the same time or at different times. It will most likely be rolled out in a select number of cities first.
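A rough mental model of what a VPS does, sketched below in TypeScript with made-up types (Niantic’s real Lightship API looks different): the device sends a camera frame, the service matches it against the shared 3D map and returns the device’s pose in that map, and persistent anchors are then resolved relative to the same map.

```ts
// Hypothetical sketch of the VPS idea: localize a camera frame against a
// shared 3D map, then resolve persistent anchors in that map's frame.
// All names and types are illustrative, not Niantic's Lightship API.
type Vec3 = { x: number; y: number; z: number };
type Pose = { position: Vec3; rotation: [number, number, number, number] }; // quaternion

interface LocalizationResult {
  mapId: string;    // which chunk of the world map we matched
  devicePose: Pose; // device pose in that map's coordinate frame
}

interface PersistentAnchor {
  id: string;
  mapId: string;
  pose: Pose; // anchor pose in the same map frame
}

// Fake localizer standing in for the server-side matching step
// (image features vs. the 3D map). Always "matches" the same map here.
async function localize(_frame: Uint8Array): Promise<LocalizationResult> {
  return {
    mapId: "soho-london",
    devicePose: { position: { x: 1, y: 0, z: 2 }, rotation: [0, 0, 0, 1] },
  };
}

// Once device and anchor share a coordinate frame, placing content is just
// expressing the anchor relative to the device. Translation only, for
// brevity; a real solution also applies the rotation.
function anchorRelativeToDevice(anchor: PersistentAnchor, loc: LocalizationResult): Vec3 {
  const d = loc.devicePose.position;
  const a = anchor.pose.position;
  return { x: a.x - d.x, y: a.y - d.y, z: a.z - d.z };
}

const anchor: PersistentAnchor = {
  id: "statue-hat",
  mapId: "soho-london",
  pose: { position: { x: 4, y: 1.5, z: 2 }, rotation: [0, 0, 0, 1] },
};

localize(new Uint8Array(0)).then((loc) => {
  if (loc.mapId === anchor.mapId) {
    console.log(anchorRelativeToDevice(anchor, loc)); // { x: 3, y: 1.5, z: 0 }
  }
});
```

Because every device localizes into the same map frame, two phones at the same corner see the same virtual object on the same statue, now or next week.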
Meta’s Spark AR Studio comes with new features for developing better AR experiences. With the new audio engine, audio can be integrated into AR more easily, and there are new tools for music visualization. Enhanced occlusion capabilities blend the real world and the virtual world even better. The coolest thing, I think, is that they have now also embraced LiDAR to better detect depth and distance in AR: the sensor scans the space so that the AR really blends with it. See it all in the video below.
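For the curious, the core of depth-based occlusion is a per-pixel comparison: if the real-world depth measured by the LiDAR is closer to the camera than the virtual object, the real world wins and the virtual pixel is hidden. A toy TypeScript sketch of that test (the idea, not Spark AR’s actual pipeline):

```ts
// Toy sketch of LiDAR-driven occlusion: per pixel, hide the virtual
// content when a real surface is closer to the camera than the virtual
// one. This illustrates the idea, not Spark AR's implementation.
function occlude(
  virtualRgba: Uint8ClampedArray, // rendered virtual layer, flat RGBA
  virtualDepth: Float32Array,     // virtual depth per pixel, in meters
  lidarDepth: Float32Array        // measured real-world depth per pixel, meters
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(virtualRgba);
  for (let p = 0; p < lidarDepth.length; p++) {
    if (lidarDepth[p] < virtualDepth[p]) {
      out[p * 4 + 3] = 0; // real surface in front: make the virtual pixel transparent
    }
  }
  return out;
}
```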