Snapchat has introduced a generative artificial intelligence suite as part of the new Lens Studio 5.0 release. The suite primarily aims to power richer AR experiences for users. The social media platform said it is previewing Snap’s real-time image model that “can instantly bring your imagination to life in AR”. With the help of AI, Snapchat will let users type in an idea for a transformation and then generate vivid AR experiences based on that input in real time.


What Will We Get With This Update


This is welcome news for Snapchat users, especially content creators, who will be able to change the surroundings in their videos. According to the company, the update ensures that these changes align with the lighting and colours in the video, offering a seamless result rather than an obviously edited one. The new on-device AI model will use the text prompt given by the user to generate a custom lens.




Snapchat is introducing a range of AI tools designed to enhance the augmented reality (AR) experience for creators. The latest update to Snapchat's developer platform, Lens Studio, includes new features such as advanced face effects and Immersive ML. The new face effects allow creators to modify a user's face based on a written prompt or an uploaded image, while Immersive ML enables real-time transformations of the user's face, body, and surroundings. The update also adds an AI assistant to help developers create 3D models.


Previously, Snapchat users were limited to basic video edits, but with this update, the special effects added to images are expected to look far more realistic.


Soon, Snapchat users will also be able to generate 3D assets using text or image prompts. For instance, a generated 3D character head can mimic the user's expressions, and the tools can also produce face masks and textures.


For those wondering about availability, regular users will get access to the new model in the coming months, while creators will get access by the end of this year.