Former Snap design lead launches Shader, an AR creation tool that leverages AI to generate custom effects.

Shader aims to compete with industry heavyweights like Snap's AR development platform Lens Studio and TikTok's Effect House with a no-code AR creation tool that generates custom effects, 2D masks, and lenses in minutes; building a single AR filter traditionally takes hours and requires both engineering and design skills. The startup built its platform on the open source Stable Diffusion model, letting users enter text-based prompts to generate their creations.

Shader was co-founded by Darya Sesitskaya, a former Snap design lead who worked on the design of Snapchat's AR camera, Lens Studio, Lens Cloud and more. She previously worked at Wanna (formerly Wannaby), the AR technology company known for its virtual try-ons for sneakers, clothing and watches. Shader's team is made up of former Snap AR and Blizzard engineers.

Shader's real-time AI camera app launched in beta on iOS in December 2023. Users take a photo of themselves so the app can scan their face, then enter a prompt for the AI to generate a personalized AR effect, which they can record a video of themselves wearing.

During our testing, we found results to be on the simpler side, but there were no bugs or glitches. While it's simple, there's a lot of potential for the app to be more than just a fun tool to play around with and show your friends. Shader plans to launch a premium subscription option that gives access to higher-quality effects, Sesitskaya tells TechCrunch.

The app offers cross-platform functionality, so users can share their creations on Instagram, TikTok, and Snapchat, giving creators an opportunity to show off the exclusive filters they have come up with. Shader will later launch an in-app social feed where users can post their templates, allowing other creators to like, comment, and try out the effects.

As with other AI photo-editing applications, users can upload images from their camera roll and either use prompts to tailor their photos or choose from a variety of Shader's preset masks, which include a fox, a Yoda-like character and other filters.

Since releasing the beta version, Shader has been downloaded about 3,000 times. An Android release will follow shortly.

Shader also released a web version, which scans faces via webcam. Creators can type prompts into a text box, but premade templates don't appear to be available yet.

Shader is also optimizing its iOS app for the newly launched Vision Pro, leveraging Apple's digital persona technology. Additionally, Shader offers an API and plug-in for companies to implement the technology into their own products.

As for funding, Shader raised $580,000 from Betaworks, Greycroft, Differential Ventures, Mozilla Ventures and On Deck. While it is a relatively small round, an investment like this suggests there remains demand for AR-creation tools. The capital is going toward new product features, such as speech-to-text prompts, and support for platforms like Twitch, Discord and Zoom, which would allow people to wear AR filters in live streams. It will also help grow the company's marketing team.

"Our mission is to make AR/AI effects accessible to everyone, empowering users to easily create personalized content. Shader is expanding into various social face filters, including background, clothes and hair, prioritizing user-friendly design principles to unlock new possibilities for the 400 million creators on social media. In the near future, we also plan to implement the ability to create voice-to-AR effects and 3D background replacements," says Sesitskaya.

 
