Interview with Clay Weishaar, aka WRLD SPACE
There’s no doubt that AR is a rapidly growing industry. Big names such as Snapchat and Facebook are helping to fuel this growth by providing tools that allow creators to build their own AR experiences. One such creator is Clay Weishaar, aka WRLD SPACE.
When Clay messaged us on Poplar Studio’s Instagram account, I had a little fangirl moment. I’d been following his lenses and effects for a while, seeing his following grow to 523K followers on Instagram.
I asked Clay if we could jump on a call to discuss the future of AR, as well as his own experiences creating lenses and effects with Lens Studio and Spark AR. Not only did he agree, but we also ended up getting some hot tips on AR distribution in the process.
Priyanka Parmar: Hey Clay, how’s it going?
Clay Weishaar: Really good. It’s crazy, the Spark AR beta went really well and I got a lot of reach. I’ve gotten millions of impressions on some of the effects.
… Then the line goes dead. Just my luck. After some time, we get the call back online.
PP: So, how did you get into AR?
CW: I’ve worked in agencies as a Creative Director for quite a while. I’ve been doing AR since Vuforia launched, working on projects like making a car appear on a marker. But it really took off with the first edition of HoloLens. I went to a hackathon at the Fox Innovation Lab in Los Angeles, got introduced to headset AR and fell in love with it.
I also did a gallery exhibition for Macallan in NY that travelled around, which was a mixed reality art exhibition.
I love working in the medium because things have been so flat for so long that working in a spatial medium and designing in space is something I’m really fascinated by. So, with the reach of Lens Studio and Spark AR, and being one of their creators, it’s just really cool to see the mass adoption and reach of AR, because headset AR is not there yet and otherwise I wouldn’t get to see people enjoy my experiences.
You can have millions of people looking at your lens on both platforms. I’m really stoked with mobile AR and what Snap and Instagram are doing right now.
PP: I see that your Instagram has been blowing up, what would you say has given you this success?
CW: It’s kind of insane. I think if I was not in on it early, I wouldn’t have done as well as I have but, because it’s something new, people get excited by it.
There are definitely challenges that I’m facing. Discovery is really ridiculous, it’s so hard for people to find multiple face effects. What I found best is that you can create highlights with the link to your effects in your story, and then you can have multiple effects on your Instagram so they can swipe through your tray, go to your highlights and click on your link. But it’s like we have to hack the system right now because it’s not very creator-friendly.
Snapchat is definitely a lot better in terms of how they are profiling creators and getting the stuff out there, but even within Lens Studio you have to know to tap to get to the discovery area, so I think they’re still trying to refine that. It’s funny, it’s so new and everybody’s figuring it out together. I talk to Mate Steinforth and we’re trying to help each other out, it’s been good because I’ll promote his stuff and he’ll promote my stuff and I think knowing him from the community has helped too. It feels a bit Wild West right now.
PP: Out of Spark AR and Lens Studio, which would you say you prefer?
CW: Lens Studio, hands down. They worked on it for so long and it’s so much better. You can ask any Official Lens Creator too: they’re trying to work in Spark and it’s like wearing two left shoes. It’s really not natural, the tool is clunky. You can’t iterate as fast in it, and the approval process is longer. Snapchat is definitely really good for the creative process and for how quickly you can iterate and test lenses on it. Also, the editor is so much easier.
I hear it across the board from all the creators on Snap who are just like ‘ugh Spark’.
I hope that they make it better but the Snap ecosystem doesn’t allow people to subscribe to creators yet. I was at Lens Fest this year and that was the biggest ask from all the creators: “How do people subscribe to my lenses so they can see them regularly?”. The only way they can find them is through the community lenses. On the other side you’ve got Spark, which has a rather clunky tool but you get followers and they can regularly follow your work. It’s tough, I think you have to do both right now.
There’s the workflow too, creating a 3D asset and doing everything. It’s pretty easy to take a lens from Lens Studio and translate it over to Spark fairly quickly, so I feel like we’re in this phase, you know, just like how there is ARKit and ARCore, that you’ve got Spark and Lens Studio and you kind of build for both.
PP: Do you collaborate a lot with other creators?
CW: Snap definitely reaches out to me a lot to create stuff but, it’s hard with the current ecosystem to collaborate within Lens Studio and Spark. You can give shoutouts on Instagram, but with Lens Studio there’s no way of having two creators credited for a Lens. That’s something we’ve requested though, being able to do a tag-team with different creators, because a lot of people will influence each other and you’re always bouncing off ideas online. It would be really cool to have credits.
PP: What industry do you see benefiting the most from AR?
CW: Everything that I’ve heard so far is industrial use cases for headset AR, but if you’re talking mobile, it’s the social platforms. There’s this company called Mira who developed a headset and marketed it as this $100 headset that you put your phone in, not exactly like Google Cardboard but a little bit different. Basically it was a cheap AR headset. But they quickly pivoted, so now they’re servicing industrial-type stuff for real-time AR instructions. AR headset-based stuff isn’t really a consumer product, industrial-based stuff is.
I’m looking forward to the next version of Spectacles. I think it’s great because it’s going to serve these bite-sized social AR experiences and face lenses without the fatigue of holding up your phone, which has been the biggest criticism I’ve heard, that you get hand fatigue when you’re trying to engage in a game or an active experience.
There’s some pretty fun ones out there, but no one wants to hold up their phone, so it’s going to be cool to be able to put on your Spectacles and see people wearing your lenses.
PP: What are some of the brands you’ve worked with?
CW: Because I was previously in production, this is my first year as a freelancing creator. I’ve been at different production studios like Be Real, Tool and Unit 9, plus agencies.
These platforms are allowing me to be both a publisher and a creator. I can do a freelance project that’s not under a production company, and partnering with people who are doing this in a different way is something I’m excited about.
Brand-wise, I’ve worked with a lot of different ones: Airbnb, Intel, Google. I worked on the Google CES rollercoaster, which was super fun, and I did a lot of the character visualisation in AR. It was a good use case of taking Lens Studio and showing somebody a model at scale of what that character would look like. Allowing the client to walk around it was very cool.
It’s a different way of using mobile AR to do previews. I’ve definitely worked with a lot of different brands, but I’m really excited by the social platforms right now.
PP: That’s exactly what we think; we want to help creators bridge that gap between them and brands.
CW: It’s so great, it’s like an in-between, you don’t need to be under production companies or a large agency. You guys are awesome to partner with, you handle the strategy and educating the client. It just feels like a really lightweight, modular workflow to me. It feels really fresh.
PP: What is your preferred method of distribution?
CW: For distribution of effects, I think the Lens Studio community lens section is good. I’m starting to do lens challenges with Snap, that’s something they’re doing with creators. So, I think Snap are doing a lot on their end and they are very far ahead of what Spark is doing.
On Instagram, right now your best option for effects is to create highlights that link to your lens and to make sure people know they can click on that tiny 6 point type to use your effect. Sharing as many stories with your link as possible is the best way to get traction.
PP: I’ve noticed that popular foods are a trend in your effects and lenses. How come you went down that route?
CW: It’s just fun, I don’t want to take it too seriously. I might do some diet stuff next time. I don’t know, I’m a foodie. I like doughnuts, hot dogs, pizzas and hamburgers and they’re also fun shapes to work with.
I was doing a lot of World Lenses before Face Lenses. I like to put things in an environment, and seeing a giant hamburger in the freeway is something that interests me. It’s just fun.
I’ve been working on an indie game along these lines for years that I haven’t yet released. It’s basically how I learnt Unity, and my game explored a lot of those assets too.
It’s fun to put things in an environment and to factor in the environment around you. If you’re on top of a building and there are buses driving, they can drive through the doughnut. A lot of times, you can’t control how AR is used, so I like the idea of location scouting and building the experience around the location, it’s super interesting when you factor in the world space. I think it gels well with where cloud AR is going to go, contextual computer vision. Like, if I’m using my phone or wearing my glasses to see a tree then you can serve up that experience based on a design around a tree. It’s interesting, as the technology progresses we’re going to see a lot more context-based lenses in AR experiences happening in real time, being pulled from the cloud.
PP: When you’re not making all these really fun effects, what do you normally do?
CW: I surf, I live right by the beach. I go to art shows, I like painting. I’d say surfing mostly, it’s a good balance of being away from your computer.