Using Motion Capture to Escape the Uncanny Valley

Hi, this is Ian Berget from Elara Systems, here with-

Mike Kennedy. I’m a senior developer here at Elara Systems, a Strategic Creative Agency.

Ian Berget:

We are going to talk about… Well, motion capture today, a bit of rigging, and I think, Mike, you put it as kind of escaping the uncanny valley in some way, without having to have a huge staff internally that’s all about character and photorealistic animation.

Mike Kennedy:

Yeah. Yeah, exactly. We were on a project that required that a human model be part of our scene. It just seemed to make the whole task that much more daunting, because all of a sudden, now we have this human character as part of our immersive experience.

Ian Berget:

I mean, historically, it’s not that we haven’t done humans here at Elara, but so much of what we’re doing is mechanism of action. So you’re inside the body, or there’s an operating theater of sorts, and you’re operating on the body. So you don’t usually have to focus on interacting with a person face-to-face and talking with them.

Mike Kennedy:

Right, and just naturally, when we look at other people, we can tell if things seem off or unnatural. I think it’s something that’s just physiologically built into us. And so having the 3D character feel off really broke some of the immersion.

Mike Kennedy:

There are a couple of ways to go about addressing this issue. One is to stylize your character to the point where it makes sense that the user would be looking at something that’s more of a cartoon and less of a real human. The other is to very much home in on that realism.

Ian Berget:

Right, remove as much of the uncanny portion of it as is possible. And of course, there will always be some small things about it that aren’t 100%, but if you get close enough, there are people at this stage that will forgive you for not having it be 100% real.

Ian Berget:

There are also some techniques that can more or less replicate the feeling of speaking to someone or looking at someone directly. Mike, if you could pull up the Microsoft Capture Studio website. We’ve worked with them in the past.

Mike Kennedy:

Yeah.

Ian Berget:

And I think that that’s a good example of a way that you can kind of get past that uncanny valley, is with a full-on, 360-degree performance capture, but it does have its limitations.

Ian Berget:

If we look at it, we see there’s a lot of detail that goes into these. And there we go, not only humans, but animals as well, clearly, and other things.

Mike Kennedy:

Right.

Ian Berget:

And they’re able to create these holograms, these VR models that are basically a video streamed back in full 3D, the full 360 degrees around, and you could probably even go under the feet if you wanted. The motion and image are very natural, because it’s effectively a video at the end of the day.

Mike Kennedy:

Right.

Ian Berget:

A very detailed, 360-degree video projected and recreated as a mesh. You can really get up close to it, get far from it, et cetera, which is fantastic, but you can’t exactly do anything procedural with it, or at least only within certain limits.

Ian Berget:

And you also cannot do this at your own studio. This is not something that you could very easily set up. Maybe you could get an Xbox scan, but it wouldn’t be to this level of detail. It would be quite expensive.

Mike Kennedy:

Right, and like you said, it’s a movie being played back, really. It’s not as dynamic as, say, having your own 3D character model with a rig set up for you to use. So in that way, your motions are limited.

Mike Kennedy:

What I’ve been thinking about trying to do is find a flexible approach, at least semi-realistic, or realistic enough to keep the uncanny valley feeling out of our character models, and to do that in a way that is efficient and won’t require a whole in-house character modeling team.

Ian Berget:

Exactly. At the end of the day, it all comes down to expense. It’s not that we couldn’t, but it wouldn’t be in any way efficient to invest so many internal resources in homing in really closely on the character design and animation by hand. It’s certainly something that we have done, but it’s not always the right choice.

Mike Kennedy:

Definitely, and it also depends on the piece, right? So if the piece is really focused around an interaction with a single character, and that’s where you’re spending most of your time, then yeah, it makes sense to put more resources into that character. But if we have a character as a prop, similar to … Or not just a prop, but a single piece among other pieces, then it’s difficult to give it the attention that it needs without spending too much time on it.

Ian Berget:

Exactly.

Mike Kennedy:

So that being said, I was doing a little bit of investigation to see what was out there, and of course my mind goes to 3D scanning. Since we’ve done something similar with Microsoft, I had to imagine that there is something out there for that. I stumbled upon this site, Renderpeople.com, and they had a couple of free models available. And from what I can tell, these look fantastic. There are a few that are just people who have been simply posed, and a few that are even pre-rigged.

Mike Kennedy:

What I really liked about this is that when I choose to download, I can choose my preferred file format. In our case, we’re working directly in the Unreal Engine, so they have those assets pre-packaged and ready to be imported into the engine, which was really nice. They also have a couple of other platforms that they’re readily available for, and they even had a couple with some animations preloaded on them.

Mike Kennedy:

Yeah, so it’s not enough to just have a realistic-looking 3D model. One of the biggest factors I’ve found in escaping the uncanny valley is giving your character realistic movement.

Ian Berget:

Right. This gets into a bit of motion capture. It’s possible that some of this was hand-animated to some extent, but most likely they have motion capture suits, they’ve performed motion capture, and then they’ve taken these already very nice-looking models, rigged them, and mapped the motion capture onto them. And that’s different from Microsoft, of course, because Microsoft’s approach is not based on rigging. It’s just positions of vertices and diffuse-channel color changes, baked in more or less frame by frame, versus this, which is effectively still a traditional game model, but the production put into it is much more efficient because of that motion capture.
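The retargeting step Ian describes here, carrying each captured joint’s rotation over to the corresponding bone on the character’s rig, can be sketched in a few lines. This is a toy, engine-agnostic Python illustration rather than Unreal Engine code, and every bone name in it is a hypothetical example:

```python
# Toy sketch of motion capture retargeting: per-frame joint rotations from
# the capture skeleton are carried over to the character rig through a
# bone-name correspondence. All bone names here are hypothetical.

BONE_MAP = {
    "mocap_hips": "pelvis",
    "mocap_spine": "spine_01",
    "mocap_head": "head",
}

def retarget_frame(mocap_frame):
    """Map one frame of captured joint rotations onto the rig's bones."""
    rig_frame = {}
    for mocap_bone, rotation in mocap_frame.items():
        rig_bone = BONE_MAP.get(mocap_bone)
        if rig_bone is not None:  # skip joints the rig doesn't have
            rig_frame[rig_bone] = rotation
    return rig_frame

# One captured frame: Euler rotations (degrees) per mocap joint
frame = {"mocap_hips": (0, 5, 0), "mocap_head": (10, 0, 0)}
print(retarget_frame(frame))  # {'pelvis': (0, 5, 0), 'head': (10, 0, 0)}
```

A real retargeting pipeline also has to account for differing rest poses, bone roll, and proportions between the two skeletons, but the core idea is this same name-to-name correspondence.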

Mike Kennedy:

Totally. It’s astounding how much of an effect just a simple idle breathing animation has on the realism of your character.
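The idle breathing Mike mentions can even be approximated procedurally when no captured idle is available, by driving a chest bone’s scale (or a blend-shape weight) with a slow sine wave. This is an engine-agnostic Python sketch; the rate and amplitude are illustrative guesses, not tuned numbers:

```python
import math

# Procedural idle "breathing": drive the chest bone's scale with a slow
# sine wave instead of leaving the character frozen in a static pose.
# The constants below are plausible-sounding placeholders, not tuned values.

BREATHS_PER_MINUTE = 14   # roughly a resting adult rate
AMPLITUDE = 0.02          # +/-2% chest scale

def chest_scale(t_seconds):
    """Chest scale factor at time t for a gentle breathing cycle."""
    freq_hz = BREATHS_PER_MINUTE / 60.0
    return 1.0 + AMPLITUDE * math.sin(2.0 * math.pi * freq_hz * t_seconds)

# Sample a few points across one breath cycle
for t in (0.0, 1.0, 2.0, 3.0, 4.0):
    print(round(chest_scale(t), 4))
```

In practice a value like `chest_scale(t)` would be fed into the rig every frame from the engine’s update loop, and layered under whatever animation is already playing.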

Ian Berget:

Mm-hmm (affirmative). Not just completely emotionless, blinking [crosstalk 00:09:06].

Mike Kennedy:

Yeah.

Ian Berget:

Yeah.

Mike Kennedy:

Definitely. It’s the difference between them feeling like, “Oh, this is a very realistic looking doll or a wax figure,” versus, “Oh, this is a breathing being.”

Ian Berget:

Yeah, exactly what we’re going for. So how does that look in the engine?

Mike Kennedy:

Yeah, so if we jump back into our engine, I just grabbed the one called RP Manuel rigged, which should be the dancing character. Let’s see if I can find him in our…

Ian Berget:

You ended up in the wall, of course.

Mike Kennedy:

Yep, there he is.

Ian Berget:

It looks like it is more or less to scale, which is nice.

Mike Kennedy:

Yeah.

Ian Berget:

This room is of course from the Learn tab in the Epic Games launcher, one of the realistic rendering samples, though they may have moved it since 4.25 came out. They’ve got another Archviz sample instead, but still, kind of just works right out of the box, which is nice.

Mike Kennedy:

Right. Yeah, let’s run this, and let’s see how everything looks. Let me freeze my window size here a little bit.

Ian Berget:

Perfect.

Mike Kennedy:

Okay, I notice … There he is.

Ian Berget:

He’s doing his thing. And the lighting’s already there, and all the motion, and it kind of just works, which is the great thing about having really good sources for … And he’s in the wall … Really good sources for these types of animations.

Ian Berget:

And the other nice thing about this, and I think, Mike, you’ll probably agree, is that we wouldn’t need a motion capture studio ourselves. We wouldn’t need to produce all the 3D animations in-house if we didn’t want to, because there are many studios that are dedicated to this type of motion capture process. If you can get the model, if you can get the rig, you can do a lot of work in collaboration with interactive partners to produce this type of animation for a project.

Mike Kennedy:

Definitely, for sure, and that’s sort of where websites like Renderpeople.com come in, where they’ve already got some pre-made people, and you can just browse according to what you’re looking for.

Mike Kennedy:

Of course, depending on what project you’re trying to do, you may need an order that is more customized to your project, but I’m sure there are options out there. Maybe Renderpeople.com has somewhere you can set up an order like that, or there might be another website that has something like that. I haven’t had a chance to really dig that far into it yet, but I totally agree.

Mike Kennedy:

And the other thing that is pretty incredible is there are other tools online. I’m thinking specifically about rigging, because that can be particularly challenging.

Ian Berget:

Is it maybe Mixamo that you’re thinking of, that has that automatic rigger, perhaps?

Mike Kennedy:

Exactly, that is what I’m thinking.

Mike Kennedy:

Let’s see. Yeah. So I’ve used this service before. It’s been a while, but when I did, it was as easy as uploading the OBJ file, so long as it was a humanoid model, and then it would generate the skeleton that I needed.

Ian Berget:

Fantastic. And I don’t know how recently it was, but Adobe of course acquired Mixamo. So if you are interested in getting a hold of it, I’m pretty sure it’s available, not through the Adobe Creative Suite, I don’t believe, but through one of the additional Adobe licenses.

Mike Kennedy:

Gotcha. Okay, yeah. Yeah, I think when I used it, it hadn’t been acquired yet. So here we go. You can upload your character once you sign in, and it also has these pre-built animations, which I believe may be motion captured. Maybe not, I’m not sure, it’s kind of hard to tell, but it’s a nice service regardless, if you’re looking for a quick way to rig up a character model.

Ian Berget:

Definitely. Well, I think that covers the majority of it for today, and we’ll certainly do another one of these as we go further down this pipeline, and try out some new, different things. Thank you, Mike, very much for going through and talking about the process.

Mike Kennedy:

Yeah, most definitely.

Ian Berget:

All right. I think we’ll sign off there for today. Talk to you all later.

Mike Kennedy:

All right, have a great day.

Ian Berget:

Goodbye.