VRChat has a feature called OSC trackers. It's a low-cost way to add software-controlled trackers without needing to create custom SteamVR drivers, and it also works on standalone Quest headsets.

For the sake of experimentation, I've been wanting to try using OSC trackers for locomotion "animations", where the locomotion animation logic comes from an external program rather than from an animator controller with animation clips.

(Video: osc-locomotion-f.mp4)

Strangely, I only wrote this down recently, but I'm convinced I had been wanting to do this for more than a year already.

(Image: Discord_wS6Sh7HpZG.png)


Flatscreen third-person games

You've played third-person games; you know what I'm talking about. Grand Theft Auto, Assassin's Creed, Uncharted, Metal Gear Solid: in all of them you control a main character in third person. Even in first-person games, there are NPCs moving around you. There's an entire industry dedicated to making game character movement look good.

The idea is to build or reuse a third-person character controller in an external program, then expose that virtual character as OSC trackers, so that the virtual character drives the avatar in VRChat instead of my physical body.
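To give a concrete idea of what the external program would send, here's a minimal sketch in Python using the python-osc library. VRChat listens for OSC on UDP port 9000 by default, and the documented tracker endpoints live under /tracking/trackers/ (positions in meters, rotations as Euler angles in degrees); the tracker values below are made up for illustration.

```python
# Minimal sketch: feeding VRChat's OSC trackers from an external program.
# Requires the python-osc package (pip install python-osc).
from pythonosc.udp_client import SimpleUDPClient

# VRChat receives OSC on port 9000 by default.
client = SimpleUDPClient("127.0.0.1", 9000)

# Trackers are addressed as /tracking/trackers/1 through /tracking/trackers/8.
# The hip pose below is a made-up example; a real controller would stream
# the simulated character's bone transforms every frame.
client.send_message("/tracking/trackers/1/position", [0.0, 0.9, 0.0])   # meters
client.send_message("/tracking/trackers/1/rotation", [0.0, 45.0, 0.0])  # Euler angles, degrees

# VRChat also accepts /tracking/trackers/head/position and .../rotation,
# which it uses to align the tracker space with the real headset.
client.send_message("/tracking/trackers/head/position", [0.0, 1.6, 0.0])
```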

There are a bunch of interesting talks about this subject. Here are a few of my favorite talks and demos:

https://www.youtube.com/watch?v=KSTn3ePDt50

https://www.youtube.com/watch?v=7S-_vuoKgR4

https://www.youtube.com/watch?v=RCu-NzH4zrs

https://www.youtube.com/watch?v=16CHDQK4W5k

Generally, I think there's a lot of established tech in existing non-VR fields, and I really wonder how it would translate into VR; I'm barely scratching the surface.

Making this project is partly a way for me to start finding out which game animation techniques could be reused in the context of social VR.

VR has its own constraints that put a limit on what's acceptable, and I think a good way to find those constraints is to break them, intentionally or unintentionally.

Motion Matching in Unity by JLPM22

At GDC 2016, Kristjan Zadziuk from Ubisoft presented a technique they devised called "Motion Matching" (see the first video above).

This technique caused a ripple in the games industry: rather than stitching locomotion cycles together from individual animation clips, character locomotion is driven by sampling directly from a specially recorded, labelled, continuous motion capture session.

In effect, this produces very sophisticated character movement in the world, as the character properly shifts its weight when turning or stopping.
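To make the core of the technique concrete, here's a toy sketch of the search step, assuming the capture has already been preprocessed into one feature vector per frame (current pose features concatenated with the desired future trajectory). The feature layout and sizes are invented for illustration, and this is not how any particular implementation is structured; it only shows the principle.

```python
# Toy motion matching search: every few frames, find the mocap frame whose
# features best match the current pose + desired trajectory, and jump
# playback there. Feature sizes here are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(0)
database = rng.normal(size=(10_000, 27))  # e.g. 10k mocap frames, 27 features each

def motion_match(database: np.ndarray, query: np.ndarray) -> int:
    # Brute-force nearest neighbor; real implementations accelerate this
    # (acceleration structures, per-feature weights), but the idea is the same.
    distances = np.sum((database - query) ** 2, axis=1)
    return int(np.argmin(distances))

pose = rng.normal(size=24)       # where the character currently is
trajectory = rng.normal(size=3)  # where the player wants to go
best_frame = motion_match(database, np.concatenate([pose, trajectory]))

# Playback jumps to best_frame and keeps playing the raw capture until
# the next search, typically a handful of frames later.
print(best_frame)
```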

A user by the handle of JLPM22 (Jose Luis Ponton) has published an open implementation of this technique in Unity: