Replacing my face with AR

ApesOnline
3 min read · May 27, 2021


I know almost nothing about 3D modeling, animation, or face tracking, but it turns out I didn’t need to in order to get to work replacing my face with AR. It was a nearly bug-free experience.

The Back Story

In early 2021 I started writing down an idea for 3D animated, face-tracked characters that could be used as an augmented reality filter for streaming or video chat. I wanted this for a couple of reasons: (1) it would be so much cooler to stream as an avatar than with my own face, and (2) I want to limit the number of photos and videos of myself on the internet. My goal is a super fun and engaging video experience designed for personal privacy.

So, as engineers do, I wrote a technical specification. I was ready to get a move on, but I needed to find a designer who could create the 3D models. Months later, the phenomenal artist secretwaves and I collided on aol.gg, and here we are today building this out.

What We’ve Learned So Far

Our work is still in progress. Here’s what we’ve learned so far using the open-source Jeeliz library.

Jeeliz takes .obj files and converts them to JSON, then uses the Three.js 3D rendering library, together with a deep learning neural network running in WebGL, to render a 3D model in the browser that animates in response to face tracking. There are three components to creating AR characters that can replace your face online:

  1. Morphs — secretwaves created the 12 .obj morphs. These are the primary 3D assets: a base morph and 11 others that each express a different facial feature (e.g. mouth open, left eye shut). Jeeliz smoothly blends between them as it tracks your facial movements (the first sketch after this list shows the idea).
  2. Mesh converter — a binary script provided in the Jeeliz library that takes the .obj morph files and processes them, outputting a JSON blob that can be used in the browser.
  3. Web app — the Jeeliz GitHub repository provides some demo web apps, but we had to replicate one and move files around so they could be found when we served the app from localhost using the NPM package serve. In our fork of the repository, we have a tutorial directory containing a single HTML file which can be served statically and which pulls in local JavaScript files as well as the custom augmented reality 3D character (see the second sketch below).
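
To make the morph blending concrete, here’s a minimal sketch of the idea in Three.js (assuming a recent release, r133 or later, where materials pick up morph targets automatically). Everything specific to our character is a stand-in: the cube, its single stretch morph, and the hypothetical getExpressionCoefficients() function that fakes the per-frame values a face tracker like Jeeliz reports.

```javascript
import * as THREE from 'three';

// Basic scene plumbing.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// A cube stands in for the avatar head, with a single morph target that
// stretches it vertically. A real character would carry 11 such morphs,
// one per facial feature.
const geometry = new THREE.BoxGeometry(1, 1, 1);
const stretched = geometry.attributes.position.clone();
for (let i = 0; i < stretched.count; i++) {
  stretched.setY(i, stretched.getY(i) * 1.5);
}
geometry.morphAttributes.position = [stretched];
const avatar = new THREE.Mesh(geometry, new THREE.MeshNormalMaterial());
scene.add(avatar);

// Hypothetical stand-in for the per-frame expression coefficients a face
// tracker reports (values in [0, 1], one per morph). Here we just
// oscillate over time instead of tracking a real face.
function getExpressionCoefficients(t) {
  return [0.5 + 0.5 * Math.sin(t / 500)];
}

function onFrame(t) {
  const coefficients = getExpressionCoefficients(t);
  for (let i = 0; i < coefficients.length; i++) {
    avatar.morphTargetInfluences[i] = coefficients[i];
  }
  renderer.render(scene, camera);
  requestAnimationFrame(onFrame);
}
requestAnimationFrame(onFrame);
```

In the real app, the fake oscillator is replaced by the tracker’s output and the cube by the converted character.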
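
The serving step is simpler than it sounds once the files line up. Assuming the HTML file and the converted character JSON sit in the same directory, running npx serve . from that directory starts a static server on localhost, and a quick fetch confirms the blob is reachable from the served root. The character.json filename here is hypothetical; use whatever name the mesh converter produced.

```javascript
// Minimal sketch: fetch the converted character JSON from the statically
// served directory and inspect it before wiring it into the filter.
// The path './character.json' is hypothetical.
fetch('./character.json')
  .then((response) => {
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    return response.json();
  })
  .then((character) => {
    // Log the top-level keys to confirm the blob the converter produced
    // is actually reachable from the served root.
    console.log('Loaded character blob with keys:', Object.keys(character));
  })
  .catch((err) => console.error('Could not load character JSON:', err));
```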

Up Next

Now that we know we can get the entire app running with a custom 3D character, our next steps are updating the morphs to make the model look good and creating textures that bring it to life. Then we want to build out the grand vision: making these characters part of ubiquitous privacy tech that protects everyday users. Imagine being able to join any online platform and selectively decide how much of your real face to share and how much is augmented reality. That’s the power I want us all to be able to wield.

Maybe in a future iteration we go all in on motion capture like codemiko…

Follow: https://twitter.com/v_stickykeys

Contribute: https://github.com/v-stickykeys

Originally published at https://stickykeys.substack.com.
