ARCore is a platform for building augmented reality apps on Android. Augmented Faces is a subsystem of ARCore that lets your app automatically identify different regions of a detected face and overlay textures and 3D models on those regions.

What you'll build

In this codelab, you'll build an Android app that applies custom effects to a face detected by the front-facing (selfie) camera. The app will:

  • Attach a custom 3D animal nose and two "ears" to the left and right sides of the forehead
  • Overlay a custom texture on the face to add freckles and shading around the eyes

What you'll learn

  • How to configure an ARCore session for the front-facing (selfie) camera with face mesh support
  • How to import Sceneform assets and attach them to the regions of a detected face
  • How to overlay a 2D texture on a detected face and customize it

Hardware requirements

  • An ARCore supported device
  • A USB cable to connect your device to your development machine

Software requirements

  1. Android Studio version 3.1 or later
  2. Google Sceneform Tools (Beta) plugin
  3. The latest version of ARCore. This APK is automatically installed on your ARCore supported device via the Play Store. If your device doesn't have the required version of ARCore, you can sideload it.
  4. The latest ARCore SDK for Sceneform (sceneform-android-sdk-x.y.z.zip), which includes the Sceneform sample projects used in this codelab
  5. GIMP for custom asset creation (optional section)

Install the Google Sceneform Tools (Beta) plugin

  1. In Android Studio, open the Plugins settings (File > Settings > Plugins on Windows/Linux, or Android Studio > Preferences > Plugins on macOS).
  2. Click Browse repositories, and install the Google Sceneform Tools (Beta) plugin.

Note: You will likely have to restart Android Studio after updating or installing a new version of the Google Sceneform Tools (Beta) plugin.

Open the Sceneform Augmented Faces sample app

  1. Extract the zip file sceneform-android-sdk-x.y.z.zip.
  2. Launch Android Studio, and choose Open an existing Android Studio project.
  3. Navigate to the unzipped SDK folder sceneform-android-sdk/samples/hellosceneform. (Note: do not point to the higher-level folder sceneform-android-sdk; navigate all the way to hellosceneform.)
  4. Click Open to open the hellosceneform project, and wait for Android Studio to finish syncing it.

Note: You can ignore any warnings about obsolete or deprecated APIs.

Now you have all the software required to complete this tutorial.

Configure your project's build.gradle files

Make sure your project's build.gradle includes Google's Maven repository:

allprojects {
    repositories {
        google()
        ...
        mavenLocal()
    }
}

Confirm your app's build.gradle has the following entries:

android {
    ...
    defaultConfig {
        // Sceneform requires minSdkVersion >= 24.
        minSdkVersion 24
        ...
    }
    // Sceneform libraries use language constructs from Java 8.
    // Add these compile options if targeting minSdkVersion < 26.
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    ...
    // Provides ARCore Session and related resources.
    implementation 'com.google.ar:core:1.8.0'

    // Provides ArFragment, and other UX resources.
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.8.0'

    // Alternatively, use ArSceneView without the UX dependency.
    implementation 'com.google.ar.sceneform:core:1.8.0'
}

Update your AndroidManifest.xml

Confirm your app's manifest file (in Android Studio, app > manifests > AndroidManifest.xml) is set to "AR Required" and has CAMERA access.

<!-- Both "AR Optional" and "AR Required" apps require CAMERA permission. -->
<uses-permission android:name="android.permission.CAMERA" />

<!-- Indicates that app requires ARCore ("AR Required"). Ensures app is only
     visible in the Google Play Store on devices that support ARCore.
     For "AR Optional" apps remove this line. -->

<uses-feature android:name="android.hardware.camera.ar"  android:required="true"/>

<application>
    ...
    <!-- Indicates that app requires ARCore ("AR Required"). Causes Google
         Play Store to download and install ARCore along with the app.
         For an "AR Optional" app, specify "optional" instead of "required".
    -->
    <meta-data android:name="com.google.ar.core" android:value="required" />
     ...
</application>

Build and run the sample app

  1. This codelab requires a physical device; it is not supported in the Android Emulator. Connect the device to your development machine via USB. (Optional: see the quickstart for detailed steps.)
  2. To verify your device is connected via USB to your development machine, open a command console and enter this command:

    adb devices

    This connects to the device and outputs its serial number. The output should look something like this:

List of devices attached
* daemon not running; starting now at tcp:5037
* daemon started successfully
712KPJP1076540 device

  3. In Android Studio, click Run. Then choose your device as the deployment target and click OK to launch the sample app on your device.
  4. Allow the requested permissions on your device.
  5. The app should launch and show the back-facing camera view.

Now we're ready to customize the base HelloSceneform app to use Augmented Faces features.

Import assets

In this step you'll import 3D models and textures into your project. The import process generates *.sfa and *.sfb files from the source assets.

First, copy the pre-designed assets from /sampledata/models in the AugmentedFaces sample to the corresponding folder in the HelloSceneform sample:

cp -R <sceneform-android-sdk>/samples/augmentedfaces/app/sampledata/models/ <sceneform-android-sdk>/samples/hellosceneform/app/sampledata/models/

Then import the assets into the app:

  1. In Android Studio, in the Project window, navigate to app/sampledata/models.
  2. Right-click the model fox_face.fbx and select Import Sceneform Asset to begin the import process.
  3. Set the .sfb output path to src/main/res/raw/fox_face.sfb. Leave the rest of the fields at their default values.
  4. Click Finish to run the import.

These values are used by the sceneform.asset() entry in the app's build.gradle and determine where the *.sfa and *.sfb files are generated in your project.
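For reference, the entry that the import step typically adds to app/build.gradle looks like the following sketch. The paths assume the settings above; the exact formatting can vary by plugin version:

sceneform.asset('sampledata/models/fox_face.fbx', // Source asset path
        'default',                                // Material path
        'sampledata/models/fox_face.sfa',         // .sfa output path
        'src/main/res/raw/fox_face')              // .sfb output path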

  5. Make sure the fox face asset imported properly: in the Project window, navigate to app/res/raw and ensure that fox_face.sfb is there.

  6. (Optional) If you plan to create your own custom assets later, copy or download the canonical reference face mesh, canonical_face_mesh.fbx, from the sceneform-android-sdk/assets/ folder. We won't use it in this codelab, but you may find it useful to open in 3D modeling software.

Request CAMERA permission

When using ArFragment, the CAMERA permission request is handled for us, so we will skip this section.

Configure the ARCore session

Augmented Faces requires the ARCore session to be configured for the front-facing (selfie) camera with face mesh support enabled. To do this in Sceneform, extend the ArFragment class as FaceArFragment.

  1. In Android Studio, in the Project window, navigate to app > java.
  2. Right-click com.google.ar.sceneform.samples.hellosceneform, and choose New > Java Class.
  3. In the Create New Class dialog, enter FaceArFragment in the Name field and ArFragment in the Superclass field
  4. Click OK to create the new class.
  5. Add these imports to the new FaceArFragment.java file. Note that an import for ArFragment was auto-generated in the previous steps; delete that auto-generated import, since it is already included in the list below.
import android.os.Bundle;
import android.support.annotation.Nullable;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.FrameLayout;
import com.google.ar.core.Config;
import com.google.ar.core.Config.AugmentedFaceMode;
import com.google.ar.core.Session;
import com.google.ar.sceneform.ux.ArFragment;
import java.util.EnumSet;
import java.util.Set;
  6. Override the session configuration to use the front-facing (selfie) camera and the face mesh by adding the following overrides to the FaceArFragment class.
@Override
protected Set<Session.Feature> getSessionFeatures() {
  // Configure Front Camera
  return EnumSet.of(Session.Feature.FRONT_CAMERA);
}

@Override
protected Config getSessionConfiguration(Session session) {
  Config config = new Config(session);
  // Configure 3D Face Mesh
  config.setAugmentedFaceMode(AugmentedFaceMode.MESH3D);
  return config;
}
  7. Because the HelloSceneform sample app has features that aren't required for this codelab, disable them by adding the following override to the FaceArFragment class.
/**
 * Override to turn off planeDiscoveryController. Plane trackables are not supported with the
 * front camera.
 */
@Override
public View onCreateView(
    LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) {
  FrameLayout frameLayout =
      (FrameLayout) super.onCreateView(inflater, container, savedInstanceState);


  getPlaneDiscoveryController().hide();
  getPlaneDiscoveryController().setInstructionView(null);

  return frameLayout;
}

Now let's make use of this FaceArFragment subclass in HelloSceneformActivity and its layout.

  1. In HelloSceneformActivity, change "private ArFragment arFragment" to "private FaceArFragment arFragment".
  2. In the Android Studio Project window, navigate to app > res > layout and open activity_ux.xml. Point the fragment's android:name attribute at FaceArFragment instead of ArFragment, as shown in the sketch after this list.
  3. In HelloSceneformActivity, cast the output of getSupportFragmentManager().findFragmentById() to FaceArFragment instead of ArFragment.
  4. Test your app: in Android Studio, click Run. You should see the front-facing camera preview, with no face effects yet.
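A minimal sketch of the layout change, assuming the sample keeps its default fragment id ux_fragment and the package name used earlier in this codelab:

<fragment
    android:id="@+id/ux_fragment"
    android:name="com.google.ar.sceneform.samples.hellosceneform.FaceArFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />

The matching cast in HelloSceneformActivity would then be:

arFragment = (FaceArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);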

Create Renderables

For tips and best practices on creating custom assets, refer to Creating Assets for Augmented Faces (Optional).

Use ModelRenderable.Builder to load the *.sfb models in the HelloSceneformActivity file at runtime:

Declare the following renderable field in the HelloSceneformActivity class:

private ModelRenderable faceRegionsRenderable;

Replace the "ModelRenderable.builder()" call in HelloSceneformActivity.java in the sample app with the face renderable call below

// Load the face regions renderable.
// To ensure that the asset doesn't cast or receive shadows in the scene,
// ensure that setShadowCaster and setShadowReceiver are both set to false.
ModelRenderable.builder()
    .setSource(this, R.raw.fox_face)
    .build()
    .thenAccept(
        modelRenderable -> {
          faceRegionsRenderable = modelRenderable;
          modelRenderable.setShadowCaster(false);
          modelRenderable.setShadowReceiver(false);
        });

ArSceneView sceneView = arFragment.getArSceneView();

// This is important to make sure that the camera stream renders first so that
// the face mesh occlusion works correctly.
sceneView.setCameraStreamRenderPriority(Renderable.RENDER_PRIORITY_FIRST);
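The snippet above omits error handling. As a hedged sketch (not part of the sample's own code), you can chain CompletableFuture's exceptionally() onto the build to log load failures; this assumes an android.util.Log import:

ModelRenderable.builder()
    .setSource(this, R.raw.fox_face)
    .build()
    .thenAccept(
        modelRenderable -> {
          faceRegionsRenderable = modelRenderable;
          modelRenderable.setShadowCaster(false);
          modelRenderable.setShadowReceiver(false);
        })
    .exceptionally(
        throwable -> {
          // Hypothetical handler: log the failure and continue without the model.
          Log.e("HelloSceneform", "Unable to load fox_face renderable", throwable);
          return null;
        });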

Add the following imports to HelloSceneformActivity.java; they will be used in later sections.

import com.google.ar.core.ArCoreApk;
import com.google.ar.core.AugmentedFace;
import com.google.ar.core.TrackingState;
import com.google.ar.sceneform.ArSceneView;
import com.google.ar.sceneform.FrameTime;
import com.google.ar.sceneform.Scene;
import com.google.ar.sceneform.rendering.Renderable;
import com.google.ar.sceneform.rendering.Texture;
import com.google.ar.sceneform.ux.AugmentedFaceNode;
import java.util.Collection;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

Detecting the face

When a user's face is detected by the camera, ARCore performs these steps to generate the augmented face mesh, as well as center and region poses:

  1. It identifies the center pose and a face mesh.
  2. The AugmentedFace class uses the face mesh and center pose to identify face region poses on the user's face. These regions are the left forehead, the right forehead, and the tip of the nose.
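For illustration only (this call is not used in the codelab), a region pose can be queried directly from a tracked AugmentedFace; this assumes an import for com.google.ar.core.Pose:

// Query the pose of the nose-tip region on a tracked face.
Pose noseTipPose = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);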

The AugmentedFace class extends the Trackable class. The following code filters the session's trackables for AugmentedFace instances.
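In isolation, the filtering call looks like this (it also appears inside the full update listener in the next section):

Collection<AugmentedFace> faceList =
    sceneView.getSession().getAllTrackables(AugmentedFace.class);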

Render the effect for the face

Rendering the effect involves these steps:

  1. Create an AugmentedFaceNode for each newly detected face and attach it to the scene.
  2. Set the face regions renderable on the node.
  3. Remove any node whose face has stopped tracking.

Extracting the face mesh and rendering the face effect are done in a listener added to the scene, which is called on every processed camera frame.

// Declare as a field of the HelloSceneformActivity class.
private final HashMap<AugmentedFace, AugmentedFaceNode> faceNodeMap = new HashMap<>();

// In onCreate(), after obtaining the ArSceneView (see the previous section).
Scene scene = sceneView.getScene();

scene.addOnUpdateListener(
    (FrameTime frameTime) -> {
      if (faceRegionsRenderable == null) {
        return;
      }

      Collection<AugmentedFace> faceList =
          sceneView.getSession().getAllTrackables(AugmentedFace.class);

      // Make new AugmentedFaceNodes for any new faces.
      for (AugmentedFace face : faceList) {
        if (!faceNodeMap.containsKey(face)) {
          AugmentedFaceNode faceNode = new AugmentedFaceNode(face);
          faceNode.setParent(scene);
          faceNode.setFaceRegionsRenderable(faceRegionsRenderable);
          faceNodeMap.put(face, faceNode);
        }
      }

      // Remove any AugmentedFaceNodes associated with an AugmentedFace that stopped tracking.
      Iterator<Map.Entry<AugmentedFace, AugmentedFaceNode>> iter =
          faceNodeMap.entrySet().iterator();
      while (iter.hasNext()) {
        Map.Entry<AugmentedFace, AugmentedFaceNode> entry = iter.next();
        AugmentedFace face = entry.getKey();
        if (face.getTrackingState() == TrackingState.STOPPED) {
          AugmentedFaceNode faceNode = entry.getValue();
          faceNode.setParent(null);
          iter.remove();
        }
      }
    });

Well done, you have successfully completed the codelab and learned how to configure an ARCore session for the front-facing (selfie) camera, import Sceneform assets, and render 3D models and textures on a detected face.

You can also review the completed augmentedfaces sample application for reference.

In this section, you will start from a template 2D texture for Augmented Faces and create your own customizations, such as adding a virtual face tattoo or creating a virtual mustache.

Enable 2D texture

  1. Copy fox_face_mesh_texture.png from the folder sceneform-android-sdk-master/samples/augmentedfaces/app/src/main/res/drawable-xxhdpi/ to sceneform-android-sdk-master/samples/hellosceneform/app/src/main/res/drawable-xxhdpi/
  2. Add the following code in the appropriate places, as indicated by the comments:
// Define within HelloSceneformActivity class
private Texture faceMeshTexture;

// Load the face mesh texture. Copy this to HelloSceneformActivity class after the Renderable builder ModelRenderable.builder()...
Texture.builder()
    .setSource(this, R.drawable.fox_face_mesh_texture)
    .build()
    .thenAccept(texture -> faceMeshTexture = texture);

// In scene.addOnUpdateListener(), add the null check
...
if (faceRegionsRenderable == null || faceMeshTexture == null) {
...

// In scene.addOnUpdateListener(), set the face mesh texture
...
for (AugmentedFace face : faceList) {
  if (!faceNodeMap.containsKey(face)) {
    ...
    faceNode.setFaceMeshTexture(faceMeshTexture);
    ...
  }
}
  3. Test your app. In Android Studio, click Run. You should see orange shading on your upper eyelids and freckles on your cheeks.
  4. Now that the pre-created fox texture is working, let's customize it.

Customize the 2D texture

  1. Open fox_face_mesh_texture.png in an image editing tool such as GIMP. On macOS, you can open the PNG in the built-in Preview app and use View > Show Markup Toolbar to reveal the image editing toolbar.
  2. Modify the image, and save/export the modified PNG as custom_texture.png.
  3. Copy custom_texture.png to sceneform-android-sdk-master/samples/hellosceneform/app/src/main/res/drawable-xxhdpi/
  4. In HelloSceneformActivity class, replace R.drawable.fox_face_mesh_texture with R.drawable.custom_texture
  5. In Android Studio, click Run to see your custom texture overlaid on your face