Face Tracking Overview
Face Tracking lets you accurately detect and track human faces. With simple mask tracking, you can quickly apply effects only to a face, such as selective color correction or blurring a person’s face.
However, with Face Tracking, you can track specific parts of the face such as pupils, mouth, and nose, allowing you to isolate and work on these facial features with greater detail. For example, change colors of the eyes or exaggerate mouth movements without frame-by-frame adjustments.
After Effects can also measure facial features. Tracking facial measurements tells you details such as how open the mouth or an eye is. With each data point isolated, you can refine your adjustments with much greater precision. You can also export the detailed tracking data to Adobe Character Animator for performance-based character animation.
The face tracker works largely automatically, but you can obtain better results by starting the analysis on a frame showing a front, upright view of the face. Adequate lighting on the face can improve the accuracy of face detection.
In the Tracker panel, there are two face-tracking options:
- Face Tracking (Outline Only): Use this option if all you want to track is the outline of the face.
- Face Tracking (Detailed Features): Use this option if you want to detect eye (including eyebrow and pupil), nose, and mouth locations, and optionally, extract measurements of various features. This option is required if you want to use the tracking data in Character Animator.
If you are using the Detailed Features option, a Face Track Points effect is applied to the layer. The effect contains several 2D effect control points with keyframes, each attached to a detected facial feature (for example, the corners of the eyes and mouth, the pupils, and the tip of the nose).
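Because these control points are ordinary effect properties, other layers can reference them with expressions. The sketch below, written for the JavaScript expression engine, pins a layer to one tracked point. The layer name "Face footage" and the point name "Left Pupil" are assumptions for illustration; match them to the names in your project and in the Face Track Points effect.

```javascript
// Expression on another layer's Position property (a minimal sketch).
// "Face footage" and "Left Pupil" are placeholder names; check the actual
// layer name and Face Track Points property names in your project.
var src = thisComp.layer("Face footage");
var pt = src.effect("Face Track Points")("Left Pupil"); // tracked 2D point, in the footage layer's space
src.toComp(pt); // convert to composition space so this layer follows that feature
```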
Tracking the outline of a face
1. In After Effects, select File > Import > File. Browse to the location of the footage, and add it to the Project.
2. Drag the footage from the Project panel into a Composition to add a layer.
3. Position the current time indicator (CTI) to a frame showing a front, upright view of the face you want to track.
   Note: Face detection is improved if the initial frame to track has a face looking forward and is oriented upright.
4. Draw a closed mask loosely around the face, enclosing the eyes and mouth. The mask defines the search region to locate facial features. If multiple masks are selected, the topmost mask is used.
5. With the mask selected, select Window > Tracker to open the Tracker panel. Set the tracking Method to Face Tracking (Outline Only).
6. In the Tracker panel, track forward or backward one frame at a time to confirm that tracking is working correctly, and then click the button to analyze all frames.
7. Once the analysis is complete, face tracking data is made available within the composition; the expression sketch after these steps shows one way to use it.
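Face Tracking (Outline Only) records its result as keyframes on the tracked mask, so the mask follows the face over time. As one hedged example of using that data with the JavaScript expression engine, the sketch below centers a layer (for example, a null or an adjustment layer carrying a blur) on the tracked face by averaging the mask's vertices. The layer name "Face footage" and mask name "Mask 1" are placeholders.

```javascript
// Expression on another layer's Position property (a minimal sketch).
// Assumes the tracked footage layer is named "Face footage" and the
// tracked mask is "Mask 1"; rename both to match your project.
var srcLayer = thisComp.layer("Face footage");
var verts = srcLayer.mask("Mask 1").maskPath.points(); // mask vertices in layer space
var sx = 0, sy = 0;
for (var i = 0; i < verts.length; i++) {
  sx += verts[i][0];
  sy += verts[i][1];
}
var centroid = [sx / verts.length, sy / verts.length]; // average vertex position
srcLayer.toComp(centroid); // follow the face in composition coordinates
```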
Tracking detailed features and extracting facial measurements
1. In After Effects, select File > Import > File. Browse to the location of the footage, and add it to the Project.
2. Drag the footage from the Project panel into a Composition to add a layer.
3. Position the current time indicator to a frame showing a front, upright view of the face you want to track.
   Note: Face detection is improved if the initial frame to track has a face looking forward and is oriented upright.
4. Draw a closed mask loosely around the face, enclosing the eyes and mouth. The mask defines the search region to locate facial features. If multiple masks are selected, the topmost mask is used.
5. With the mask selected, select Window > Tracker to open the Tracker panel. Set the tracking Method to Face Tracking (Detailed Features).
6. In the Tracker panel, track forward or backward one frame at a time to confirm that tracking is working correctly, and then click the button to analyze all frames.
7. After the analysis is complete, the tracking data is made available within a new effect called Face Track Points. You can access the face tracking data within the composition or in the Effect Controls panel (Window > Effect Controls).
8. Move the current-time indicator to a frame showing a neutral expression on the face (the rest pose). Face measurements on other frames are relative to the rest pose frame. In the Tracker panel, click Set Rest Pose.
9. In the Tracker panel, click Extract & Copy Face Measurements. A Face Measurements effect is added to the layer, and keyframes are created based on calculations made from the Face Track Points keyframe data. The Face Measurements keyframe data is copied to the system clipboard for use in Character Animator; the expression sketch after these steps shows how to use the Face Measurements effect directly in After Effects.
   Note: The keyframes for Face Measurements are generated based on the Face Track Points keyframe data, relative to the Rest Pose (see Step 8).
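The copied keyframe data is intended for pasting onto a puppet in Character Animator, but the Face Measurements effect that remains on the layer can also drive properties inside After Effects. A minimal sketch for the JavaScript expression engine follows; the layer name "Face footage", the measurement name "Left Eyelid Openness", and the input range passed to linear() are assumptions to verify against the Face Measurements effect in the Effect Controls panel.

```javascript
// Expression on some other property, for example a blur amount or a slider
// (a minimal sketch; the names and ranges below are placeholders).
var openness = thisComp.layer("Face footage")
                       .effect("Face Measurements")("Left Eyelid Openness");
// Remap the measurement into a 0-100 output range for the driven property.
// The 0-20 input range is a guess; inspect the keyframed values for real bounds.
linear(openness, 0, 20, 0, 100);
```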
Face tracking data reference
Face Track Points
The Face Track Points effect contains effect control points for several facial features, which you can view in the Timeline panel.
Face Measurements
If you have used the Detailed Features option, you can extract even more information in the form of parametric measurements of facial features, known as Face Measurements. All measurements shown for the face you tracked are relative to the Rest Pose frame, and each data point listed below corresponds to a property of the Face Measurements effect (see the expression sketch at the end of this reference).
Face Offset
Indicates the position of the face as an offset that is 0% at the Rest Pose frame. The following data points provide the offset values along the x, y, and z axes:
- Offset X
- Offset Y
- Offset Z
Face Orientation
Indicates the three-dimensional orientation of the face. Orientation is measured using the following data points for the x, y, and z axes:
- Orientation X
- Orientation Y
- Orientation Z
Left Eye
Indicates various points of measurement for the left eye, and includes the following data points:
- Left Eyebrow Distance
- Left Eyelid Openness
- Left Eye Gaze X
- Left Eye Gaze Y
Right Eye
Indicates various points of measurement for the right eye, and includes the following data points:
- Right Eyebrow Distance
- Right Eyelid Openness
- Right Eye Gaze X
- Right Eye Gaze Y
Mouth
Indicates various points of measurement for the mouth, and includes the following data points:
- Mouth Offset X
- Mouth Offset Y
- Mouth Scale Width
- Mouth Scale Height
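Each of these measurement names corresponds to a property of the Face Measurements effect, so any of them can be wired to other layers with expressions. As a final hedged sketch for the JavaScript expression engine, the expression below scales a graphic (for example, a cartoon mouth) from the mouth measurements; the layer name, the property names, and the assumption that the values behave like percentages relative to the rest pose should all be checked against your footage.

```javascript
// Expression on a graphic layer's Scale property (a minimal sketch).
// "Face footage" is a placeholder layer name; verify the measurement
// property names in the Face Measurements effect before relying on them.
var fm = thisComp.layer("Face footage").effect("Face Measurements");
var w = fm("Mouth Scale Width");
var h = fm("Mouth Scale Height");
// Assumption: the measurements read roughly 100 at the rest pose, so dividing
// by 100 scales the artwork relative to its authored size. Remap as needed.
[value[0] * (w / 100), value[1] * (h / 100)];
```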