User Guide

Behaviors: body (directly controlled)

Limb IK: Control bending and lengths of arms and legs

Limb IK controls the bend directions and stretching of legs and arms. For example, pin a hand in place while moving the rest of the body or make a character’s feet stick to the ground as it squats in a more realistic way. Use this behavior to control the bending of elbows, knees, and ankles and make the movements look natural. IK stands for Inverse Kinematics, a method for calculating, for example, entire arm positions by specifying only the hand position.
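
For intuition, a two-bone IK solve of the kind Limb IK performs can be sketched with the law of cosines. This is a conceptual illustration only, not Character Animator's implementation; the function and parameter names are hypothetical:

```python
import math

def solve_two_bone_ik(shoulder, wrist, upper_len, fore_len, bend_sign=1.0):
    """Conceptual two-bone IK solve (law of cosines): given 2D shoulder and
    wrist positions and the two bone lengths, return an elbow position.
    bend_sign picks which side of the shoulder-wrist line the elbow goes."""
    sx, sy = shoulder
    wx, wy = wrist
    dx, dy = wx - sx, wy - sy
    dist = math.hypot(dx, dy) or 1e-9          # avoid division by zero
    # Clamp the reach so a triangle exists (the real behavior can stretch instead).
    reach = max(abs(upper_len - fore_len), min(dist, upper_len + fore_len))
    # Distance from shoulder to the elbow's projection onto the shoulder-wrist line.
    a = (upper_len ** 2 - fore_len ** 2 + reach ** 2) / (2 * reach)
    h = math.sqrt(max(upper_len ** 2 - a ** 2, 0.0))   # perpendicular offset
    ux, uy = dx / dist, dy / dist                       # unit shoulder -> wrist
    px, py = -uy * bend_sign, ux * bend_sign            # perpendicular direction
    return (sx + ux * a + px * h, sy + uy * a + py * h)
```

Dragging the wrist farther than the combined bone lengths clamps the reach in this sketch; the Stretchiness parameter described below instead lets the bones extend.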

Setup

  1. In the Properties panel, under the Limb IK behavior, use the Apply to drop-down to choose Arms, Legs, or Both.

    Use the drop-down to apply the behavior to arms, legs, or both

  2. For each arm, add handles at the locations of the shoulder joint, elbow, and wrist. For each leg, add handles on each hip, knee, heel, and toe.

  3. For Arms: apply the Left Shoulder, Left Elbow, and Left Wrist tags to the left arm’s handles, and the Right Shoulder, Right Elbow, and Right Wrist tags for the right arm. The handle with the Wrist tag is usually the same one that will be moved, such as with a Draggable tag. The Shoulder tags are best placed in the parent group of the arms.

    If Left Shoulder and Right Shoulder handles are not present, the Neck-tagged handle will be used instead.

    Apply handles

    Note:

    The location of the Elbow handle relative to the line between the Shoulder and Wrist handles will determine the direction that the arm can bend. The placement of the Elbow handle is important if the arm was drawn straight (for example, arms in an A- or T-pose).

  4. For Legs: apply the Left Hip, Left Knee, Left Heel, and Left Toe tags to the left leg’s handles, and the Right Hip, Right Knee, Right Heel, and Right Toe tags for the right leg.

    Note:

    Limb IK and Walk behaviors on the same puppet might cause issues because both want to control the same handles. If you do have both behaviors applied, consider turning off Limb IK when animating walk cycles, and turning off Walk when doing general body animation. Limb IK should work fine in side and 3/4 views.

Limb IK controls

  • Apply to chooses whether IK is applied to Arms, Legs, or Both.
  • Stretchiness controls how much the arms can stretch beyond their length at their rest pose. This setting works in combination with the force being used to move the hand (for example, a Draggable wrist). Lower values restrict stretching, whereas higher values produce very stretchy arms.
  • Bend Strength controls how much to constrain the bend direction. 0% allows bend direction to change, whereas higher values restrict bend direction more.
  • Auto Arm Bend controls whether automatic bending occurs. This option is enabled by default.
  • Elbow Flip Threshold controls when the elbow bend reverses direction, based on the direction that the upper arm (from shoulder to elbow) points relative to the vertical axis. For example, at 0°, the bend direction changes as the upper arm crosses the vertical axis, pointing either up or down. At 30°, the bend direction flips when the upper arm is 30° off vertical. Typical values are between -90° and 90°.
  • Smooth Transition controls easing into the reversed bend direction. Lower values can produce an undesired quick flip of the elbow, so higher values are recommended for continuous movement.
  • Reverse Arm Bend Left and Reverse Arm Bend Right switch the elbow direction that left and right arms bend, relative to their default bend direction based on the locations of the Elbow handles.
  • Reverse Leg Bend Left and Reverse Leg Bend Right switch the bend direction of the left and right leg.
  • Ankle Flexibility controls how softly the ankles rotate with the leg movement.
  • Ground Detection prevents the feet from moving below their initial vertical position.
  • Initial Foot Pinning pins the feet to their initial positions.
  • Pin specifies positions for wrists and heels relative to specific coordinates.
    • Left Wrist uses the drop-down to choose whether to pin the tagged handle to the puppet, behavior, or scene.
    • Left Wrist Strength controls the influence of the Limb IK behavior on the left wrist handle on a scale of 0 to 100%.
    • Right Wrist uses the drop-down to choose whether to pin the tagged handle to the puppet, behavior, or scene.
    • Right Wrist Strength controls the influence of the Limb IK behavior on the right wrist handle on a scale of 0 to 100%.
    • Left Heel uses the drop-down to choose whether to pin the tagged handle to the puppet, behavior, or scene.
    • Left Heel Strength controls the influence of the Limb IK behavior on the left heel handle.
    • Right Heel uses the drop-down to choose whether to pin the tagged handle to the puppet, behavior, or scene.
    • Right Heel Strength controls the influence of the Limb IK behavior on the right heel handle.
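
As a rough illustration of the Elbow Flip Threshold logic described above (this is an assumption about the mechanism, not the shipped implementation; all names are hypothetical):

```python
import math

def bend_should_flip(shoulder, elbow, threshold_deg):
    """Conceptual Elbow Flip Threshold check: the bend direction reverses
    once the upper arm (shoulder -> elbow) crosses a line threshold_deg
    away from the vertical axis."""
    dx = elbow[0] - shoulder[0]
    dy = elbow[1] - shoulder[1]
    # Signed angle of the upper arm measured from vertical (0 = straight up,
    # +90 = horizontal right), assuming y increases downward on screen.
    angle = math.degrees(math.atan2(dx, -dy))
    return angle > threshold_deg
```

At a threshold of 0°, the flip happens as soon as the upper arm crosses vertical; at 30°, the arm must swing 30° past vertical first.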

Leader/Follower Behavior

Leader/Follower behavior is a type of Body (directly controlled) behavior. The behavior has two handle tags: Leader and Follower. Handles with the Follower tag move to the exact location of the Leader handle. If there is just one Leader handle, all Follower handles follow it. If there are several Leader handles, each Follower handle follows the Leader handle with the same name.
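
The matching rule above can be sketched as follows; the handle data model and function name here are hypothetical, not Character Animator's internals:

```python
def apply_leader_follower(handles, position=True, scale=True, rotation=True):
    """Conceptual Leader/Follower matching: every handle tagged Follower
    copies the transform of the matching Leader handle."""
    leaders = {h["name"]: h for h in handles if h["tag"] == "Leader"}
    for h in handles:
        if h["tag"] != "Follower":
            continue
        if len(leaders) == 1:
            # A single Leader drives every Follower handle.
            leader = next(iter(leaders.values()))
        else:
            # Multiple Leaders: match Follower to the same-named Leader.
            leader = leaders.get(h["name"])
        if leader is None:
            continue
        if position:
            h["pos"] = leader["pos"]
        if scale:
            h["scale"] = leader["scale"]
        if rotation:
            h["rot"] = leader["rot"]
    return handles
```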


Setup

  1. In the Rig panel, set up a puppet whose artwork changes visibly when a particular layer shifts position.

  2. Create a group with independent content inside. For example, make an arm group inside the body group. Make different independent parts in the arm group.

  3. Add behaviors. For example, choose Limb IK behavior, helping the artwork (arms) bend in realistic ways.

  4. Choose the Leader/Follower behavior from the behavior panel. All the selected behaviors show up on the right-hand side.

  5. Create and tag the Leader layer. Choose a layer you want to start with and tag it as Draggable and Leader.

  6. Create and tag follower layers. Name every other layer with the same name as the leader, and tag it as Follower. Repeat this until every layer in your group has either a Leader or Follower tagged handle.

  7. Choose the arm group and drag it to the Triggers panel, onto Create Swap Set in the lower left. This shows one artwork layer at a time.

Controls

The behavior has the following parameters:

  • When controls how Leader/Follower operates relative to other behaviors: 
    • Off effectively turns off Leader/Follower and can help keyframe its effect on the puppet. 
    • Before IK is useful when you want Limb IK to set the final handle positions (e.g., for an Auto-swap setup with Limb IK in which Leader/Follower pulls the arms together before Limb IK constrains the handle positions). 
    • After IK is useful when you want the Leader/Follower to set or constrain the final handle positions (e.g., arms controlled via Limb IK to pick up another object like a coffee cup).
    • After Physics is helpful when you want physics like the Dangle behavior to move handle positions before Follower handles move to the Leader handle.
  • Position controls whether the Follower handles match the position of their Leader handle. 
  • Scale controls whether the Follower handles match the scale of their Leader handle. 
  • Rotation controls whether the Follower handles match the rotation of their Leader handle.

Dragger: Control a region of a puppet by dragging with the mouse or by touch

This behavior allows you to drag a region of a puppet away from the rest of it (for example, to wave its arm). It is applied by default for new puppets, but works only if a location on the puppet is set up for control via mouse or touchscreen.

Setup

Assign the Draggable tag to a specific handle, or to a group to affect its origin. If imported artwork has a guide or a layer with the word "Draggable" in its name, the Draggable tag is applied to the corresponding handle automatically.

Note:

The Dragger tool in the Puppet panel can create Draggable-tagged handles without needing to modify the original artwork file.

Controls

Drag near the location in the Scene panel. The nearest Draggable handle location on the puppet moves to match the relative changes in mouse position while dragging.

The Dragger behavior records each Draggable handle that you move as a separate take group, so that multiple performances for a specific handle compose together and don’t affect performances for other handles that you drag. By grouping Dragger takes by dragged handle, you don’t need multiple Dragger behaviors to capture multiple dragged handles. The Timeline panel shows Dragger takes grouped by handle name as "Handle (handle-name)".

The Dragger behavior lets you control Draggable-tagged handles only when dragging within some distance from the handles. This optional control can help when you only want to control Draggable handles when very close to them, as opposed to whichever one is nearest. It is also useful when you have more than one Dragger behavior applied to different parts of the puppet — to avoid both activating on a single drag.
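
The nearest-handle and range-limit logic described above might be sketched like this (illustrative names only, not the product's code):

```python
import math

def pick_draggable_handle(handles, pointer, limit_range=False, range_px=0):
    """Conceptual Dragger-style handle selection: the nearest Draggable
    handle wins, optionally only if it is within range_px on-screen pixels
    of the mouse pointer or finger."""
    best, best_dist = None, float("inf")
    for h in handles:
        d = math.hypot(h["pos"][0] - pointer[0], h["pos"][1] - pointer[1])
        if d < best_dist:
            best, best_dist = h, d
    if limit_range and range_px > 0 and best_dist > range_px:
        return None  # too far away: this drag grabs no handle
    return best
```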

If you have a touch-enabled display, you can control Draggable handles by touching the display. Multiple handles can be controlled at the same time. The following actions can be performed at the handle location:

  • Move: One finger down, then dragged in any direction. This gesture is equivalent to dragging a handle with the mouse.
  • Scale: Two fingers down, then dragged apart or together, spreading or pinching the fingers.
  • Rotate: Two fingers down, then either one swept in a rotating motion around the other or both swept in a rotating motion and same direction around the midpoint of the two fingers.

This behavior has the following parameters:

  • Mouse & Touch Input controls whether the behavior processes mouse actions such as clicks and drags or touchscreen gestures.
  • Limit Range controls whether the nearest Draggable handle is controlled regardless of distance (unchecked) or only if the Draggable handle is within a specified range (checked).
  • Range specifies how close, in on-screen pixels (i.e., independent of panel zoom level), the mouse pointer or finger on a touchscreen needs to be to a Draggable handle to control it. A value of 0 (the default) chooses the nearest one.
  • After Move controls what happens after you stop dragging the Draggable handle. Return to Rest (the default) makes the handle settle back to its resting pose, whereas Hold in Place keeps the handle at the spot where you left it.
  • Return Duration controls how long (in seconds) a dragged handle takes to return smoothly to its resting location when After Move is set to Return to Rest.
Note:

Each Draggable handle can have different After Move and Return Duration settings. For example, you can use Hold in Place to pose one draggable hand of a character, switch to Return to Rest and drag the other hand, then change the Return Duration to be longer and drag a necklace.

Eye Gaze: Control eye pupil movement separate from the rest of the face

This behavior uses the webcam, mouse, arrow keys, or touch-enabled display to control the movement of a puppet’s pupils for finer control and recording of eye gaze. You can dart the puppet’s pupils directly to the nine common positions shown below.

Eye darts

You can smooth and pause eye gaze and blend recorded eye gaze takes.

Artwork setup

Organize and tag layers similar to those for the Face behavior. However, you only use the Head, Left Eye, Right Eye, Left Pupil, Right Pupil, Left Pupil Range, and Right Pupil Range tags.

Controls

You can control eye gaze with the webcam, mouse, arrow keys, or touch-enabled display.

  1. If you are using camera input to control eye gaze, make sure the Camera & Microphone panel is open, your webcam is on, and the Camera Input button is not disabled. The panel shows what your webcam sees.

  2. Select the puppet to control in the Timeline panel.

  3. To use the webcam, arm the Camera Input parameter in the Properties panel, then look around in front of the webcam.

  4. To use the mouse or touch-enabled display, arm the Mouse & Touch Input parameter. As you move your pupils or drag with a mouse or finger, the puppet’s pupils should follow.

  5. To use arrow keys, arm the Keyboard Input parameter in the Properties panel.

  6. To pause pupil movements (when the Camera Input parameter is armed), hold down the semicolon (;) key.

    Use this capability to, for example, have the character glance from side to side, holding the stare at each side. When you release the key, the pupils smoothly move to the currently tracked pupil positions. To slow down the transition, increase Smoothing.

    If you want to only use one of the input types to control the eye gaze, disarm the other two. If you have two or more enabled, you can use the Strength parameters to control how the various inputs are blended.

This behavior has the following parameters:

100% is the default for each strength parameter; you can decrease it toward 0% to dampen the transforms or increase it above 100% to exaggerate them.

  • Mouse & Touch Input controls whether the behavior processes mouse or touch-display drag actions for eye gaze. When armed, clicking or dragging with the mouse or touch will cause the puppet’s eye pupil layers to look at the mouse/touch location. The location is defined relative to the center of the scene, which corresponds to the puppet’s eyes looking straight ahead. You can dampen the offset by decreasing Mouse & Touch Strength.

Tip: Temporarily disarm behaviors controllable via the mouse (for example, Dragger or Particles) if you want to control eye gaze with the mouse.

  • Camera Input controls the movement using the input from a webcam.
  • Keyboard Input controls the movements using the arrow keys as follows:
    • A single arrow key points the pupils in that direction.
    • A pair of arrow keys points the pupils in a diagonal direction: up and left, up and right, down and left, or down and right.
  • Snap Eye Gaze restricts pupil movement when using camera or mouse input so that the pupils dart to one of the nine common positions shown earlier. Snapping affects how responsively the pupils follow your own pupil movement. When controlling eye gaze with the arrow keys, the pupils always dart, even if Snap Eye Gaze is not enabled.
  • Smoothing controls how much to dampen jittery pupil movements in front of the webcam or due to nonideal lighting conditions causing pupil tracking points to move unexpectedly. The default value does some smoothing, but you might want to decrease it if you prefer to have your puppet’s pupils react instantaneously to quick motions, including rapid eye movements. Note that smoothing has an effect even when snap eye gaze is enabled.
  • Camera Strength controls how far pupils can move when controlled using the camera input.
  • Mouse & Touch Strength controls how far pupils can be offset when controlled via the mouse or touch.
  • Keyboard Strength controls how far the pupils move when controlled using Keyboard.
  • Minimum Snap Duration specifies the minimum duration of a snapped eye gaze position before the pupils can move. This parameter can be used to prevent overly jittery eye gaze movements when snapping is enabled.
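
The nine-position snapping described for Snap Eye Gaze amounts to quantizing a continuous gaze offset into three buckets per axis. The dead-zone value below is an assumption for illustration, not Character Animator's actual threshold:

```python
def snap_gaze(x, y, dead_zone=0.33):
    """Conceptual Snap Eye Gaze: quantize a gaze offset (x, y in -1..1,
    where 0,0 means looking straight ahead) to one of nine positions,
    returned as (-1|0|1, -1|0|1)."""
    def bucket(v):
        if v < -dead_zone:
            return -1
        if v > dead_zone:
            return 1
        return 0  # within the dead zone: look straight ahead on this axis
    return bucket(x), bucket(y)
```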

You can also blend together Eye gaze recordings (takes).

Note:

When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Eye Gaze behavior. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.

Face: Control a puppet's face with your webcam

This behavior uses the face-tracking results from your webcam to control the position, scale, and rotation of named facial features in your puppets.

Artwork setup

Specify the facial features to control

Organize and tag layers as described in Body features.

Eyelid blinking

A puppet's eyes can blink in two ways — swapping to different artwork for a closed eye or sliding opposing eyelid layers together. The former gives you more control over the look of a closed eye, especially if it cannot be represented as two adjacent layers, and is easier to set up. However, no scaling occurs for partially closed eyes. The latter gives you continuous movement/scaling of eyelids, but requires more setup.

Use unique closed eye artwork

Assign separate Left Blink and Right Blink layer tags within the Left Eye and Right Eye tagged layers. When the Face behavior detects a closed eye, the Eye artwork is swapped for the Blink ones.

Use separate eyelid artwork

  1. Assign tags for Left Eyelid Top, Left Eyelid Bottom, Right Eyelid Top, and Right Eyelid Bottom layers inside the respective Left Eye and Right Eye layers.

  2. Create a handle at the bottom edge of each Eyelid Top layer, and another handle at the top edge of each Eyelid Bottom layer.

The vertical distance between these handles determines how far to close and open the eyelids (i.e., Character Animator tries to move the top/bottom eyelid layers together to simulate closing of the eye).
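
That vertical-distance rule can be sketched conceptually (hypothetical names; Character Animator's internal math may differ):

```python
def eyelid_openness(top_handle_y, bottom_handle_y, rest_gap):
    """Conceptual sliding-eyelid blinking: the vertical gap between the
    Eyelid Top bottom-edge handle and the Eyelid Bottom top-edge handle,
    relative to the rest-pose gap, gives how open the eye is
    (0.0 = closed, 1.0 = fully open)."""
    gap = bottom_handle_y - top_handle_y
    return max(0.0, min(1.0, gap / rest_gap))
```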

Eyebrow control

As you raise or lower your puppet’s eyebrows, you can also tilt them for more expressiveness. You can tilt an eyebrow inward at the low point to emphasize a scowl, or tilt it outward at the high point to make your puppet look surprised.

Eyebrow tilt

To adjust the intensity of eyebrow tilt when raised or lowered, adjust the Raised Eyebrow Tilt and Lowered Eyebrow Tilt under Face behavior.
Note: Both these options are relative to the drawn orientation of the eyebrow at the rest pose, with 100% or -100% tilting the eyebrows vertically.
The Move Eyebrow Together option under the Face behavior is enabled by default, so the eyebrows move in sync with each other. You can deselect the option for independent control of each eyebrow.

Face Tracking setup

This behavior captures your facial expressions from your webcam and animates the puppet based on your facial movements.

For some configurations with both internal and external webcams, the internal camera might not be the first (default) camera, or the intended external webcam might not be the next video input, so you might need to switch to the intended webcam. Also, sometimes you might need to reset or retrain the face tracking to the current position and orientation of your face so that the initial appearance of a puppet is as intended.

Mirror Camera Input 

Use the Mirror Camera Input option in the Camera & Microphone panel menu to control if the camera image should be flipped horizontally before being used. Note that the option is checked by default.

Choose a specific webcam (video source)

Choose Switch to Next Camera from the Camera and Microphone panel until the intended webcam is active in the panel, or choose Switch to Default Camera to reset to the first webcam.

If you have multiple webcams or your webcam isn’t the first video source found (for example, if you have a video capture device), cycle through available video sources to choose the intended one.

If the number of video sources changes during or between sessions, you might need to reselect the intended source.

Improve tracking accuracy of your facial performance

  • Increase direct lighting on your face.
  • For eyelid and eye gaze direction tracking, move closer to the camera so your face appears bigger in the frame.

Calibrate for your face

Look at the center of the Scene panel, place your face in what you consider a default, rest pose, then click Calibrate in the Camera and Microphone panel or press Cmd/Ctrl+P.

Recalibrate the red face and blue pupil tracking points

In case they no longer follow your facial features, try moving your head within the field of view of the camera, double-click in the Camera & Microphone panel, or push the red dots towards your face using your hand.

Smoothing facial movements

If your facial movements in front of the webcam are uneven, or if lighting conditions cause facial tracking points to move unexpectedly, you can compensate by smoothing the captured camera information over time. To smooth out facial movements, increase the Smoothing parameter’s value in the Face behavior. The default value does some smoothing, but you might want to decrease it if you prefer to have your puppet react instantaneously to quick motions, including rapid eye blinks. Mouth replacements are not affected.
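
A simple exponential moving average illustrates how a Smoothing parameter can damp jitter over time; this is a generic sketch, not the filter Character Animator actually uses:

```python
def smooth_points(samples, smoothing=0.5):
    """Conceptual smoothing filter: each output blends the previous output
    with the new sample. Higher smoothing (closer to 1.0) weights the
    previous value more, damping jittery tracking; 0.0 passes input through."""
    if not samples:
        return []
    out = [samples[0]]
    for s in samples[1:]:
        out.append(out[-1] * smoothing + s * (1.0 - smoothing))
    return out
```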

Capture the initial frame of the performance

To ensure that a punched-in performance (starting a recording during playback) for the Face or Lip Sync behaviors captures the initial frame of the performance, deselect the Pause During Playback option in the Camera & Microphone panel menu.

Multiple sets of switchable Head Turner views

Turner views

A character can have multiple Head-tagged groups, each with its own set of views (and Head Turner behavior). For example, you might have one set of views by default, but then use a keyboard trigger to switch to a different set of views.

Set up switchable sets of Head Turner views

  1. Create a Head group (with Head tag) that contains the different views (Frontal, Left Profile, etc. tags), and add the Head Turner behavior to the Head group.

  2. Repeat step 1 for the other sets of views, and assign a key trigger to each of these other Head groups, with Hide Others in Group checked.

  3. Make sure the Face behavior is on a parent puppet of these Head groups.

As you press the key trigger to show a head, you can then turn your head to trigger the different views.

See Wendigo in the Character Animator Examples download for a working example that you can modify.

Face: Pose-to-pose movement

You can automatically produce pose-based movement from the webcam, emphasizing key poses you make with your head and facial features, using the Face behavior. Adjust the Pose-to-Pose Movement parameter to control how much the head and face motion pauses at key poses. The higher you set the parameter, the longer key poses are held; lower values let key poses change more frequently. This parameter does not affect lip sync. Use the Minimum Pose Duration parameter to specify the minimum amount of time that a key pose is held. This parameter only has an effect if the Pose-to-Pose Movement parameter is greater than 0%.

Tip: Increase Smoothing to 60% or more to ease the transition between key poses. Lower smoothing values can produce jarring jumps between poses.
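
The hold-then-jump behavior of Pose-to-Pose Movement can be illustrated with a 1D sketch; the threshold and frame-count parameters below are hypothetical stand-ins for the percentage and Minimum Pose Duration settings:

```python
def pose_to_pose(samples, hold_threshold, min_hold_frames):
    """Conceptual pose-to-pose movement: hold the current pose, jumping to
    a new one only when the input has moved more than hold_threshold and
    the pose has been held for at least min_hold_frames."""
    if not samples:
        return []
    out = [samples[0]]
    held_for = 0
    for s in samples[1:]:
        held_for += 1
        if abs(s - out[-1]) > hold_threshold and held_for >= min_hold_frames:
            out.append(s)        # jump to the new key pose
            held_for = 0
        else:
            out.append(out[-1])  # keep holding the current pose
    return out
```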

Controls

  1. Make sure the Camera & Microphone panel is open, your webcam is on, and the Camera Input button is not disabled. The panel shows what your webcam sees.

  2. Place the selected puppet in a scene by clicking Add to New Scene or choosing Scene > Add to New Scene. A scene named after the puppet is created, and the scene is opened in the Scene panel.

As you move or rotate your head or make different facial expressions (smile, mouth wide open, etc.), the puppet in the scene should follow.

Audio from the microphone can be used to show different visual representations of your voice, via the Lip Sync behavior (described in Lip Sync: Control a puppet’s mouth with your voice).

Generate face data from prerecorded video

  1. In After Effects, extract face measurements data from the video, as follows:

    a. Import the video footage into a composition.

    b. Draw a closed mask around the face, open the Tracker panel, then set Method to Face Tracking (Detailed Features).

    c. Click the Analyze buttons to track the mask to the face.

    d. Set the current time indicator to the frame representing the rest pose for the face, then click Set Rest Pose.

    e. Click Extract & Copy Face Measurements.

  2. In Character Animator, select the puppet with both the Face behavior and its Camera Input parameters armed for record.

  3. Place the current time indicator (playhead) at the frame matching the first Face Measurements keyframe in After Effects.

  4. Choose Edit > Paste.

The face measurements data on the system clipboard is converted to a Camera Input take on the selected puppet.

Pause head movements but still allow lip sync

Hold down the semicolon (;) key. You can use this capability to, for example, have one character stationary but talking to another character that is moving in the scene. When you release the key, the puppet’s head smoothly moves to the currently tracked head position and orientation. To slow down the transition, increase Smoothing.

The semicolon key is usable when the Face behavior’s Camera Input is armed.

This behavior has the following parameters:

  • Camera Input allows you to control the puppet using your face movements on the webcam.
  • Smoothing controls smoothing out of jittery or uneven facial movements captured using the webcam.
  • Head Position Strength, Head Scale Strength, and Head Tilt Strength control the movement, scaling, and z-axis rotation of the head.
  • Eyebrow Strength controls the vertical movement of the eyebrows.
  • Eyelid Strength controls the movement and scaling of the eyelids. At 0%, no blinking triggered by the webcam occurs, which is useful if you want to control blinking automatically using the Triggers behavior (with eyelids assigned to a key).
  • Mouth Strength controls the movement and scaling of the mouth.
Note:

This parameter affects only shape-based mouth expressions. Visemes controlled by the Lip Sync behavior are not scaled or affected.

  • Parallax Strength controls parallax movement of the eyes, nose, and mouth.
  • Blink Eyes Together controls the synchronization of eye blinks (for Blink layers only). Uncheck this option to control each eye blink separately (for example, wink).
Note:

When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Face behavior. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.

Body: Control your puppet’s arms, torso, and legs using your webcam

Powered by Adobe Sensei, Body Tracker automatically detects human body movement using a webcam and applies it to your character in real time to create animation. For example, you can track your arms, torso, and legs automatically.

  • In the Camera & Microphone panel, make sure the Camera Input button and Body Tracker Input button are both turned on. If off (gray), click it to enable that input. Now try moving your body, waving your arms, or raising your feet and see it applied to your character.

When Body Tracker is enabled, the countdown before recording is set to 5 seconds by default so that you can move back from the camera to get your entire body into view before you start recording. To change the countdown to 10 seconds or 20 seconds, choose Timeline > Body Tracker Countdown.

Note:
  • When you use the Body behavior, using the Limb IK behavior at the same time is highly recommended to prevent limb stretching and foot sliding.

Setup

After importing your artwork into Character Animator, follow these steps to rig the puppet so that the Body Tracker can control it:

  1. Open the puppet in Rig mode.

  2. To add the Body behavior, in the Properties panel’s Behaviors section, click “+”, and then select Body.

  3. Create tagged handles for the different parts of the body’s artwork so that the Body behavior can control their movements:

      Arms: Move the origin handle to the shoulder area from where the arm should rotate, but do not tag it as “Shoulder” directly on the arm. Tag your shoulders and hips one level higher in your puppet hierarchy – commonly the Body folder or the view folder (e.g., Frontal) – to keep your limbs attached during body tracking.
      Now, add handles at the locations of the elbow and wrist and apply the Left/Right Elbow and Left/Right Wrist tags to the handles.

      Legs: Like the arms, move the origin handle of the leg to the hip area, but do not tag it as “Hip” directly on the leg. Add tagged handles for Left Knee, Left Heel, and Left Toe to the left leg and foot, and the Right Knee, Right Heel, and Right Toe tags to the handles on the right leg and foot.

      Shoulders & Hips: The Shoulder and Hip handles should be on the body, not on the limbs. Select the parent group containing your limbs and add handles in the same shoulder and hip positions as the arm and leg origin handles. You’ll see green dots from the limbs where you can place them, or you can copy/paste the handles directly from the limbs to position them exactly. It should work fine either way. Tag these handles as Right Shoulder, Left Shoulder, Right Hip, and Left Hip.

Note:

Body and Walk behaviors on the same puppet might cause issues because both the behaviors want to control the same handles. If you do have both behaviors applied, consider turning off the Body behavior when animating walk cycles by setting the Body behavior's strength parameter to 0. The Body behavior should work fine in side and 3/4 views.

Controls

This behavior has the following parameters:

  • Strength controls the influence of the Body behavior on tracked handles.
  • When Tracking Stops controls how the puppet moves when the camera cannot track your body, such as if you moved out of view. Hold in place keeps the untracked body parts at the same place that they were last tracked. Return to rest moves the untracked body parts gradually to their original locations; use with the Return Speed parameter. Snap to rest moves them quickly to their original locations.
  • Return Speed controls how quickly untracked body parts return to their resting locations. This setting is available when the When Tracking Stops parameter is set to Return to rest.
  • Tracked Handles controls which of the tagged handles should be controlled by the Body Tracker. There are separate controls for the head, neck, waist, the joints on the left and right arms, and the joints on the left and right legs.
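
The Return to rest option with Return Speed can be pictured as moving a fraction of the remaining distance to rest each frame; the easing below is an assumption for illustration, not the documented curve:

```python
def step_return_to_rest(current, rest, return_speed, dt):
    """Conceptual per-frame return to rest: interpolate the untracked part
    toward its resting position, scaled by return_speed and the frame
    time dt. A higher return_speed reaches rest sooner."""
    t = min(1.0, return_speed * dt)
    return current + (rest - current) * t
```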

Head & Body Turner: Switch between groups as you turn your head and body

This behavior switches between groups, for example, different views like the front, quarter, and side/profile of a character, as you swivel your head or body left or right.

Setup

  1. Create or use an existing Head group or Body group that contains the different view layers (Frontal, Left Profile, etc.).

  2. Add the Head & Body Turner behavior to the group.

    Note: add behavior to the group layer instead of each view layer.

  3. Specify the controllable views by tagging the layers with at least two of the following:

    • Left Profile
    • Left Quarter
    • Frontal
    • Right Quarter
    • Right Profile
    • Upward
    • Downward

    The number of provided views determines how far your head or body needs to swivel. If you provide Left Profile, Frontal, and Right Profile, Frontal is triggered when you look straight at the camera, and the profile views are triggered when you look to either side. If all five left/right views are provided, you need to turn farther to the sides to trigger the profile views.

Note:
  • Upward and Downward layers trigger only when you are facing forward. They can only be triggered by head rotation (not body rotation).
  • This behavior is not applied by default to puppets, so add it first to see its effect on a puppet.
  • Body rotation is only available when the Body Tracker is turned on.

Controls

This behavior has the following parameters:

  • Camera Input allows you to control the puppet using your face and body movements on the webcam.
  • Controlled by allows you to select either your head rotation or body rotation to control the view. You can apply the behavior twice (once to head views and once to body views) for independent control of head and body views.
  • Sensitivity controls how far you have to swivel your head or body around the y-axis to switch to the quarter and profile layers. Decrease the value if switching happens too quickly, or increase it if it doesn't happen fast enough.
Note:

As you turn your head, face-tracking accuracy for your eyes, nose, and mouth decreases, so you might want to increase Sensitivity to keep good control over facial features, or reduce Eye Gaze Strength for the Face behavior applied to the profile views. Similarly, increase Sensitivity when using the Body Tracker for improved tracking accuracy.

Switchable head and body turner view

You can create multiple Head-tagged and Body-tagged groups, each with its own set of views. For example, you might have one set of views by default, but then use a keyboard trigger to switch to a different set of views.

  1. Create a Head group (with the Head tag) or Body group that contains the different views (Frontal, Left Profile, etc. tags), and add the Head & Body Turner behavior to the group.

  2. Repeat for the other sets of views, and assign a key trigger to each of these other Head or Body groups, with Hide Others in Group checked. Make sure the Face behavior and Body Tracker behavior are on a parent puppet of these groups.

  3. After you press the key trigger to show a Head or Body group, you can turn your head or body to trigger the different views.

Lip Sync: Control a puppet’s mouth with your voice

This behavior produces lip-synced animation if the original artwork contains visemes (visual representations of your mouth as you make different sounds) and you talk into the microphone. You can also process audio in the scene to generate lip sync data for a puppet.

The Lip Sync behavior has a Keyboard Input parameter that, when armed, allows you to display specific visemes by pressing the first letter of the viseme layer’s name (for example, A for the Aa viseme, D for D, W for W-Oo). You do not have to add keyboard triggers manually to those layer names.

Lip Sync preferences

The Lip Sync preference window allows you to control the following:

  • Viseme Detection: controls the density (number) of visemes generated by lip sync for a given section of audio. 
  • Camera-based Muting: controls the threshold of how open your mouth needs to be (when the Camera Input is enabled) to enable lip sync. This can be useful to avoid your puppet’s mouth changing when other people are talking in the background but your mouth isn’t open in front of the camera. By default, there is no muting, so background sounds can produce visemes. 
  • Version: selects the version of the lip sync engine used to detect visemes.
  • Audio-based Muting: controls the threshold of how loud the audio needs to be to enable lip sync.

Use Lip Sync preferences to control how the lip sync engine detects and generates visemes. To open them, select Preferences > Lip Sync.

Working with visemes

A viseme is a generic facial image that can be used to indicate a particular sound. A viseme is the visual equivalent of a phoneme or unit of sound in spoken language. Visemes and phonemes do not necessarily share a one-to-one correspondence. Often several phonemes correspond to a single viseme, as several phonemes look the same on the face. 

Within Character Animator, there are three mouth shapes determined by the shape of your mouth in the webcam. These show up only when no audio is detected (no one is talking). Neutral is the most common to see and should be your default "rest" mouth.

The other 11 mouth shapes, called Visemes, are determined by audio. Visemes are visualizations of key mouth positions when saying common phonetic sounds. Character Animator listens for 60+ specific sounds and translates them into visemes.

Mouth shapes determined by the shape of your mouth in the webcam

Mouth shapes determined by audio in the microphone

Name your characters

If you name and structure your Mouth group like this, Character Animator automatically recognizes and tags these mouth shapes upon import.

Tips to create custom mouth shapes

• Lock the top jaw. Keeping the top row of teeth in a consistent place helps things look smoother.
• A Cycle Layers behavior can be added to a mouth group to add a few frames of transitional animation when that sound is picked up. The mouth opening to Aa or W-Oo is a common application.
• You can add additional mouth shapes (sad, yell, etc.) into your Mouth group and show them through keyboard triggers.
• These are examples of a frontal view, but quarter and profile views can follow the same general guidelines.

Artwork setup

Specify the visemes to control

Organize and tag Mouth layers as described in Body features.

Tracking setup

Improve tracking accuracy of your speaking performance

Try boosting the microphone input level in your operating system’s Sound control panel.

Note:

Try making a "boooooo" sound to see if the mouth reliably stays on the "W-Oo" viseme, and a "la-la-la-la-la" sound to see if the viseme with the tongue appears (assuming your artwork included it).

Controls

  1. Make sure the Camera & Microphone panel is open and the Microphone Input button is not disabled.

  2. Place the selected puppet in a scene by clicking Add to New Scene or choosing Scene > Add to New Scene.

A scene named after the puppet is created, and the scene is opened in the Scene panel.

You can change the audio hardware settings. To change them, select Edit > Preferences > Audio Hardware.

As you talk, the audio signal is analyzed and a matching viseme for your mouth is displayed. When no sound can be detected or the microphone is disabled, control falls back to the Face behavior (if present) analyzing the video signal (your mouth expressions captured by the webcam) to possibly trigger the Smile or Surprised mouth shapes.

Generate lip sync data from prerecorded audio

  1. Either import an AIFF or WAV file into the project and then add it into the scene, or record audio using your microphone (while the Microphone Input is enabled).

  2. Add a puppet containing the Lip Sync behavior to the scene, and select the puppet’s track item in the Timeline panel.

    Make sure both the Lip Sync behavior and its Audio Input parameter are armed for recording, which they are by default.

  3. Choose Timeline > Compute Lip Sync from Scene Audio.

    The Compute Lip Sync from Scene Audio command analyzes the scene audio and creates a Lip Sync take only where the scene audio overlaps the selected puppet track items. Muted tracks are not computed. Visemes are automatically generated for the audio and are displayed below the Lip Sync take bar.

Note:

Computing Lip Sync from scene audio may take time depending on the duration of your audio.

This behavior has the following parameter:

  • Audio Input controls whether the behavior processes audio from the microphone or scene.
Note:

When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Lip Sync behavior. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.

Use scene audio for audio-controllable behaviors

You can create takes for other behaviors that have Audio Input parameters (Nutcracker Jaw and Layer Picker), both of which can use audio amplitude to control them. The existing Timeline > Compute Lip Sync from Scene Audio menu command references the behavior with an armed Audio Input parameter:

  • Create Lip Sync Take from Scene Audio
  • Create Nutcracker Jaw Take from Scene Audio
  • Create Layer Picker Take from Scene Audio

If there are multiple armed Audio Input parameters, for multiple selected puppets or multiple behaviors, the menu command will mention the number of takes that will be created. If you only want to record a take for a specific behavior, make sure the others are disabled or disarmed.

Edit visemes

You can insert, select, trim, delete, or replace visemes.

Viseme editor timeline
Viseme editor timeline

Visemes are represented as adjacent bars below the Audio Input track bar; the gaps between these bars are moments of silence in the audio. Each bar represents a separate viseme and has the viseme name displayed on it for easy recognition. Zoom in to view these names.

To zoom into the timeline, do any of the following steps:

  • To zoom in or out around the pointer, place the pointer over a specific point in the timeline and hold down Alt (Win) or Option (Mac) while scrolling the vertical mouse wheel.
  • To move forward or backward in time, hold down Shift while scrolling the vertical wheel (Mac/Win), or use the horizontal wheel (Mac/Win).
Note:

If the Timeline panel is zoomed too far out to see the Lip Sync viseme bars, they change to a diagonal-lined pattern to convey that viseme information can be edited if you zoom in.

Select visemes

To select visemes or silences, do any of the following steps:

  • To select a viseme, click the left-edge of the viseme bar. The edge turns white when you select it.
  • To select a silence, click its left edge. The cursor changes to a horizontal arrow when you hover over the edges.
  • To select multiple visemes, Shift+click (Win) or Cmd+click (Mac) them.
  • To retain the existing selected visemes and silences as you drag and select more, hold down Shift (Win) or Cmd (Mac) before starting to drag.

Adjust the timing of visemes or silences

To adjust the timing of a viseme or silence, do any of the following steps:

  • Drag the viseme bar horizontally from the edges.
  • Drag the edge between viseme bars or silences horizontally. 

The left edge of the viseme or silence moves earlier or later in time. You can drag the left edge of the viseme bars or silences across other visemes to replace them.

Delete visemes

To delete visemes or silences:

  • Select them, and then choose Edit > Delete or press Delete.

When you delete a viseme bar or silence, the viseme bar or silence to its left extends to the next viseme or to the end.

Edit visemes using the keyboard

There are multiple ways to edit visemes: from the viseme context menu, by dragging visemes manually, and with several keyboard shortcuts that work on selected visemes.

  • Left arrow or right arrow: Selects the previous or next viseme or span of silence in time.
  • Up arrow or down arrow: Moves the selected viseme earlier or later by one frame in time.
  • The first letter of a viseme name (for example, A for Aa, E for Ee, W for W-Oo): Replaces the selected viseme with the viseme associated with that letter.
  • Forward slash key (/): Splits the selected viseme in half.

Note: Deleting a selected viseme automatically selects the next viseme or span of silence in time, if one exists.

Replace visemes with another viseme or silence

Right-click the viseme to be replaced and choose a new viseme from the context menu. To replace a viseme with a silence, choose Silence from the context menu.

Replacing visemes
Replacing visemes

The letters in parentheses are sounds. For example, use the D viseme for sounds like n, th, and g.

Tip: To see the result of a viseme change, play back or scrub through time, or deselect the puppet track item. You can also disable microphone input in the Camera & Microphone panel to make changes to a viseme and see the results on the character immediately.

Add a viseme

  • To add a viseme where there is silence, right-click in the silence and choose a new viseme from the context menu. This adds the viseme and removes the silence.
  • To add a viseme anywhere in time, hold down Alt (Win) or Option (Mac), right-click the viseme, and choose a new viseme from the context menu.

Depending on placement, the inserted viseme can leave silence after it. To make the inserted viseme fill the rest of the span of silence, Alt-click (Win) or Option-click (Mac) to open the Visemes pop-up menu.

Split a viseme

To split a viseme , do any of the following:

  • Hold down Alt (Win) or Option (Mac); the cursor changes to a razor. Click the point where you want to split.
  • Drag the playhead to the point to be split, then choose Edit > Split, or press Cmd+Shift+D (Mac) or Ctrl+Shift+D (Win).

Reuse and copy visemes

You can cut or copy and paste lip sync takes from one puppet or project and use them in another by following these steps:

  • To retain the original visemes and copy the lip sync take, choose Edit > Copy (Cmd/Ctrl+C). To cut the original visemes, choose Edit > Cut (Cmd/Ctrl+X).
  • Select the puppet track item in the timeline where you want to paste the copied visemes. This could be in the current project or another one. Position the playhead where you want to paste the lip sync, and then choose Edit > Paste (Cmd/Ctrl+V).

Note: If you copy multiple Lip Sync takes, they are pasted in order of their selection.

Instead of copying the entire Audio Input take of visemes or Trigger take of triggers, you can selectively cut, copy, and paste viseme and trigger bars to reuse just the recordings you need. For more information, see Reuse (cut, copy, and paste) viseme and trigger bars.

Move the jaw based on the current viseme

The Lip Sync behavior can move a Jaw handle vertically. The handle can be moved automatically based on the height of the current viseme, specifically the offset of the bottom edge of the viseme relative to the bottom edge of the Neutral mouth shape. With the Jaw handle along the chin of the face, as different visemes are displayed, the bottom of the face can warp to simulate the chin moving up and down.

Note: If a viseme is a cycle (that is, Cycle Layers applied to a group of layers showing an animated viseme), the vertical offset is based on the tallest layer in the group. If there are multiple mouth groups, it uses the average height of all mouths with that same viseme’s tag.
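The rule described above can be expressed as simple arithmetic. This is an illustrative sketch, not the application's internal code, and the pixel values below are hypothetical:

```python
def jaw_offset(viseme_bottom, neutral_bottom, jaw_movement=1.0):
    """Vertical offset for the Jaw handle: the viseme's bottom edge relative
    to the Neutral mouth's bottom edge, scaled by the Jaw Movement setting."""
    return (viseme_bottom - neutral_bottom) * jaw_movement

def cycle_bottom(layer_bottoms):
    """For a cycled viseme, the offset is based on the tallest layer."""
    return max(layer_bottoms)

neutral = 120.0  # bottom edge of the Neutral mouth shape (hypothetical pixels)
aa = 150.0       # bottom edge of an open "Aa" viseme (hypothetical pixels)

print(jaw_offset(aa, neutral))        # 30.0 at full strength
print(jaw_offset(aa, neutral, 0.5))   # 15.0: lower Jaw Movement, subtler motion
print(jaw_offset(cycle_bottom([130.0, 145.0, 140.0]), neutral))  # 25.0
```

Lowering the `jaw_movement` factor corresponds to reducing the Jaw Movement parameter described below for subtler jaw motion.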

The behavior has the following parameters to control jaw movement:

  • Jaw Movement controls how much to offset the Jaw handle. Use a lower value for subtler jaw motion. 
  • Manual Jaw Adjustment allows for additional control of each viseme’s vertical offset of the jaw.

Export lip sync take to After Effects

You can take Character Animator visemes into After Effects for use on different characters. For information on the steps to follow, see Lip Sync: Take export for After Effects.

Transcript-based Lip Sync

Powered by Adobe Sensei machine-learning technology, transcript-based lip sync allows you to use a text transcript to produce a more accurate lip sync result.

Note: This feature works best with English (ASCII-based) text.

Follow these steps:

  1. Open Character Animator and create a scene from an example puppet on the Home screen or open a scene containing one of your puppets. 

  2. Import an audio file into the project. Choose File > Import and select the file.

  3. With the audio selected in the Project panel, the Properties panel shows the Transcript text area, where you can import or type in the text matching the spoken words and phrases in the audio.

    To import a file, click Import in the Properties panel, then select the corresponding text file. 

    Add timecodes by typing them manually, or use a transcription program to generate an SRT file with timecodes. For an .srt file, change its extension to .txt to select it for import, or copy and paste the text directly into the Transcript text area in the Properties panel.

    The audio file’s icon in the Project panel changes to indicate it has a text transcript associated with it. The Type column in the Project panel shows Audio+Transcript for this file.

  4. Drag the audio file from the Project panel into the Timeline panel to add it to the scene. 

  5. Select the puppet track in the Timeline and the audio track together, then choose the Timeline > Compute Lip Sync Take from Audio and Transcript menu command. Character Animator analyzes the audio and, using the associated transcript text, produces visemes for the Lip Sync take.

  6. To make corrections to the transcript, update the text in the Transcript text area, and then choose the Compute Lip Sync Take from Audio and Transcript command again.

You can generate an SRT file using Adobe Premiere Pro. For more information, see Export captions.

Tips: 

  • Check your transcript for typos, missing words, or other mismatch errors.
  • Add timecodes to the transcript to allow the process to skip over sections with errors. You can then run standard audio-only lip sync to fill in the gaps. 
  • Create short audio files and transcript clips. If there are errors, this limits the failure to a small section.

Nutcracker Jaw: Control the lower jaw with your face or voice

This behavior moves the lower part of the puppet’s mouth as you open and close your mouth in front of the webcam or talk into the microphone. This can be a simpler way of making a puppet talk without needing to specify separate artwork for the different mouth shapes and visemes used by the Face and Lip Sync behaviors. If the Face and Nutcracker Jaw behaviors are both on a puppet, the Face behavior can still control the rest of the face, but Nutcracker Jaw controls just the mouth.

Note:

This behavior isn’t applied by default to puppets, so add it first to see its effect on a puppet.

Setup

Specify the lower jaw to move

  1. In the Puppet panel, select the layer for the lower jaw.

  2. In the Properties panel, click Tags and select the Jaw handle tag from the Miscellaneous section.

You can also rotate the jaw of the puppet. To set up jaw rotation, follow these steps:

  1. Select the jaw group in the Puppet panel.

  2. Add a handle and apply the Jaw tag to it.

If imported artwork has a guide or layer with the word "Jaw" in the name, the Jaw tag is automatically applied to the corresponding handle.

Controls

Move the lower jaw with your face

  1. Make sure the Camera & Microphone panel is open, your webcam is on, and the Camera Input button (for visual control) is not disabled.

  2. Open and close your mouth in front of the webcam.

Move the lower jaw with your voice or other sound

  1. Make sure the Camera & Microphone panel is open, your audio input is on, and the Microphone Input button (for audio control) is not disabled.

  2. Talk into the microphone. Try to speak loudly to see how it affects the intensity of the jaw movement.

The Nutcracker Jaw behavior can open and close your character’s mouth as you talk into the microphone. Make sure the Audio Input parameter is armed and the microphone input is enabled.

In addition, the movement of the tagged Jaw layer can be rotational (clockwise or counterclockwise) as well as vertical. For rotational movement, create a handle tagged with Jaw as the pivot location; a Jaw-tagged origin handle does not work at this time.

This behavior has the following parameters:

  • Camera Input controls whether the behavior will process video from the webcam.
  • Audio Input controls whether the behavior will process audio from the microphone.
  • Camera Flappiness controls the offset of the lower jaw based on your tracked mouth in front of the webcam. Microphone Flappiness controls the offset of the lower jaw based on the audio amplitude of your voice in the microphone. For both parameters, 100% is the default maximum offset from the rest position — 100 pixels for position-based movement, 45° for rotation-based movement — but you can change the value to affect this maximum offset.
  • Movement controls how the lower jaw will move — Position for vertical movement, Rotation Clockwise for pivoting from the jaw’s origin in a clockwise direction (appropriate for the right profile of a character), or Rotation Counterclockwise for pivoting counterclockwise (for a left profile).
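The Flappiness arithmetic above can be sketched as follows. This is an illustrative sketch, not the application's code; the 0..1 input level standing in for mouth openness or audio amplitude is an assumption:

```python
def jaw_movement(level, flappiness_pct, movement="Position"):
    """Return the jaw offset for an input level in [0, 1].

    At 100% Flappiness, the maximum offset from rest is 100 pixels for
    Position-based movement or 45 degrees for Rotation-based movement.
    """
    max_offset = 100.0 if movement == "Position" else 45.0  # px or degrees
    return level * (flappiness_pct / 100.0) * max_offset

print(jaw_movement(1.0, 100))                        # 100.0 px at full level
print(jaw_movement(0.5, 100))                        # 50.0 px at half level
print(jaw_movement(1.0, 100, "Rotation Clockwise"))  # 45.0 degrees
print(jaw_movement(1.0, 50))                         # 50.0 px: lower Flappiness
```

Raising Flappiness above 100% would scale the maximum offset past these defaults, which matches the description that you can change the value to affect the maximum offset.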
Note:

When the Camera Input and Microphone Input buttons in the Camera & Microphone panel are disabled, they override (disarm) the Arm for Record setting for Nutcracker Jaw. This allows you to view, while stopped, the results for a selected puppet with recorded takes for these behaviors without needing to deselect the puppet first.

Jaw movement based on the viseme

The Lip Sync behavior can vertically offset a Jaw handle automatically based on the height of the current viseme. For more information, see Move the jaw based on the current viseme.

Walk: Make a character walk

The Walk behavior allows a puppet to walk across the scene by controlling the puppet's legs, arms, and body. The behavior simulates common walking styles, such as strut and prance. 

You can create an animated walk cycle. A walk cycle is a series of poses played on a loop, which creates an illusion of a walking puppet. Walk cycles can convey different moods and emotions to enhance your animation. For example, long bouncy steps represent a happy walk.

Note:

This behavior assumes that the character is two-legged and two-armed, and is viewed in profile. However, it is possible to apply it to additional legs.

When you apply the behavior to a puppet, the legs move through a looping series of poses to complete a walk cycle. The feet are planted at ground level, and the arms swing in the opposite direction of the legs while walking. The Walk behavior can also adapt to long and short legs for smooth walking motion.

To give your puppet more realistic arm and leg motion, use the Left Shoulder, Right Shoulder, Left Hip, and Right Hip handle tags on characters drawn in three-quarter perspective. Instead of swinging arms from the neck and legs from a single hip location, separate left and right shoulder and hip locations can improve the appearance of these swinging limbs.

You can tag left-moving views as Left Quarter or Left Profile, and right-moving views as Right Quarter or Right Profile.

Tip: The Home screen includes a template puppet, Walkbot, created using the Walk behavior. You can use this template puppet to view the behavior settings and modify them to customize the puppet.

Walk cycle

Artwork Setup

Set up a profile-view puppet with the legs and arms as independent groups to avoid unwanted overlapping. All independent parts staple automatically to the parent puppet (the profile-view puppet) to prevent limbs from detaching. For best results, use Hinge or Weld attach styles for the limbs. You can also add sticks to the legs and arms to keep them straight and prevent bending.

Note:

This behavior is not applied by default to puppets, so add it first to see its effect.

The puppet is made of a number of independent parts attached to the parent puppet. In the Puppet panel, identify the following locations on each leg and on the body of the puppet.

Body parts in the Puppet panel

  • Neck
  • Left Elbow
  • Left Wrist
  • Right Elbow
  • Right Wrist
  • Waist
  • Hip
  • Left Knee
  • Left Ankle
  • Left Heel
  • Left Toe
  • Right Knee
  • Right Ankle
  • Right Heel
  • Right Toe

Use the correspondingly named handle tags in the Properties panel to identify the locations. To find the handle tags, follow these steps:

  1. In the Properties panel, click the triangle icon next to Tags.

  2. Under the Body section, you can view the available handle tags. Hover over each tag to view the description.

Basic tags

You can create a basic walk cycle with a minimum set of tags. Before you create tags, you need to create handles, which the tags are associated with.

To create a handle, follow these steps:

  1. Select the Handle tool and click the part of the puppet where you want to add a handle.

  2. In the Properties panel, click Tags.

  3. Under the Body section, select the tag for the corresponding puppet part. You can view the tag description in its tooltip.

For foot movement, use either the Ankle or Heel tag, and for leg movement, use either the Waist or Hip tag. If you have a left-facing character walking to the left by default, also add the Knee tag. For the leg and arm handles, be sure to set the same ones on both left-facing and right-facing puppets, for example, Left Ankle and Right Ankle (from the perspective of the puppet).

Separate left-facing and right-facing puppets

If you have separate left-facing and right-facing views, make sure to tag them correctly.

To add tags, follow these steps:

  1. In the Puppet panel, select the left-facing view.
  2. In the Properties panel, click the triangle icon next to Tags.
  3. Under the Views section, you can view the layer tags.
  4. Select the Left Profile layer tag for the left-facing puppet and the Right Profile layer tag for the right-facing puppet.

After you tag the puppets, switch to the Scene panel to see the puppets respond to the changes. With the Start parameter set to With Left & Right Arrow Keys, pressing the Right-Arrow key displays the right-facing puppet, and pressing the Left-Arrow key displays the left-facing puppet. If you tag only one profile puppet or no profile puppet and press the opposite arrow key, the puppet walks backwards. For example, if you tag only the left-facing puppet, the puppet walks backwards when you press the Right-Arrow key.

Controls

Make the puppet move

By default, the puppet's legs walk in place to help you preview the walk cycle and make changes. You can make the puppet move using the Walk parameters.

To make the puppet move, do any of the following:

  • When Start is set to Immediately, increase the Body Speed value. Set it to 100% to prevent the feet from appearing to slide along the ground. When the puppet moves out of view, click the Refresh button in the Scene panel to restart its walk.
  • When Start is set to With Left & Right Arrow Keys, press Left-Arrow or Right-Arrow key. Pressing either key ramps up the walk from the character’s rest pose to a walking motion, and releasing the key returns the character to rest; this is equivalent to changing the Strength parameter when pressing and releasing the key.

Tip: You can apply a draggable handle to the puppet’s top-level origin handle if you need to reposition it within the scene.

Parameters

Walk behavior has the following parameters:

  • Mode controls when the puppet moves. The available options are Immediately (when it first appears), With Left & Right Arrow Keys (when those keys are pressed), and Position-based (move the legs based on the animation of the Position parameter).
  • Position controls the horizontal position of the puppet and ensures the puppet walks as expected and does not walk out of the scene. The Body Speed parameter is not available when the mode is position-based.
  • Start/Stop Easing controls how long (in seconds) it takes to ramp up to full speed after pressing the arrow keys and ramp down from full speed after releasing them. Use a short duration to simulate a character building up energy into the walk cycle and slowing down to a stop.
  • Style controls the type of leg movement. The available options are — Walk (basic walk), Slump (upper body slouched forward), Strut (stiff gait with high knee movement), Prance (similar to a strut but with funkier upper body movement), Sneak (stealthy motion), Run (faster/longer stride), and Headbang (move head up and down).
  • Stride Length controls the horizontal spacing of the leg and arm movement. For best results, set values between 80% and 120%.
  • Step Speed controls how quickly the legs move through the walk cycle. At 100%, the character makes two steps per second. At 0%, the legs of the character don't move. Negative speed produces backward movement.
  • Step Phase controls where the legs start in the walk cycle. When Step Speed is set to 0%, adjust the Step Phase manually to maintain the stepping.
  • Body Speed controls how quickly the body moves. At 100%, the character's feet don't appear to slide against the ground. At 0%, the body does not move, which is useful if you want to simulate a character running in place or moving above a scrolling background. If your puppet walks off the scene, select the Refresh button at the bottom of the Scene panel to reset its position.
  • Arm Swing controls the range that the arms swing relative to the range of movement for the legs.
  • Arm Angle controls the orientation of the arms at the rest pose with 0° (angle control pointing up) placing the arms straight down and 90° (angle control pointing to the right) placing the arms straight out in front of the character.
  • Elbow Bend controls the amount of bending of the arms at the elbows.
  • Toe Bend controls the amount of roll or bend of the foot at the toes as the heel rises from the ground. You can adjust the parameter between 0% and 100%: 0% means no bend, while 100% means maximum bend. The parameter is set to 50% by default. The bend of the foot is controlled by the Toe-tagged handle; you might want to position the handle slightly away from the tip of the toe. Note: For best results, the placement of handles and sticks should be consistent between the right and left feet.
  • Pin Feet When Standing lets you pin the character's feet to the ground when not walking. Use this parameter when you have a single left or right view that is sometimes walking and sometimes standing. Pinning the feet this way prevents head-tracking from rotating the entire body. This parameter is on by default.
  • Strength controls the overall influence of the walk cycle relative to the drawn rest pose of the character; with 0% the body is at the rest pose (with no movement), and at 100%, the body moves through the normal walk cycle.
  • Behavior Interaction controls how Walk interacts with other behaviors, such as Dragger. Combine blends the effects of Walk with other behaviors (this is the default setting and matches results from version 22.4 and earlier; older puppets and projects are reset to use this setting). Override lets only Walk control its handles (and matches the version 22.5 result).
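The Step Speed relationship described above is simple to state numerically: at 100%, the character takes two steps per second. This sketch illustrates that arithmetic only; it is not Character Animator's implementation:

```python
def steps_taken(step_speed_pct, seconds):
    """Number of steps after `seconds` of walking at the given Step Speed.

    At 100% Step Speed the character takes two steps per second; at 0% the
    legs don't move; negative values produce backward movement.
    """
    steps_per_second = 2.0 * (step_speed_pct / 100.0)
    return steps_per_second * seconds

print(steps_taken(100, 1.0))   # 2.0 steps in one second at 100%
print(steps_taken(50, 4.0))    # 4.0 steps in four seconds at half speed
print(steps_taken(-100, 1.0))  # -2.0: negative speed walks backward
```

Body Speed interacts with this rate: at 100% Body Speed the body covers exactly the ground the stepping feet would, so the feet don't appear to slide.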

Tip: To ease in and out of a walk, keyframe the Position parameter using Ease. When the Position’s change over time (that is, the speed of movement) drops below 20% of the normal walking pace, the puppet’s handles are moved with reduced strength, dropping to zero strength when the Position stops changing completely.

Moving the walk puppet

Tip: Add multiple Walk behaviors with different walk styles, speeds, and phase to produce more complex motion.

Walk: Arm and leg movements in Quarter views

The Walk behavior supports characters drawn in quarter view. In addition, a specific part of the walk cycle can be emphasized more than others, and shoulder and hip motion can be added, to produce livelier movement.

Note:

The Walk behavior still moves the character laterally, even if drawn in quarter view.

Setup

Biped characters drawn in three-quarter perspective produce pleasing arm and leg motion when the Left Shoulder, Right Shoulder, Left Hip, and Right Hip handle tags are used. Instead of swinging arms from the neck and legs from a single hip location, separate left and right shoulder and hip locations can improve the appearance of these swinging limbs.

Body tags
Body tags

The Shoulder tags are best placed in the parent group of the arms. Existing puppets with a Hip handle will get Left Hip and Right Hip handle tags associated with that single handle, but you can reassign the tags to separate handles.

Note:

You can tag left-moving views as Left Quarter or Left Profile, and right-moving views as Right Quarter or Right Profile.

Controls

  • Pose Emphasis controls how much to slow down part of the walk cycle and speed up other parts to emphasize a specific pose.
  • Pose Emphasis Phase controls the pose in the walk cycle to emphasize.
  • Shoulder Sway controls how much the Left Shoulder and Right Shoulder handles move in opposite directions.
  • Hip Sway controls how much the Left Hip and Right Hip handles move in opposite directions.

Body: Control your puppet's arms, torso, and legs using your webcam
