Technical & Character Animation

Technical Animation · State Machine · Animation Pipeline · VR / Metahuman

A collection of my animation works, including gameplay animation systems, pipeline studies, VR facial setup, and character animation exercises.

Repository Overview

A collection of animation projects, covering both code-driven implementation and manual keyframe animation. This page documents technical tests, learning projects, and coursework.


1. Hybrid Physics-Animation System (UE5)

Project: “Craftopia” Technical Test (2025-06-20)

A technical prototype implementing a combat loop with physical hit reactions.

  • Physics Blending: Configured a blend profile where the lower body stays kinematic (animation-driven) while the upper body simulates physics on impact (see the sketch after this list).
  • Movement Constraint: Locked root movement during attack states to prevent foot sliding.
  • Camera Smoothing: Used FInterp to smooth camera rotation when locking onto targets.
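
The sketch below shows one way this kind of setup can be wired in UE5 C++: physics simulation and a partial blend weight enabled below a spine bone on hit, plus FInterp-family smoothing toward a lock-on target. The class, member names, bone name, and blend weight are illustrative assumptions, not the project's actual code.

```cpp
// Minimal sketch, not the project's code: upper-body physics blend on hit plus
// interpolated lock-on camera rotation. The class, member names, blend weight,
// and the bone name "spine_01" are illustrative assumptions.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "FighterCharacter.generated.h" // hypothetical module header

UCLASS()
class AFighterCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Called by the damage-handling code when a hit lands.
    void OnHitReceived()
    {
        USkeletalMeshComponent* MeshComp = GetMesh();

        // Simulate physics from the spine up; the legs stay kinematic (animated).
        MeshComp->SetAllBodiesBelowSimulatePhysics(TEXT("spine_01"), true);

        // Partial blend weight so the source animation still drives most of the pose.
        MeshComp->SetAllBodiesBelowPhysicsBlendWeight(TEXT("spine_01"), 0.4f);
    }

    // Called every frame while a lock-on target is set.
    void UpdateLockOnCamera(float DeltaTime)
    {
        if (!LockOnTarget || !GetController())
        {
            return;
        }

        const FRotator Current = GetControlRotation();
        const FRotator Desired =
            (LockOnTarget->GetActorLocation() - GetActorLocation()).Rotation();

        // RInterpTo is the rotator member of the FInterp family mentioned above.
        const FRotator Smoothed =
            FMath::RInterpTo(Current, Desired, DeltaTime, CameraInterpSpeed);
        GetController()->SetControlRotation(Smoothed);
    }

private:
    UPROPERTY()
    AActor* LockOnTarget = nullptr;

    UPROPERTY(EditAnywhere)
    float CameraInterpSpeed = 8.0f;
};
```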

2. Animation Architecture Study (Unity)

Project: LearnDarkSouls (2022-11-30)

A learning project recreating Souls-like mechanics to study Unity’s animation system and asset pipeline.

  • Locomotion: Implemented a 1D Blend Tree driven by a linearly interpolated movement-speed parameter, giving locomotion changes a sense of inertia (see the sketch after this list).
  • Pipeline: Handled Blender-to-Unity workflow, including skeleton rigging, retargeting, and fixing normal orientation issues.
  • Tooling: Wrote scripts for automatic bone socket binding to attach weapons.
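
Because the project itself uses Unity's Animator, the sketch below is only an engine-agnostic C++ illustration of the underlying math: a speed parameter damped toward its target (the "inertia" feel) and mapped linearly to idle/walk/run clip weights. The thresholds, damping rate, and names are made-up values for demonstration, not taken from the project.

```cpp
// Engine-agnostic sketch of the math behind a 1D locomotion blend tree.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct LocomotionBlend
{
    float Speed = 0.0f;        // current (damped) blend parameter
    float DampingRate = 6.0f;  // higher = snappier response, less inertia

    // Move the parameter toward the target speed with exponential damping.
    void Update(float TargetSpeed, float DeltaTime)
    {
        const float Alpha = 1.0f - std::exp(-DampingRate * DeltaTime);
        Speed += (TargetSpeed - Speed) * Alpha;
    }

    // Linear clip weights at thresholds 0 (idle), 2 (walk), 6 (run).
    void GetWeights(float& Idle, float& Walk, float& Run) const
    {
        const float S = std::clamp(Speed, 0.0f, 6.0f);
        if (S < 2.0f)
        {
            Walk = S / 2.0f;
            Idle = 1.0f - Walk;
            Run  = 0.0f;
        }
        else
        {
            Run  = (S - 2.0f) / 4.0f;
            Walk = 1.0f - Run;
            Idle = 0.0f;
        }
    }
};

int main()
{
    LocomotionBlend Blend;
    float Idle, Walk, Run;
    for (int Frame = 0; Frame < 5; ++Frame)
    {
        Blend.Update(/*TargetSpeed=*/4.0f, /*DeltaTime=*/1.0f / 60.0f);
        Blend.GetWeights(Idle, Walk, Run);
        std::printf("speed=%.2f idle=%.2f walk=%.2f run=%.2f\n",
                    Blend.Speed, Idle, Walk, Run);
    }
    return 0;
}
```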

3. Character Animation Short (Blender)

Project: University Coursework (2023-11-25)

A coursework assignment practicing manual keyframe animation.

  • Keyframing: Manually animated character movements and actions.
  • Execution: Focused on completing the required character performance.

4. VR Facial Animation & Metahuman (UE5)

Project: Emotiquest (NeuroXR Hackathon, 2025-03-24)

A VR demo focusing on emotional interaction using Metahuman characters.

  • Integration: Integrated Metahuman characters into VR using UEVR.
  • Facial Setup: Used Sequencer to layer manual Control Rig adjustments on top of procedural lipsync data for specific expressions.
  • Logic: Set up environmental triggers to switch character animation states based on user proximity (see the sketch after this list).
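
A minimal sketch of such a proximity trigger in UE5 C++ follows: a sphere overlap around the character flips an "emotion state" value that the Anim Blueprint can read to switch animation states. All class, enum, and member names are hypothetical, and the actual project may have implemented this in Blueprints instead.

```cpp
// Minimal sketch, not the hackathon code: proximity-based animation state switch.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Components/SphereComponent.h"
#include "EmotiveCharacter.generated.h" // hypothetical module header

UENUM(BlueprintType)
enum class EEmotionState : uint8
{
    Idle,
    Engaged
};

UCLASS()
class AEmotiveCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    AEmotiveCharacter()
    {
        ProximitySphere = CreateDefaultSubobject<USphereComponent>(TEXT("ProximitySphere"));
        ProximitySphere->SetupAttachment(RootComponent);
        ProximitySphere->SetSphereRadius(300.0f); // trigger distance is an assumption
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        ProximitySphere->OnComponentBeginOverlap.AddDynamic(this, &AEmotiveCharacter::OnUserEnter);
        ProximitySphere->OnComponentEndOverlap.AddDynamic(this, &AEmotiveCharacter::OnUserLeave);
    }

    UFUNCTION()
    void OnUserEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                     UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                     bool bFromSweep, const FHitResult& SweepResult)
    {
        EmotionState = EEmotionState::Engaged; // Anim BP reads this to change states
    }

    UFUNCTION()
    void OnUserLeave(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                     UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
    {
        EmotionState = EEmotionState::Idle;
    }

    UPROPERTY(VisibleAnywhere)
    USphereComponent* ProximitySphere;

    UPROPERTY(BlueprintReadOnly)
    EEmotionState EmotionState = EEmotionState::Idle;
};
```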