
Overview

Airi supports two character model formats for creating interactive visual companions:
  • Live2D: 2D animated characters with expressive features and smooth animations
  • VRM: 3D humanoid models with standardized expressions and animations

Live2D Models

Live2D models provide 2D character animation with support for expressions, motions, and physics.

Model Structure

Live2D models consist of:
  • Model file (.model3.json): Character definition and settings
  • Textures: PNG image files for character appearance
  • Motions: Animation files for various actions
  • Expressions: Facial expression definitions
  • Physics: Hair and clothing physics settings
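
The .model3.json file ties these assets together. An abbreviated, illustrative sketch of its shape (actual field contents vary per model):

```json
{
  "Version": 3,
  "FileReferences": {
    "Moc": "character.moc3",
    "Textures": ["textures/texture_00.png", "textures/texture_01.png"],
    "Physics": "character.physics3.json",
    "Motions": {
      "Idle": [{ "File": "motions/idle_01.motion3.json" }],
      "Tap": [{ "File": "motions/tap_01.motion3.json" }]
    },
    "Expressions": [
      { "Name": "smile", "File": "expressions/smile.exp3.json" }
    ]
  }
}
```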

Loading Live2D Models

import { useLive2d } from '@proj-airi/stage-ui-live2d/stores'
import { loadLive2dModel } from '@proj-airi/stage-ui-live2d/utils/live2d-zip-loader'

// Load model from ZIP file
const modelFile = new File([zipBlob], 'character.zip')
const modelData = await loadLive2dModel(modelFile)

// Store configuration
const live2dStore = useLive2d()
live2dStore.currentMotion.value = { group: 'Idle', index: 0 }
live2dStore.scale.value = 1.0
live2dStore.position.value = { x: 0, y: 0 }

Live2D Component

<template>
  <Live2D
    :model-path="modelPath"
    :scale="scale"
    :position="position"
    @loaded="onModelLoaded"
  />
</template>

<script setup lang="ts">
import { Live2D } from '@proj-airi/stage-ui-live2d/components/scenes'
import { useLive2d } from '@proj-airi/stage-ui-live2d/stores'

const live2dStore = useLive2d()
const scale = live2dStore.scale
const position = live2dStore.position

function onModelLoaded() {
  console.log('Model loaded successfully')
}
</script>

Model Parameters

Live2D models support various parameters for controlling appearance:
import { useLive2d, defaultModelParameters } from '@proj-airi/stage-ui-live2d/stores'

const live2dStore = useLive2d()

// Set model parameters
live2dStore.modelParameters.value = {
  // Head rotation
  angleX: 0,        // -30 to 30 degrees
  angleY: 0,        // -30 to 30 degrees
  angleZ: 0,        // -30 to 30 degrees
  
  // Eyes
  leftEyeOpen: 1,   // 0 (closed) to 1 (open)
  rightEyeOpen: 1,
  leftEyeSmile: 0,  // 0 to 1
  rightEyeSmile: 0,
  
  // Eyebrows
  leftEyebrowY: 0,
  rightEyebrowY: 0,
  leftEyebrowAngle: 0,
  rightEyebrowAngle: 0,
  
  // Mouth
  mouthOpen: 0,     // 0 (closed) to 1 (open)
  mouthForm: 0,     // -1 to 1 (smile/frown)
  
  // Body
  bodyAngleX: 0,
  bodyAngleY: 0,
  bodyAngleZ: 0,
  breath: 0         // Breathing animation
}
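
Values outside the documented ranges can cause visual glitches, so it is worth clamping before assignment. A minimal sketch (the `clamp` helper is illustrative, not part of the stage-ui API):

```typescript
// Constrain a value to a parameter's valid range before assigning it.
function clamp(value: number, min: number, max: number): number {
  return Math.min(Math.max(value, min), max)
}

// live2dStore.modelParameters.value.angleX = clamp(rawAngle, -30, 30)
// live2dStore.modelParameters.value.mouthOpen = clamp(volume, 0, 1)
```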

Motions and Animations

import { useLive2d } from '@proj-airi/stage-ui-live2d/stores'

const live2dStore = useLive2d()

// Get available motions
const motions = live2dStore.availableMotions.value
// [{ motionName: 'Idle', motionIndex: 0, fileName: 'idle_01.motion3.json' }, ...]

// Play a motion
live2dStore.currentMotion.value = {
  group: 'Idle',
  index: 0
}

// Motion groups (common conventions)
// - Idle: Default standing animation
// - Tap: Reaction to being tapped/clicked
// - Shake: Reaction to screen shake
// - Flick: Reaction to flick gesture
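
Groups and indices can be combined to vary behaviour, for example by picking a random variant from a group to avoid a repetitive idle loop. A sketch (the `pickRandomMotion` helper and `MotionEntry` shape are illustrative, modelled on the `availableMotions` entries shown above):

```typescript
interface MotionEntry {
  motionName: string
  motionIndex: number
  fileName: string
}

// Pick a random motion from a group; returns undefined if the group is empty.
function pickRandomMotion(motions: MotionEntry[], group: string): { group: string, index: number } | undefined {
  const candidates = motions.filter(m => m.motionName === group)
  if (candidates.length === 0)
    return undefined
  const chosen = candidates[Math.floor(Math.random() * candidates.length)]
  return { group: chosen.motionName, index: chosen.motionIndex }
}
```

The result can be assigned to `live2dStore.currentMotion.value` as shown above.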

Eye Tracking and Look-At

import { useLive2d } from '@proj-airi/stage-ui-live2d/stores'
import { useEyeMotions } from '@proj-airi/stage-ui-live2d/utils/eye-motions'

const live2dStore = useLive2d()

// Enable automatic eye movements
const eyeMotions = useEyeMotions()

// Manual look-at (x and y are normalized to 0..1, e.g. a cursor position)
function lookAt(x: number, y: number) {
  const angleX = (y - 0.5) * 30  // maps 0..1 to -15..15 degrees
  const angleY = (x - 0.5) * 30

  live2dStore.modelParameters.value.angleX = angleX
  live2dStore.modelParameters.value.angleY = angleY
}
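
To drive `lookAt` from the pointer, normalize the cursor position to the 0..1 range first. A sketch (the `toNormalized` helper is illustrative and assumes the stage fills the viewport):

```typescript
// Convert client coordinates to the normalized 0..1 range lookAt() expects,
// clamping so positions outside the viewport stay in range.
function toNormalized(clientX: number, clientY: number, width: number, height: number) {
  return {
    x: Math.min(Math.max(clientX / width, 0), 1),
    y: Math.min(Math.max(clientY / height, 0), 1),
  }
}

// window.addEventListener('mousemove', (e) => {
//   const { x, y } = toNormalized(e.clientX, e.clientY, window.innerWidth, window.innerHeight)
//   lookAt(x, y)
// })
```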

Blinking Animation

import { useBlinkAnimation } from '@proj-airi/stage-ui-live2d/composables/live2d'

const blink = useBlinkAnimation({
  interval: 3000,      // Time between blinks (ms)
  duration: 150,       // Blink duration (ms)
  variation: 0.3       // Randomness factor (0-1)
})

// Blink will automatically update modelParameters

Model Storage (OPFS)

import {
  registerLive2dModelToOPFS,
  loadFromOPFS
} from '@proj-airi/stage-ui-live2d/utils/live2d-opfs-registration'

// Save model to Origin Private File System
await registerLive2dModelToOPFS(modelId, zipFile)

// Load from OPFS
const modelPath = await loadFromOPFS(modelId)

VRM Models

VRM is an open, standardized format for humanoid 3D avatars, originally designed for VR applications.

VRM Model Structure

VRM models include:
  • 3D mesh: Character geometry
  • Textures: Material textures
  • Bones/Skeleton: Humanoid rig
  • Expressions: BlendShape-based facial expressions
  • Spring bones: Hair and clothing physics
  • Look-at: Eye and head tracking

Loading VRM Models

import { loadVrm } from '@proj-airi/stage-ui-three/composables/vrm'
import { Scene } from 'three'

const scene = new Scene()

const result = await loadVrm('/models/character.vrm', {
  scene: scene,
  lookAt: true,
  onProgress: (progress) => {
    console.log(`Loading: ${(progress.loaded / progress.total * 100).toFixed(0)}%`)
  }
})

if (result) {
  const { _vrm, _vrmGroup, modelCenter, modelSize, initialCameraOffset } = result
  console.log('Model loaded:', _vrm)
}

VRM Component

<template>
  <TresCanvas>
    <TresPerspectiveCamera :position="[0, 1, 3]" />
    <TresAmbientLight :intensity="0.5" />
    <TresDirectionalLight :position="[1, 1, 1]" :intensity="1" />
    
    <VRMModel
      :model-url="modelUrl"
      @loaded="onLoaded"
    />
  </TresCanvas>
</template>

<script setup lang="ts">
import type { VRM } from '@pixiv/three-vrm'
import { TresCanvas } from '@tresjs/core'
import { VRMModel } from '@proj-airi/stage-ui-three/components/Model'

const modelUrl = '/models/character.vrm'

function onLoaded(vrm: VRM) {
  console.log('VRM loaded:', vrm)
}
</script>

VRM Expressions

import { useVRMExpression } from '@proj-airi/stage-ui-three/composables/vrm'

const expression = useVRMExpression(vrm)

// Set expression (0 to 1)
expression.setExpression('happy', 1.0)
expression.setExpression('neutral', 0.0)

// Standard VRM expressions:
// - happy
// - angry
// - sad
// - relaxed
// - surprised
// - aa (mouth wide open)
// - ih (mouth slightly open)
// - ou (mouth rounded)
// - ee (mouth wide)
// - oh (mouth open)
// - blink
// - blinkLeft
// - blinkRight
// - lookUp
// - lookDown
// - lookLeft
// - lookRight
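
When driving expressions from higher-level emotion labels, it helps to zero competing expressions before applying the new one, since weights are independent and can stack. A sketch (the `emotionWeights` helper is illustrative):

```typescript
const emotionalExpressions = ['happy', 'angry', 'sad', 'relaxed', 'surprised'] as const
type Emotion = typeof emotionalExpressions[number]

// Build a weight map that sets the target emotion and zeroes the others.
function emotionWeights(target: Emotion, weight = 1.0): Record<string, number> {
  const weights: Record<string, number> = {}
  for (const name of emotionalExpressions)
    weights[name] = name === target ? weight : 0
  return weights
}

// for (const [name, value] of Object.entries(emotionWeights('happy')))
//   expression.setExpression(name, value)
```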

VRM Animations

import { useVRMAnimation } from '@proj-airi/stage-ui-three/composables/vrm'
import { VRMAnimationLoaderPlugin } from '@pixiv/three-vrm-animation'

const animation = useVRMAnimation(vrm)

// Load animation clip
const clip = await animation.loadAnimation('/animations/wave.vrma')

// Play animation
animation.play(clip, {
  loop: true,
  crossFadeDuration: 0.3
})

// Stop animation
animation.stop()

Lip Sync

import { useVRMLipSync } from '@proj-airi/stage-ui-three/composables/vrm'

const lipSync = useVRMLipSync(vrm, {
  gain: 1.0
})

// Update lip sync from audio
const audioContext = new AudioContext()
const analyser = audioContext.createAnalyser()
lipSync.update(analyser)
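
Conceptually, lip sync maps audio energy from the analyser to a mouth-open value. A simplified sketch of that mapping (the `volumeToMouthOpen` helper is illustrative; the real composable handles smoothing and gain internally):

```typescript
// Estimate loudness from byte time-domain samples (0..255, centred at 128)
// via RMS, and map it to a 0..1 mouth-open value.
function volumeToMouthOpen(samples: Uint8Array, gain = 1.0): number {
  let sum = 0
  for (const s of samples) {
    const v = (s - 128) / 128
    sum += v * v
  }
  const rms = Math.sqrt(sum / samples.length)
  return Math.min(rms * gain * 4, 1)  // scale factor chosen empirically
}

// const data = new Uint8Array(analyser.fftSize)
// analyser.getByteTimeDomainData(data)
// const mouthOpen = volumeToMouthOpen(data)
```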

Look-At Control

import { VRMLookAtQuaternionProxy } from '@pixiv/three-vrm-animation'
import { Vector3 } from 'three'

// Retrieve the look-at proxy from the scene (used by .vrma animations to drive gaze)
const lookAtProxy = vrm.scene.getObjectByName('lookAtQuaternionProxy') as VRMLookAtQuaternionProxy

// Point the eyes and head at a world-space target
const target = new Vector3(0, 1.5, 0)
vrm.lookAt?.lookAt(target)

Model Format Requirements

Live2D Requirements

  • Format: Cubism SDK 4.0+ (.model3.json)
  • Textures: PNG format, power-of-2 dimensions recommended
  • Motions: Motion3.json format
  • Package: ZIP file containing all assets
  • File Structure:
    character.zip
    ├── character.model3.json
    ├── textures/
    │   ├── texture_00.png
    │   └── texture_01.png
    ├── motions/
    │   ├── idle_01.motion3.json
    │   └── tap_01.motion3.json
    └── expressions/
        └── expressions.json
    

VRM Requirements

  • Format: VRM 0.0 or 1.0 (.vrm)
  • Rig: Humanoid bones following VRM specification
  • Textures: Embedded or external (GLTF-compatible formats)
  • Max File Size: Recommended < 50MB for web performance
  • Optimization:
    • Use texture atlases
    • Reduce polygon count for web
    • Compress textures
    • Remove unnecessary bones

Performance Optimization

Live2D Optimization

// Reduce texture resolution
const textureScale = 0.5  // 50% of original

// Limit motion update rate
const updateInterval = 1000 / 30  // 30 FPS instead of 60

// Disable physics when not visible
if (!isVisible) {
  model.physics?.disable()
}
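
The update-rate limit above can be implemented with a simple timestamp check inside the render loop. A sketch (the `createThrottle` helper is illustrative):

```typescript
// Run a callback at most once per intervalMs, driven by an external clock
// (e.g. the timestamp passed to requestAnimationFrame).
function createThrottle(intervalMs: number) {
  let last = -Infinity
  return (now: number, fn: () => void) => {
    if (now - last >= intervalMs) {
      last = now
      fn()
    }
  }
}

// const throttle = createThrottle(1000 / 30)
// function render(now: number) {
//   throttle(now, () => model.update())
//   requestAnimationFrame(render)
// }
```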

VRM Optimization

import { VRMUtils } from '@pixiv/three-vrm'

// Optimize VRM model
VRMUtils.removeUnnecessaryVertices(vrm.scene)
VRMUtils.combineSkeletons(vrm.scene)

// Disable frustum culling for VRM
vrm.scene.traverse((object) => {
  object.frustumCulled = false
})

Best Practices

  1. Test Models: Always test models in the target environment before production
  2. Optimize Assets: Compress textures and reduce polygon counts for web
  3. Handle Loading: Show loading indicators and handle errors gracefully
  4. Cache Models: Store frequently-used models in OPFS or IndexedDB
  5. Update Efficiently: Batch parameter updates and limit update frequency
  6. Resource Cleanup: Dispose of models and textures when no longer needed

Troubleshooting

Live2D Model Not Loading

// Check that the ZIP parses correctly and inspect the returned model data
console.log(await loadLive2dModel(file))

// Verify the .model3.json path
// Paths inside the ZIP should be relative to the ZIP root

VRM Model Appears Black

import * as THREE from 'three'

// Ensure proper lighting
scene.add(new THREE.AmbientLight(0xffffff, 0.5))
scene.add(new THREE.DirectionalLight(0xffffff, 1))

// Check material MToon settings
vrm.scene.traverse((obj) => {
  if (obj.material?.isMToonMaterial) {
    console.log('MToon material:', obj.material)
  }
})

Performance Issues

import Stats from 'three/examples/jsm/libs/stats.module.js'

// Monitor FPS (call stats.update() once per frame in the render loop)
const stats = new Stats()
document.body.appendChild(stats.dom)

// Reduce quality settings
renderer.setPixelRatio(Math.min(window.devicePixelRatio, 1.5))
renderer.setSize(width * 0.75, height * 0.75)
