Overview

After recording audio, React Voice Visualizer provides comprehensive playback controls with a synchronized waveform visualization. This page explains how playback works, how the audio element is managed, and how the visualization stays synchronized with playback progress.

Audio Element Management

Internal Audio Reference (v2.x.x)

Starting in version 2.x.x, the library manages the audio element (audioRef) internally. You no longer need to create or pass a ref manually.
The audio element is created when recording stops or when a preloaded blob is set:
// From useVoiceVisualizer.tsx:200
audioRef.current = new Audio();
The ref is declared in the hook:
// From useVoiceVisualizer.tsx:53
const audioRef = useRef<HTMLAudioElement | null>(null);
And returned for programmatic access:
// From useVoiceVisualizer.tsx:390
return {
  audioRef,
  // ... other controls
};

Setting the Audio Source

After the recorded blob is processed, the audio source is set:
// From useVoiceVisualizer.tsx:116-117
const audioSrcFromBlob = URL.createObjectURL(blob);
setAudioSrc(audioSrcFromBlob);
Once processing is complete, the src is assigned:
// From VoiceVisualizer.tsx:359-365
function completedAudioProcessing() {
  _setIsProcessingOnResize(false);
  _setIsProcessingAudioOnComplete(false);
  if (audioRef?.current && !isProcessingOnResize) {
    audioRef.current.src = audioSrc;
  }
}

Playback Controls

Starting Playback

The startAudioPlayback() function initiates or resumes playback:
// From useVoiceVisualizer.tsx:310-323
const startAudioPlayback = () => {
  if (!audioRef.current || isRecordingInProgress) return;

  requestAnimationFrame(handleTimeUpdate);
  startPlayingAudio();
  audioRef.current.addEventListener('ended', onEndedRecordedAudio);
  setIsPausedRecordedAudio(false);
  
  if (onStartAudioPlayback && currentAudioTime === 0) {
    onStartAudioPlayback();
  }
  if (onResumedAudioPlayback && currentAudioTime !== 0) {
    onResumedAudioPlayback();
  }
};
1. Start Time Updates: begins the requestAnimationFrame loop to track the current playback time.
2. Play Audio: calls audio.play() with error handling.
3. Add Event Listener: listens for the 'ended' event to reset playback when audio finishes.
4. Update State: sets isPausedRecordedAudio to false.
5. Trigger Callbacks: calls onStartAudioPlayback or onResumedAudioPlayback depending on the current time.
The actual play() call includes error handling:
// From useVoiceVisualizer.tsx:294-308
const startPlayingAudio = () => {
  if (audioRef.current && audioRef.current.paused) {
    const audioPromise = audioRef.current.play();
    if (audioPromise !== undefined) {
      audioPromise.catch((error) => {
        console.error(error);
        if (onErrorPlayingAudio) {
          onErrorPlayingAudio(
            error instanceof Error ? error : new Error('Error playing audio')
          );
        }
      });
    }
  }
};
The audio.play() method returns a Promise that can reject if playback is prevented (e.g., by browser autoplay policies). The error is caught and passed to the onErrorPlayingAudio callback.

Pausing Playback

The stopAudioPlayback() function pauses playback:
// From useVoiceVisualizer.tsx:325-338
const stopAudioPlayback = () => {
  if (!audioRef.current || isRecordingInProgress) return;

  if (rafCurrentTimeUpdateRef.current) {
    cancelAnimationFrame(rafCurrentTimeUpdateRef.current);
  }
  audioRef.current.removeEventListener('ended', onEndedRecordedAudio);
  audioRef.current.pause();
  setIsPausedRecordedAudio(true);
  const newCurrentTime = audioRef.current.currentTime;
  setCurrentAudioTime(newCurrentTime);
  audioRef.current.currentTime = newCurrentTime;
  if (onPausedAudioPlayback) onPausedAudioPlayback();
};
1. Cancel Animation Frame: stops the time update loop.
2. Remove Event Listener: removes the 'ended' event listener.
3. Pause Audio: calls audio.pause().
4. Update State: sets isPausedRecordedAudio to true and stores the current playback position.
5. Trigger Callback: calls onPausedAudioPlayback if provided.

Toggle Play/Pause

The togglePauseResume() function handles both recording and playback:
// From useVoiceVisualizer.tsx:340-362
const togglePauseResume = () => {
  if (isRecordingInProgress) {
    // ... recording pause/resume logic
    return;
  }

  if (audioRef.current && isAvailableRecordedAudio) {
    audioRef.current.paused ? startAudioPlayback() : stopAudioPlayback();
  }
};
This allows a single button to control both recording and playback states.

Current Time Tracking

Time Update Loop

During playback, the current time is tracked using requestAnimationFrame:
// From useVoiceVisualizer.tsx:205-211
const handleTimeUpdate = () => {
  if (!audioRef.current) return;

  setCurrentAudioTime(audioRef.current.currentTime);

  rafCurrentTimeUpdateRef.current = requestAnimationFrame(handleTimeUpdate);
};
This creates a ~60fps update loop that synchronizes the visualization with playback.
Using requestAnimationFrame instead of the audio element’s timeupdate event provides smoother, more frequent updates for visualization (useVoiceVisualizer.tsx:210).

Formatted Time Display

The hook provides pre-formatted time strings:
// From useVoiceVisualizer.tsx:60-61
const formattedRecordedAudioCurrentTime = formatRecordedAudioTime(currentAudioTime);
Example output:
  • currentAudioTime = 65.432 → formattedRecordedAudioCurrentTime = "01:05:4"
  • Format: MM:SS:D (minutes:seconds:deciseconds)
Usage:
const { formattedRecordedAudioCurrentTime, formattedDuration } = useVoiceVisualizer();

return (
  <div>
    <span>{formattedRecordedAudioCurrentTime} / {formattedDuration}</span>
  </div>
);

Duration Calculation

The audio duration is extracted from the AudioBuffer:
// From useVoiceVisualizer.tsx:123
setDuration(buffer.duration - 0.06);
The duration is adjusted by -0.06 seconds to account for small padding in decoded audio buffers.
The formatted duration:
// From useVoiceVisualizer.tsx:58
const formattedDuration = formatDurationTime(duration);
Example:
  • duration = 125.4 → formattedDuration = "02:05m"
  • Format: MM:SS with an 'm' suffix
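Likewise, a minimal sketch of a duration formatter producing the MM:SSm format above (hypothetical, for illustration only):

```typescript
// Hypothetical sketch of the MM:SSm duration format described above.
export const formatDurationTime = (duration: number): string => {
  const minutes = Math.floor(duration / 60);
  const seconds = Math.floor(duration % 60);
  const mm = String(minutes).padStart(2, "0");
  const ss = String(seconds).padStart(2, "0");
  return `${mm}:${ss}m`; // e.g. 125.4 -> "02:05m"
};
```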

Seeking (Manual Time Control)

Users can click on the waveform to jump to a specific position:
// From VoiceVisualizer.tsx:379-390
const handleRecordedAudioCurrentTime: MouseEventHandler<HTMLCanvasElement> = (e) => {
  if (audioRef?.current && canvasRef.current) {
    const newCurrentTime =
      (duration / canvasCurrentWidth) *
      (e.clientX - canvasRef.current.getBoundingClientRect().left);

    audioRef.current.currentTime = newCurrentTime;
    setCurrentAudioTime(newCurrentTime);
  }
};
The canvas has an onClick handler:
// From VoiceVisualizer.tsx:408
<canvas
  ref={canvasRef}
  onClick={handleRecordedAudioCurrentTime}
  // ...
/>
1. Calculate Click Position: converts the mouse X coordinate to a position on the canvas.
2. Calculate Time: maps the position to a timestamp: (duration / canvasWidth) * clickX.
3. Update Audio Element: sets audio.currentTime to the new position.
4. Update State: calls setCurrentAudioTime to trigger a visualization update.
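The coordinate-to-time mapping in the handler is linear. Isolated as a pure function (a hypothetical helper for illustration, not part of the library):

```typescript
// Maps a click's clientX to a playback timestamp, as in the handler above.
// canvasLeft is canvas.getBoundingClientRect().left.
const clickToTime = (
  clientX: number,
  canvasLeft: number,
  canvasWidth: number,
  duration: number
): number => (duration / canvasWidth) * (clientX - canvasLeft);
```

Clicking the midpoint of a 300px-wide canvas whose left edge is at x = 100, with a 10-second recording, seeks to 5 seconds: clickToTime(250, 100, 300, 10) returns 5.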

Synchronized Visualization

The waveform visualization updates in sync with playback:
// From drawByBlob.ts:17-39
export const drawByBlob = ({
  barsData,
  canvas,
  currentAudioTime = 0,
  duration,
  mainBarColor,
  secondaryBarColor,
  // ...
}: DrawByBlob): void => {
  const { context, height } = canvasData;

  // Calculate playback progress as percentage
  const playedPercent = currentAudioTime / duration;

  barsData.forEach((barData, i) => {
    const mappingPercent = i / barsData.length;
    const played = playedPercent > mappingPercent;

    paintLine({
      context,
      color: played ? secondaryBarColor : mainBarColor,
      // ...
    });
  });
};

Color-Based Progress

Bars change color based on whether they’ve been played:
  • Played bars (before current position): secondaryBarColor
  • Unplayed bars (after current position): mainBarColor
<VoiceVisualizer
  controls={controls}
  mainBarColor="#FFFFFF"        // Unplayed (white)
  secondaryBarColor="#5e5e5e"  // Played (gray)
/>
The transition happens smoothly as currentAudioTime updates at ~60fps.
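The per-bar color decision can be expressed as a small pure function (a hypothetical sketch of the logic in drawByBlob, not the library's exact code):

```typescript
// Picks a bar's color from its position relative to playback progress,
// mirroring the playedPercent / mappingPercent comparison above.
const pickBarColor = (
  barIndex: number,
  totalBars: number,
  currentAudioTime: number,
  duration: number,
  mainBarColor: string,
  secondaryBarColor: string
): string => {
  const playedPercent = currentAudioTime / duration;
  const mappingPercent = barIndex / totalBars;
  return playedPercent > mappingPercent ? secondaryBarColor : mainBarColor;
};
```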

Progress Indicator

A visual line tracks the current playback position:
// From VoiceVisualizer.tsx:473-502
{isProgressIndicatorShown && isAvailableRecordedAudio && !isProcessingRecordedAudio && duration ? (
  <div
    className="voice-visualizer__progress-indicator"
    style={{
      left: timeIndicatorStyleLeft < canvasCurrentWidth - 1
        ? timeIndicatorStyleLeft
        : canvasCurrentWidth - 1,
    }}
  >
    {isProgressIndicatorTimeShown && (
      <p className="voice-visualizer__progress-indicator-time">
        {formattedRecordedAudioCurrentTime}
      </p>
    )}
  </div>
) : null}
Position calculation:
// From VoiceVisualizer.tsx:392-393
const timeIndicatorStyleLeft = (currentAudioTime / duration) * canvasCurrentWidth;
This creates a vertical line that moves across the waveform as audio plays.
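Combining the position calculation with the clamp applied in the style prop gives (a hypothetical helper for illustration):

```typescript
// Indicator x-position: linear progress across the canvas, clamped so the
// line never escapes the right edge (matching the style logic above).
const indicatorLeft = (
  currentAudioTime: number,
  duration: number,
  canvasWidth: number
): number =>
  Math.min((currentAudioTime / duration) * canvasWidth, canvasWidth - 1);
```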

Playback End Handling

When audio playback completes naturally:
// From useVoiceVisualizer.tsx:364-373
const onEndedRecordedAudio = () => {
  if (rafCurrentTimeUpdateRef.current) {
    cancelAnimationFrame(rafCurrentTimeUpdateRef.current);
  }
  setIsPausedRecordedAudio(true);
  if (!audioRef?.current) return;
  audioRef.current.currentTime = 0;
  setCurrentAudioTime(0);
  if (onEndAudioPlayback) onEndAudioPlayback();
};
1. Stop Updates: cancels the time update animation frame.
2. Pause State: sets isPausedRecordedAudio to true.
3. Reset Position: sets currentTime to 0 for the next playback.
4. Trigger Callback: calls onEndAudioPlayback if provided.
The 'ended' event listener is added in startAudioPlayback:
// From useVoiceVisualizer.tsx:315
audioRef.current.addEventListener('ended', onEndedRecordedAudio);
And removed in stopAudioPlayback and clearCanvas.

State Management

The hook exposes several playback-related states:
const {
  audioRef,                              // HTMLAudioElement ref
  isPausedRecordedAudio,                 // true when paused
  isAvailableRecordedAudio,              // true when audio is ready
  currentAudioTime,                      // current position in seconds
  duration,                              // total duration in seconds
  formattedRecordedAudioCurrentTime,     // formatted current time
  formattedDuration,                     // formatted duration
  audioSrc,                              // blob URL for download
  startAudioPlayback,                    // function to play
  stopAudioPlayback,                     // function to pause
  togglePauseResume,                     // function to toggle
  setCurrentAudioTime,                   // function to seek
} = useVoiceVisualizer();

Custom Playback UI Example

const {
  audioRef,
  isAvailableRecordedAudio,
  isPausedRecordedAudio,
  currentAudioTime,
  duration,
  startAudioPlayback,
  stopAudioPlayback,
  setCurrentAudioTime,
} = useVoiceVisualizer();

if (!isAvailableRecordedAudio) return null;

return (
  <div>
    <button onClick={isPausedRecordedAudio ? startAudioPlayback : stopAudioPlayback}>
      {isPausedRecordedAudio ? 'Play' : 'Pause'}
    </button>

    <input
      type="range"
      min={0}
      max={duration}
      step={0.1}
      value={currentAudioTime}
      onChange={(e) => {
        const newTime = parseFloat(e.target.value);
        // Seek the audio element itself, then sync the hook's state
        // (mirrors the library's own canvas seek handler)
        if (audioRef.current) audioRef.current.currentTime = newTime;
        setCurrentAudioTime(newTime);
      }}
    />

    <span>{Math.floor(currentAudioTime)}s / {Math.floor(duration)}s</span>
  </div>
);

Error Handling

Playback errors are handled through the onErrorPlayingAudio callback:
const controls = useVoiceVisualizer({
  onErrorPlayingAudio: (error) => {
    console.error('Playback failed:', error);
    alert('Unable to play audio. Please try again.');
  },
});
Common playback errors:
  • NotAllowedError: Browser autoplay policy blocked playback
  • NotSupportedError: Audio format not supported
  • AbortError: Playback was interrupted
Browsers may block audio.play() if it’s not triggered by a user interaction. Always initiate playback from a click/tap handler. See Autoplay policy for details.
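One way to turn these error names into user-facing messages (a hypothetical pattern for your own app code, not a library API):

```typescript
// Hypothetical mapping from common DOMException names to friendly messages.
const playbackErrorMessage = (error: Error): string => {
  switch (error.name) {
    case "NotAllowedError":
      return "Playback was blocked by the browser. Tap play to start audio.";
    case "NotSupportedError":
      return "This audio format is not supported by your browser.";
    case "AbortError":
      return "Playback was interrupted.";
    default:
      return "Unable to play audio. Please try again.";
  }
};
```

Such a helper could be called from the onErrorPlayingAudio callback shown above to decide what to display to the user.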

Cleanup

When clearing the canvas or unmounting, audio resources are cleaned up:
// From useVoiceVisualizer.tsx:262-267
if (audioRef?.current) {
  audioRef.current.removeEventListener('ended', onEndedRecordedAudio);
  audioRef.current.pause();
  audioRef.current.src = '';
  audioRef.current = null;
}
This ensures:
  • Event listeners are removed to prevent memory leaks
  • Playback is stopped
  • The audio source is detached from the element
  • The reference is nullified

Next Steps

  • Recording: learn about audio recording and blob generation
  • Visualization: understand how waveforms are rendered
  • Custom Controls: build custom playback UI controls
  • Hook API: complete useVoiceVisualizer API reference
