useHooks.io v4.1.2

useAudioRecorder

Category: sensors

Installation

npx usehooks-cli@latest add use-audio-recorder

Description

A comprehensive hook for audio recording with real-time analysis using getUserMedia, MediaRecorder, and Web Audio APIs. Provides full control over audio recording including pause/resume, duration tracking, and optional real-time frequency analysis.

Parameters

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| options? | UseAudioRecorderOptions | - | Configuration options for the audio recorder |

Parameter Properties

options properties:

| Name | Type | Description |
| --- | --- | --- |
| audioBitsPerSecond? | number | Audio bitrate in bits per second (default: 128000) |
| mimeType? | string | MIME type for the recording (default: 'audio/webm') |
| timeslice? | number | Time interval for data chunks in milliseconds |
| enableAnalysis? | boolean | Enable real-time audio analysis (default: false) |
| fftSize? | number | FFT size for audio analysis (default: 2048) |

Return Type

UseAudioRecorderReturn
| Property | Type | Description |
| --- | --- | --- |
| isSupported | boolean | Whether audio recording is supported in the current browser |
| isRecording | boolean | Whether audio is currently being recorded |
| isPaused | boolean | Whether recording is currently paused |
| stream | MediaStream \| null | The current media stream from getUserMedia |
| mediaRecorder | MediaRecorder \| null | The MediaRecorder instance |
| audioBlob | Blob \| null | The recorded audio as a Blob |
| audioUrl | string \| null | Object URL for the recorded audio |
| duration | number | Recording duration in seconds |
| error | string \| null | Error message if any operation failed |
| analysisData | AudioAnalysisData \| null | Real-time audio analysis data (frequency, time domain, volume) |
| startRecording | () => Promise&lt;void&gt; | Start audio recording |
| stopRecording | () => void | Stop audio recording |
| pauseRecording | () => void | Pause audio recording |
| resumeRecording | () => void | Resume paused recording |
| clearRecording | () => void | Clear the current recording data |
| downloadRecording | (filename?: string) => void | Download the recorded audio file |
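Since `duration` is reported as a whole number of seconds, a small formatting helper can make it display-friendly. This `formatDuration` helper is not part of the hook; it is a hypothetical convenience sketch.

```typescript
// Hypothetical helper (not part of the hook): format a duration in
// whole seconds, as returned by useAudioRecorder, into "m:ss".
function formatDuration(totalSeconds: number): string {
  const minutes = Math.floor(totalSeconds / 60);
  // Pad seconds to two digits so that 65 seconds renders as "1:05"
  const seconds = Math.floor(totalSeconds % 60)
    .toString()
    .padStart(2, "0");
  return `${minutes}:${seconds}`;
}
```

A caller would render, for example, `<p>Duration: {formatDuration(duration)}</p>` instead of the raw seconds count used in the examples below.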

Examples

Basic Audio Recording

Simple audio recording with start/stop functionality

```tsx
import { useAudioRecorder } from '@usehooks/use-audio-recorder';

function AudioRecorder() {
  const {
    isSupported,
    isRecording,
    audioUrl,
    duration,
    startRecording,
    stopRecording,
    downloadRecording
  } = useAudioRecorder();

  if (!isSupported) {
    return <div>Audio recording not supported</div>;
  }

  return (
    <div>
      <button onClick={startRecording} disabled={isRecording}>
        Start Recording
      </button>
      <button onClick={stopRecording} disabled={!isRecording}>
        Stop Recording
      </button>
      <p>Duration: {duration}s</p>
      {audioUrl && (
        <div>
          <audio src={audioUrl} controls />
          <button onClick={() => downloadRecording('my-recording.webm')}>
            Download
          </button>
        </div>
      )}
    </div>
  );
}
```

Recording with Pause/Resume

Audio recording with pause and resume functionality

```tsx
import { useAudioRecorder } from '@usehooks/use-audio-recorder';

function AdvancedRecorder() {
  const {
    isRecording,
    isPaused,
    duration,
    startRecording,
    stopRecording,
    pauseRecording,
    resumeRecording,
    clearRecording
  } = useAudioRecorder();

  return (
    <div>
      <button onClick={startRecording} disabled={isRecording}>
        Start
      </button>
      <button onClick={pauseRecording} disabled={!isRecording || isPaused}>
        Pause
      </button>
      <button onClick={resumeRecording} disabled={!isPaused}>
        Resume
      </button>
      <button onClick={stopRecording} disabled={!isRecording}>
        Stop
      </button>
      <button onClick={clearRecording}>
        Clear
      </button>
      <p>Status: {isRecording ? (isPaused ? 'Paused' : 'Recording') : 'Stopped'}</p>
      <p>Duration: {duration}s</p>
    </div>
  );
}
```

Real-time Audio Analysis

Recording with real-time frequency analysis and volume monitoring

```tsx
import { useAudioRecorder } from '@usehooks/use-audio-recorder';

function AudioAnalyzer() {
  const {
    isRecording,
    analysisData,
    startRecording,
    stopRecording
  } = useAudioRecorder({
    enableAnalysis: true,
    fftSize: 1024
  });

  return (
    <div>
      <button onClick={startRecording} disabled={isRecording}>
        Start Analysis
      </button>
      <button onClick={stopRecording} disabled={!isRecording}>
        Stop
      </button>

      {analysisData && (
        <div>
          <p>Volume: {(analysisData.volume * 100).toFixed(1)}%</p>
          <div>
            Frequency Data: {analysisData.frequencyData.length} bins
          </div>
          <div>
            Time Domain: {analysisData.timeData.length} samples
          </div>
        </div>
      )}
    </div>
  );
}
```
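The `volume` figure in `analysisData` is an RMS value in the range [0, 1], computed from the byte time-domain samples where 128 is the zero (silence) line. A standalone sketch of that computation, useful for reasoning about volume thresholds outside the hook:

```typescript
// RMS volume from byte time-domain data, in the format produced by
// AnalyserNode.getByteTimeDomainData: values are 0-255 with 128 as the
// zero line, so a constant-128 buffer yields 0 and a full-scale
// square wave yields 1.
function rmsVolume(timeData: Uint8Array): number {
  let sum = 0;
  for (let i = 0; i < timeData.length; i++) {
    // Map [0, 255] onto roughly [-1, 1] around the 128 midpoint
    const sample = ((timeData[i] ?? 0) - 128) / 128;
    sum += sample * sample;
  }
  return Math.sqrt(sum / timeData.length);
}
```

This mirrors the calculation the hook performs internally before setting `analysisData.volume` (see the Implementation section), so an app-side threshold such as `rmsVolume(data) > 0.1` behaves the same as one applied to `analysisData.volume`.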

Dependencies

react

Notes

  • Requires user permission to access microphone
  • Browser support varies for different MIME types
  • Real-time analysis may impact performance on lower-end devices
  • Audio context is automatically cleaned up on component unmount
  • Recording automatically stops when component unmounts
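Because MIME support varies by browser (second note above), a caller may want to probe for a supported container before passing `mimeType` to the hook. Below is a hypothetical helper, not part of the hook; the support check is injected as a parameter so the logic can run outside a browser, whereas in a real page you would pass `(t) => MediaRecorder.isTypeSupported(t)`.

```typescript
// Hypothetical helper: pick the first recording MIME type the browser
// supports, falling back to "audio/webm" (the same fallback the hook
// itself uses). The predicate is injected so this is testable without
// a DOM: in the browser, pass (t) => MediaRecorder.isTypeSupported(t).
function pickMimeType(
  candidates: string[],
  isTypeSupported: (type: string) => boolean,
  fallback = "audio/webm"
): string {
  for (const type of candidates) {
    if (isTypeSupported(type)) return type;
  }
  return fallback;
}
```

Usage might look like `useAudioRecorder({ mimeType: pickMimeType(["audio/webm;codecs=opus", "audio/mp4"], (t) => MediaRecorder.isTypeSupported(t)) })`.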

Implementation

```tsx
"use client";

import { useState, useEffect, useRef, useCallback } from "react";

interface UseAudioRecorderOptions {
  audioBitsPerSecond?: number;
  mimeType?: string;
  timeslice?: number;
  enableAnalysis?: boolean;
  fftSize?: number;
}

interface AudioAnalysisData {
  frequencyData: Uint8Array;
  timeData: Uint8Array;
  volume: number;
}

interface UseAudioRecorderReturn {
  isSupported: boolean;
  isRecording: boolean;
  isPaused: boolean;
  stream: MediaStream | null;
  mediaRecorder: MediaRecorder | null;
  audioBlob: Blob | null;
  audioUrl: string | null;
  duration: number;
  error: string | null;
  analysisData: AudioAnalysisData | null;
  startRecording: () => Promise<void>;
  stopRecording: () => void;
  pauseRecording: () => void;
  resumeRecording: () => void;
  clearRecording: () => void;
  downloadRecording: (filename?: string) => void;
}

const useAudioRecorder = (
  options: UseAudioRecorderOptions = {}
): UseAudioRecorderReturn => {
  const {
    audioBitsPerSecond = 128000,
    mimeType = "audio/webm",
    timeslice,
    enableAnalysis = false,
    fftSize = 2048,
  } = options;

  const [isRecording, setIsRecording] = useState(false);
  const [isPaused, setIsPaused] = useState(false);
  const [stream, setStream] = useState<MediaStream | null>(null);
  const [mediaRecorder, setMediaRecorder] = useState<MediaRecorder | null>(
    null
  );
  const [audioBlob, setAudioBlob] = useState<Blob | null>(null);
  const [audioUrl, setAudioUrl] = useState<string | null>(null);
  const [duration, setDuration] = useState(0);
  const [error, setError] = useState<string | null>(null);
  const [analysisData, setAnalysisData] = useState<AudioAnalysisData | null>(
    null
  );

  const chunksRef = useRef<Blob[]>([]);
  const startTimeRef = useRef<number>(0);
  const pausedTimeRef = useRef<number>(0);
  const intervalRef = useRef<number | null>(null);
  const audioContextRef = useRef<AudioContext | null>(null);
  const analyserRef = useRef<AnalyserNode | null>(null);
  const sourceRef = useRef<MediaStreamAudioSourceNode | null>(null);
  const animationFrameRef = useRef<number | null>(null);

  const isSupported =
    typeof navigator !== "undefined" &&
    !!navigator.mediaDevices &&
    !!navigator.mediaDevices.getUserMedia &&
    !!window.MediaRecorder;

  const updateDuration = useCallback(() => {
    if (startTimeRef.current) {
      const elapsed = Date.now() - startTimeRef.current - pausedTimeRef.current;
      setDuration(Math.floor(elapsed / 1000));
    }
  }, []);

  const analyzeAudio = useCallback(() => {
    if (!analyserRef.current || !enableAnalysis) return;

    const frequencyData = new Uint8Array(analyserRef.current.frequencyBinCount);
    const timeData = new Uint8Array(analyserRef.current.fftSize);

    analyserRef.current.getByteFrequencyData(frequencyData);
    analyserRef.current.getByteTimeDomainData(timeData);

    // Calculate volume (RMS)
    let sum = 0;
    for (let i = 0; i < timeData.length; i++) {
      const sample = ((timeData[i] ?? 0) - 128) / 128;
      sum += sample * sample;
    }
    const volume = Math.sqrt(sum / timeData.length);

    setAnalysisData({
      frequencyData: frequencyData.slice(),
      timeData: timeData.slice(),
      volume,
    });

    if (isRecording && !isPaused) {
      animationFrameRef.current = requestAnimationFrame(analyzeAudio);
    }
  }, [isRecording, isPaused, enableAnalysis]);

  const setupAudioAnalysis = useCallback(
    (mediaStream: MediaStream) => {
      if (!enableAnalysis) return;

      try {
        audioContextRef.current = new (window.AudioContext ||
          (window as any).webkitAudioContext)();
        analyserRef.current = audioContextRef.current.createAnalyser();
        sourceRef.current =
          audioContextRef.current.createMediaStreamSource(mediaStream);

        analyserRef.current.fftSize = fftSize;
        analyserRef.current.smoothingTimeConstant = 0.8;

        sourceRef.current.connect(analyserRef.current);

        analyzeAudio();
      } catch (err) {
        console.warn("Failed to setup audio analysis:", err);
      }
    },
    [enableAnalysis, fftSize, analyzeAudio]
  );

  const startRecording = useCallback(async () => {
    if (!isSupported) {
      setError("Audio recording is not supported in this browser");
      return;
    }

    try {
      setError(null);
      const mediaStream = await navigator.mediaDevices.getUserMedia({
        audio: {
          echoCancellation: true,
          noiseSuppression: true,
          autoGainControl: true,
        },
      });

      setStream(mediaStream);
      setupAudioAnalysis(mediaStream);

      const recorder = new MediaRecorder(mediaStream, {
        audioBitsPerSecond,
        mimeType: MediaRecorder.isTypeSupported(mimeType)
          ? mimeType
          : "audio/webm",
      });

      chunksRef.current = [];

      recorder.ondataavailable = (event) => {
        if (event.data.size > 0) {
          chunksRef.current.push(event.data);
        }
      };

      recorder.onstop = () => {
        const blob = new Blob(chunksRef.current, { type: recorder.mimeType });
        setAudioBlob(blob);
        setAudioUrl(URL.createObjectURL(blob));
        setIsRecording(false);
        setIsPaused(false);

        if (intervalRef.current) {
          clearInterval(intervalRef.current);
          intervalRef.current = null;
        }

        if (animationFrameRef.current) {
          cancelAnimationFrame(animationFrameRef.current);
          animationFrameRef.current = null;
        }
      };

      recorder.onpause = () => {
        setIsPaused(true);
        pausedTimeRef.current += Date.now() - startTimeRef.current;
        if (animationFrameRef.current) {
          cancelAnimationFrame(animationFrameRef.current);
          animationFrameRef.current = null;
        }
      };

      recorder.onresume = () => {
        setIsPaused(false);
        startTimeRef.current = Date.now();
        if (enableAnalysis) {
          analyzeAudio();
        }
      };

      recorder.onerror = (event) => {
        setError(`Recording error: ${event.error?.message || "Unknown error"}`);
        setIsRecording(false);
        setIsPaused(false);
      };

      setMediaRecorder(recorder);
      recorder.start(timeslice);
      setIsRecording(true);
      startTimeRef.current = Date.now();
      pausedTimeRef.current = 0;
      setDuration(0);

      intervalRef.current = setInterval(updateDuration, 1000);
    } catch (err) {
      const errorMessage =
        err instanceof Error ? err.message : "Failed to start recording";
      setError(errorMessage);
    }
  }, [
    isSupported,
    audioBitsPerSecond,
    mimeType,
    timeslice,
    setupAudioAnalysis,
    updateDuration,
    enableAnalysis,
    analyzeAudio,
  ]);

  const stopRecording = useCallback(() => {
    if (mediaRecorder && mediaRecorder.state !== "inactive") {
      mediaRecorder.stop();
    }

    if (stream) {
      stream.getTracks().forEach((track) => track.stop());
      setStream(null);
    }

    if (audioContextRef.current) {
      audioContextRef.current.close();
      audioContextRef.current = null;
    }
  }, [mediaRecorder, stream]);

  const pauseRecording = useCallback(() => {
    if (mediaRecorder && mediaRecorder.state === "recording") {
      mediaRecorder.pause();
    }
  }, [mediaRecorder]);

  const resumeRecording = useCallback(() => {
    if (mediaRecorder && mediaRecorder.state === "paused") {
      mediaRecorder.resume();
    }
  }, [mediaRecorder]);

  const clearRecording = useCallback(() => {
    if (audioUrl) {
      URL.revokeObjectURL(audioUrl);
    }
    setAudioBlob(null);
    setAudioUrl(null);
    setDuration(0);
    setAnalysisData(null);
    setError(null);
  }, [audioUrl]);

  const downloadRecording = useCallback(
    (filename = "recording.webm") => {
      if (!audioUrl) return;

      const link = document.createElement("a");
      link.href = audioUrl;
      link.download = filename;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
    },
    [audioUrl]
  );

  // Cleanup on unmount
  useEffect(() => {
    return () => {
      if (intervalRef.current) {
        clearInterval(intervalRef.current);
      }
      if (animationFrameRef.current) {
        cancelAnimationFrame(animationFrameRef.current);
      }
      if (stream) {
        stream.getTracks().forEach((track) => track.stop());
      }
      if (audioContextRef.current) {
        audioContextRef.current.close();
      }
      if (audioUrl) {
        URL.revokeObjectURL(audioUrl);
      }
    };
  }, [stream, audioUrl]);

  return {
    isSupported,
    isRecording,
    isPaused,
    stream,
    mediaRecorder,
    audioBlob,
    audioUrl,
    duration,
    error,
    analysisData,
    startRecording,
    stopRecording,
    pauseRecording,
    resumeRecording,
    clearRecording,
    downloadRecording,
  };
};

export { useAudioRecorder };
export type {
  UseAudioRecorderOptions,
  AudioAnalysisData,
  UseAudioRecorderReturn,
};
```