# Audio DSP Skill

Expert knowledge for implementing audio digital signal processing in web applications.
## When to Use

- Implementing audio effects (reverb, delay, chorus)
- Real-time audio processing with the Web Audio API
- FFT analysis and visualization
- Convolution-based effects
- AudioWorklet development
## Web Audio API Patterns

### Basic Audio Context

```javascript
const audioContext = new AudioContext();
const analyser = audioContext.createAnalyser();
analyser.fftSize = 2048;
```
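With `fftSize = 2048`, the analyser exposes `fftSize / 2 = 1024` frequency bins, each covering `sampleRate / fftSize` Hz. A small helper sketching that mapping (the function name is illustrative, not part of the Web Audio API):

```javascript
// Map an FFT bin index to its frequency in Hz.
// binIndex 0 is DC; each bin spans sampleRate / fftSize Hz.
function binToFrequency(binIndex, fftSize, sampleRate) {
  return (binIndex * sampleRate) / fftSize;
}

// At a 48 kHz sample rate with fftSize = 2048, each bin spans 23.4375 Hz:
const resolutionHz = binToFrequency(1, 2048, 48000); // 23.4375
```

A larger `fftSize` gives finer frequency resolution at the cost of slower time response, which is the usual trade-off when picking 2048 for visualization.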
### Audio Worklet (Real-time Processing)

```javascript
// Register the processor module, then instantiate a node that runs it
await audioContext.audioWorklet.addModule('processor.js');
const node = new AudioWorkletNode(audioContext, 'my-processor');
```
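Inside `processor.js`, an `AudioWorkletProcessor` subclass receives 128-sample render quanta in its `process()` callback. A minimal sketch of the per-quantum kernel such a processor would run (the function name and gain parameter are illustrative, not part of any API):

```javascript
// Per-quantum kernel: copy input to output with a gain applied.
// In a real processor this loop lives inside process(inputs, outputs),
// operating on inputs[0][channel] and outputs[0][channel].
function processQuantum(input, output, gain) {
  for (let i = 0; i < input.length; i++) {
    output[i] = input[i] * gain;
  }
  return true; // process() returns true to keep the node alive
}

const input = new Float32Array([0.5, -0.5, 1.0]);
const output = new Float32Array(3);
processQuantum(input, output, 0.5);
// output is now [0.25, -0.25, 0.5]
```

Keeping the kernel allocation-free, as above, matters: `process()` runs on the real-time audio thread, where garbage-collection pauses cause glitches.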
## Common Algorithms

### Convolution Reverb

```javascript
const convolver = audioContext.createConvolver();
const response = await fetch('/impulse-response.wav');
convolver.buffer = await audioContext.decodeAudioData(await response.arrayBuffer());
```
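`ConvolverNode` convolves the input signal with the impulse response internally. The math it implements is equivalent to this direct O(n·m) convolution, shown purely for illustration; real-time engines use FFT-based partitioned convolution instead of a naive double loop:

```javascript
// Direct convolution of a signal with an impulse response.
// Output length is signal.length + impulse.length - 1.
function convolve(signal, impulse) {
  const out = new Float32Array(signal.length + impulse.length - 1);
  for (let i = 0; i < signal.length; i++) {
    for (let j = 0; j < impulse.length; j++) {
      out[i + j] += signal[i] * impulse[j];
    }
  }
  return out;
}

// A unit impulse leaves the signal unchanged (the "dry" case):
const dry = new Float32Array([1, 0.5, -0.25]);
const wet = convolve(dry, new Float32Array([1]));
```

An impulse response recorded in a real room, as fetched above, encodes that room's reflections, so convolving with it places the dry signal "inside" the room.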
### FFT Analysis

```javascript
const dataArray = new Float32Array(analyser.frequencyBinCount);
analyser.getFloatFrequencyData(dataArray);
```
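`getFloatFrequencyData` fills the array with magnitudes in decibels (silence approaches `-Infinity`). A common next step is finding the loudest bin and converting it to a frequency; this helper is a sketch with illustrative names, not an API call:

```javascript
// Return the frequency (Hz) of the loudest bin in an FFT frame.
function peakFrequency(dataArray, fftSize, sampleRate) {
  let peakBin = 0;
  for (let i = 1; i < dataArray.length; i++) {
    if (dataArray[i] > dataArray[peakBin]) peakBin = i;
  }
  return (peakBin * sampleRate) / fftSize;
}

// Simulated frame: mostly -100 dB noise floor with a -20 dB peak in bin 32.
// With a 2048-point FFT at 48 kHz, bin 32 maps to 750 Hz:
const bins = new Float32Array(1024).fill(-100);
bins[32] = -20;
const freq = peakFrequency(bins, 2048, 48000); // 750
```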
## Knowledge Base Resources

Query for DSP algorithms:

```sql
SELECT * FROM algorithm WHERE tags CONTAINS 'audio' OR tags CONTAINS 'dsp';
SELECT * FROM paper WHERE abstract @@ 'convolution reverb impulse';
```
## Performance Guidelines

- Use AudioWorklet for real-time processing
- Avoid blocking the main thread while audio is running
- Buffer sizes: 128-512 frames for low latency
- Sample rate: match the device (usually 44100 or 48000 Hz)
- Use SharedArrayBuffer for worklet communication
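The last guideline can be sketched as a single-producer/single-consumer ring buffer over a `SharedArrayBuffer`: the main thread pushes samples, the worklet pops them, and no per-block `postMessage` is needed. All names and the capacity here are illustrative assumptions:

```javascript
// Lock-free SPSC ring buffer shared between two threads.
// state[0] = read index, state[1] = write index; Atomics makes the
// index updates visible across the main thread and the worklet.
const CAPACITY = 1024;
const state = new Int32Array(new SharedArrayBuffer(2 * 4));
const samples = new Float32Array(new SharedArrayBuffer(CAPACITY * 4));

function push(value) {
  const write = Atomics.load(state, 1);
  const next = (write + 1) % CAPACITY;
  if (next === Atomics.load(state, 0)) return false; // buffer full
  samples[write] = value;
  Atomics.store(state, 1, next);
  return true;
}

function pop() {
  const read = Atomics.load(state, 0);
  if (read === Atomics.load(state, 1)) return null; // buffer empty
  const value = samples[read];
  Atomics.store(state, 0, (read + 1) % CAPACITY);
  return value;
}
```

In a real app the two `SharedArrayBuffer`s would be passed to the worklet via the `AudioWorkletNode` constructor's `processorOptions`, and the page must be cross-origin isolated for `SharedArrayBuffer` to be available.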
React Integration
// Custom hook pattern const useAudioAnalyser = (audioContext: AudioContext) => { const analyserRef = useRef<AnalyserNode>();
useEffect(() => { analyserRef.current = audioContext.createAnalyser(); return () => analyserRef.current?.disconnect(); }, [audioContext]);
return analyserRef; };
## Related KB Queries

```sql
-- Find reverb implementations
SELECT * FROM knowledge WHERE content @@ 'reverb implementation';

-- Get validated algorithms
SELECT * FROM algorithm WHERE success_rate > 0.8;
```