2017 · Open Source

Audio Waves

Analog microphone signals visualized as digital waveforms

Java · Android · AudioRecord API · Signal Processing

Can You See Sound?

I was messing around with Android's audio APIs and realized you could get raw PCM data out of the microphone — not processed audio, the actual amplitude values at each sample point. If those are just numbers, and a canvas is just a grid of pixels, then rendering a waveform is just mapping one to the other.

The reference I had in mind was an oscilloscope. That line that reacts instantly and honestly to whatever sound is happening. I wanted that on a phone screen.

PCM Out, Waveform In

AudioRecord gives you a buffer of 16-bit PCM samples — values between -32768 and 32767 representing amplitude over time. Read those on a background thread, normalize them to the height of a Canvas view, and draw a Path connecting each sample as a point. That's the core of it.
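When AudioRecord hands back raw bytes rather than a short[], the first step is decoding them into amplitude values. A minimal sketch (assuming little-endian 16-bit PCM, which is what Android's ENCODING_PCM_16BIT produces; the class name is illustrative):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class PcmDecode {
    // Decode a little-endian 16-bit PCM byte buffer into signed amplitude samples.
    static short[] toSamples(byte[] raw) {
        short[] out = new short[raw.length / 2];
        ByteBuffer.wrap(raw)
                .order(ByteOrder.LITTLE_ENDIAN)
                .asShortBuffer()
                .get(out);
        return out;
    }
}
```

Each pair of bytes becomes one sample in the -32768..32767 range described above.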

The hard part wasn't the rendering. It was keeping the audio thread and the UI thread from stepping on each other — AudioRecord runs on its own thread, and Android's UI won't let you touch views from anywhere except the main thread. Handler/Looper bridges that gap.

Record → Buffer → Draw

A producer-consumer pipeline. AudioRecord fills a circular buffer with PCM samples on a background thread. On each UI frame, the Canvas view reads the buffer and draws the waveform. Handler posts from the audio thread to the UI thread.
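The shape of that pipeline can be sketched in plain Java, with a BlockingQueue standing in for the Handler.post hand-off (on Android the consumer side runs on the main thread; all names here are illustrative):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PipelineSketch {
    // Consume n chunks from the queue, returning the total samples "drawn".
    static int drainChunks(BlockingQueue<short[]> chunks, int n) {
        int total = 0;
        try {
            for (int i = 0; i < n; i++) {
                total += chunks.take().length;  // stand-in for the UI-thread draw
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return total;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<short[]> chunks = new ArrayBlockingQueue<>(8);
        // Producer stands in for the AudioRecord thread filling PCM buffers.
        Thread audioThread = new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) chunks.put(new short[512]);
            } catch (InterruptedException ignored) { }
        });
        audioThread.start();
        System.out.println(drainChunks(chunks, 3)); // 1536
        audioThread.join();
    }
}
```

The point of the queue (or Handler) is that the audio thread never touches a View and the UI thread never blocks on the microphone.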

01
Mic Permission

Runtime RECORD_AUDIO permission with a rationale dialog before AudioRecord starts

02
AudioRecord Init

44100 Hz sample rate, 16-bit PCM, mono channel — the configuration that works across the most devices

03
Circular Buffer

Thread-safe buffer holding the last N samples. Acts as a scrolling window of recent audio data.

04
Canvas Renderer

Custom View reads the buffer, draws a continuous Path with each sample as a Y coordinate scaled to view height

05
Controls

Record/stop button, gain slider for amplitude scaling, freeze-frame to pause the waveform
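The scrolling window from step 03 can be sketched as a fixed-size ring of shorts. This is a hedged illustration, not the project's actual class: writes overwrite the oldest samples, and snapshot() hands the renderer a stable copy, oldest sample first.

```java
public class SampleRing {
    private final short[] data;
    private int writePos = 0;

    public SampleRing(int capacity) {
        data = new short[capacity];
    }

    // Called from the audio thread with each new PCM chunk.
    public synchronized void write(short[] chunk) {
        for (short s : chunk) {
            data[writePos] = s;
            writePos = (writePos + 1) % data.length;
        }
    }

    // Called from the UI thread; returns the last N samples in time order.
    public synchronized short[] snapshot() {
        short[] out = new short[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = data[(writePos + i) % data.length];
        }
        return out;
    }
}
```

Copying in snapshot() costs an allocation per frame but keeps the renderer from ever seeing a half-written buffer.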

Under the Hood

Android AudioRecord

Low-level API, direct PCM access at 44100 Hz, 16-bit. Runs on a dedicated thread. getMinBufferSize() gives the safe minimum buffer — used 2x that for headroom.

Custom Canvas View

Overrides onDraw(), builds a Path by iterating through the PCM buffer and plotting each sample as an (x, amplitude) point. Invalidated on each new audio chunk.
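The per-sample mapping can be sketched without Android classes. A hypothetical helper that computes the (x, y) coordinates a Path would connect, assuming x spreads the samples evenly across the view width and amplitude 0 sits at the vertical center:

```java
public final class PathPoints {
    // Convert a PCM buffer into interleaved (x, y) screen coordinates.
    // Screen Y grows downward, so positive amplitude maps above center.
    static float[] toPoints(short[] samples, int width, int height) {
        float[] pts = new float[samples.length * 2];
        float xStep = width / (float) (samples.length - 1);
        float center = height / 2f;
        for (int i = 0; i < samples.length; i++) {
            pts[2 * i] = i * xStep;
            pts[2 * i + 1] = center - (samples[i] / 32768f) * center;
        }
        return pts;
    }
}
```

In onDraw() the first pair would seed Path.moveTo() and the rest feed Path.lineTo().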

Handler/Looper

AudioRecord thread posts new buffer data to the main thread via Handler. Keeps audio reads off the UI thread, canvas updates on it.

Signal Scaling

PCM values (-32768 to 32767) normalized to view height. Gain multiplier adjustable via slider — necessary because quiet audio is otherwise invisible.

What Made It Hard

  • AudioRecord's minimum buffer size isn't a fixed number — it depends on the device's audio hardware. Had to call AudioRecord.getMinBufferSize() at runtime and then double it, because using the exact minimum on some devices caused dropouts.
  • Rendering at 60fps while continuously reading audio data on a separate thread put real pressure on older hardware. Had to profile and reduce unnecessary object allocations inside the draw loop to keep the frame rate stable.
  • Quiet sounds produced an almost flat line; loud sounds clipped the view entirely. Dynamic gain — a simple multiplier that adjusts based on the recent peak amplitude — made the waveform readable across a much wider range of input levels.
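The dynamic-gain idea from the last bullet can be sketched as a peak follower: track the recent peak amplitude, let it decay so the gain can recover after loud passages, and return a multiplier that scales that peak to most of full range. Names and constants here are illustrative, not the project's actual tuning.

```java
public final class DynamicGain {
    private float peak = 1f;    // recent peak amplitude, on the 0..32767 scale
    private final float decay;  // per-chunk factor that lets old peaks fade

    public DynamicGain(float decay) {
        this.decay = decay;
    }

    // Update the peak estimate from a new chunk and return the gain that
    // would scale that peak to ~90% of full scale.
    public float update(short[] chunk) {
        peak *= decay;
        for (short s : chunk) {
            float a = Math.abs((float) s);
            if (a > peak) peak = a;
        }
        return (0.9f * 32768f) / Math.max(peak, 1f);
    }
}
```

A quiet chunk yields a large multiplier (the flat line becomes visible), while a loud chunk pulls the gain back toward 1 so the waveform stops clipping the view.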