Audiovisual
Map sound behavior into visuals and performance-oriented output systems.
- OSC and MIDI visual control
- audio-reactive graphics
- Blender / Three.js / TouchDesigner
Define control protocols. Learn how to transmit data between a modular sound engine and an external environment to generate visuals.
By the end of this lesson, you should understand why audiovisual setups need a dedicated transport layer and when to reach for MIDI versus OSC.
To connect sound and visuals that live in different software ecosystems (e.g., VCV Rack and TouchDesigner/WebGL), you need a reliable "transport layer". Even the most brilliant patch won't drive 3D graphics convincingly if its commands arrive late or at too low a resolution.
Audiovisual projects typically rely on two protocols:
Without a clear transport model and mapping documentation, an audiovisual setup becomes improvised and fragile. Setting up stable communication channels is the foundation without which you cannot start “forcing” pixels to move to the music.
**MIDI**
Transmits discrete events: "Note C3 pressed, velocity 100" or CC (Continuous Control) messages.
- Pros: universal, supported by hardware and DAWs out of the box; excellent for triggers and quantized commands.
- Cons: low resolution (a CC value has only 128 steps), so slowly modulating a visual parameter produces visible stepping; limited by channel count and physical cables.
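To see why 128 steps matter, here is a small illustrative sketch (not tied to any particular software) that quantizes a smooth 0–1 sweep down to 7-bit CC values, the way a MIDI-controlled visual parameter would be:

```python
# Illustrative sketch: why a 7-bit MIDI CC causes visible stepping
# when a parameter moves slowly. Values and names are hypothetical.

def to_midi_cc(x: float) -> int:
    """Quantize a normalized 0.0-1.0 value to a 7-bit CC value (0-127)."""
    return max(0, min(127, round(x * 127)))

def from_midi_cc(cc: int) -> float:
    """Map the CC value back to the normalized range the visual engine sees."""
    return cc / 127

# A slow LFO sweep: 1000 smooth input values collapse to far fewer steps.
sweep = [i / 999 for i in range(1000)]
cc_values = [to_midi_cc(x) for x in sweep]
distinct_steps = len(set(cc_values))

print(distinct_steps)    # only 128 distinct levels survive the round trip
print(from_midi_cc(64))  # ~0.504: the value nearest to 0.5 that MIDI can express
```

For sharp triggers (flashes, scene cuts) this resolution is irrelevant; it only hurts when a value glides slowly across its range.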
**OSC (Open Sound Control)**
Works over network protocols (UDP/TCP) and uses human-readable address paths: for example, /synth/bass/filter carrying the value 0.85231.
- Pros: very high resolution (32-bit float values), so ultra-smooth generator curves and LFOs transmit without stepping; travels over a local network or Wi-Fi (you can send data from the musician's laptop to a powerful visualizer PC).
- Cons: requires network port configuration; not all hardware understands OSC natively.
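As a sketch of what an OSC message actually looks like on the wire, the snippet below encodes a single-float message using only the Python standard library (real projects usually reach for a library such as python-osc; the address and port here are placeholders):

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one float argument."""
    def pad(s: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return s + b"\x00" * (4 - len(s) % 4)
    # Address path, then the type-tag string ",f", then a big-endian float32.
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

packet = osc_message("/synth/bass/filter", 0.85231)

# Sending is a single UDP datagram -- no connection handshake needed.
# (127.0.0.1 and 8000 are examples; match them to the receiver's Listen Port.)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 8000))
sock.close()
```

Note that the float is 32-bit, which is where OSC's smooth resolution comes from compared with a 7-bit MIDI CC.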
In OSC, IP addresses and ports play a critical role. Beginners often forget that if VCV Rack sends data to port 8000, the visual receiver must also be set to listen on port 8000.
If you send audio-rate data (44,100 values per second) over OSC, the network will instantly choke and freeze. Reserve OSC for slowly changing parameter modulations.
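One common fix is to throttle outgoing updates to roughly the display's refresh rate before they hit the network. A minimal sketch of that idea (the class name and 60 Hz default are my own choices, not from any library):

```python
import time

class OscThrottle:
    """Forward parameter updates at most `rate_hz` times per second.

    Audio-rate streams (44,100 values/s) would flood the network, while
    visuals rarely need more than ~60 updates per second anyway.
    """
    def __init__(self, rate_hz=60.0):
        self.interval = 1.0 / rate_hz
        self.last_sent = float("-inf")

    def push(self, value, now=None):
        """Return the value if enough time has passed to send it, else None."""
        now = time.monotonic() if now is None else now
        if now - self.last_sent >= self.interval:
            self.last_sent = now
            return value
        return None

# Simulate 1000 audio-rate samples (~22.7 ms of audio at 44.1 kHz):
throttle = OscThrottle(rate_hz=60)
sent = [v for t in range(1000)
        if (v := throttle.push(t / 1000, now=t / 44100)) is not None]
print(len(sent))  # only 2 messages actually go out
```

Dropped intermediate values are harmless here because the visual engine only ever needs the most recent parameter state, not the full history.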
Set the receiver's IP address to 127.0.0.1 (if everything runs on one computer) and choose a port (like 7000). Give each signal a clear OSC address, such as /vcv/lfo. Then create a mini-table ("Mapping List") with any three connections in your project:
Example Mapping List:
graph LR
subgraph PROTOCOL[Transport Layer]
MIDI_GATE[MIDI Gate]
OSC_LFO[OSC Float / LFO]
MIDI_CC[MIDI CC Knob]
end
subgraph VISUALS[Visual Engine]
BLOOM[Bloom Intensity]
CAM[Camera Rotation]
PART[Particle Decay]
end
MIDI_GATE ==>|Sharp Flash| BLOOM
OSC_LFO -.->|Smooth Pan| CAM
MIDI_CC -.->|Density Control| PART
classDef signal fill:#1A202C,stroke:#2D3748,stroke-width:2px,color:#E2E8F0;
classDef visual fill:#2C7A7B,stroke:#319795,stroke-width:2px,color:#E6FFFA,stroke-dasharray: 4 4;
classDef env fill:none,stroke:#4A5568,stroke-width:1px,stroke-dasharray: 2 2;
class MIDI_GATE,OSC_LFO,MIDI_CC signal;
class BLOOM,CAM,PART visual;
class PROTOCOL,VISUALS env;
Think about which protocol would best transmit each of these three signals.
Try connecting your smartphone to your computer via an OSC app (e.g., TouchOSC). You can build your own interface with faders on the phone and send their data to the audiovisual setup over Wi-Fi, controlling sound and visuals straight from the phone screen.
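On the receiving side, the computer just parses incoming UDP datagrams. A stdlib-only sketch of decoding the simplest case, one fader sending one float (the address /fader/1 and the socket comment are illustrative, and real apps send more message types than this handles):

```python
import struct

def parse_osc(datagram: bytes):
    """Parse a minimal OSC message containing a single float argument.

    Only the subset needed here is handled: an address path, a ",f"
    type-tag string, and one big-endian float32.
    """
    end = datagram.index(b"\x00")
    address = datagram[:end].decode()
    # Strings are padded to 4-byte boundaries; skip the address and its padding.
    offset = (end // 4 + 1) * 4
    tags = datagram[offset:offset + 4]  # expect b",f\x00\x00"
    if not tags.startswith(b",f"):
        raise ValueError("only single-float messages are handled in this sketch")
    (value,) = struct.unpack(">f", datagram[offset + 4:offset + 8])
    return address, value

# In a real setup you would bind a UDP socket on the Listen Port, e.g.
# sock.bind(("0.0.0.0", 8000)), and call parse_osc on every received datagram.
sample = b"/fader/1\x00\x00\x00\x00,f\x00\x00" + struct.pack(">f", 0.5)
print(parse_osc(sample))  # ('/fader/1', 0.5)
```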
Once the channels are established and we know how to communicate with visual tools, it is time to put everything together: setting up the stage and preparing a complete audiovisual live "Scene", which we will analyze in the final lesson of this section.
Use the linked patch entries below as concrete repository anchors for this lesson track.
The first system diagram connects the modular engine, DAW layer, and visual output layer.