Audiovisual
Map sound behavior into visuals and performance-oriented output systems.
- OSC and MIDI visual control
- audio-reactive graphics
- Blender / Three.js / TouchDesigner
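To make "OSC visual control" concrete, here is a minimal Python sketch that hand-encodes a single-float OSC 1.0 message using only the standard library. The address `/kick/level` and port `9000` are illustrative assumptions; in a real setup a library such as python-osc would handle the encoding for you.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_float_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one float32 argument."""
    return (
        osc_pad(address.encode("ascii"))   # address pattern, e.g. /kick/level
        + osc_pad(b",f")                   # type tag string: one float
        + struct.pack(">f", value)         # big-endian float32 argument
    )

# Sent over UDP, such a packet would drive one visual parameter:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_float_message("/kick/level", 0.8), ("127.0.0.1", 9000))
```

Every OSC field is padded to a multiple of four bytes, which is why the message length is always divisible by four.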
Turn disparate layers (modular generative sound, DAW routing, and the visual engine) into a single coherent, optimized live scene.
Theory, structure, and practical context are all driven from content files.
Concrete repository anchors already exist for this lesson track.
By the end of this lesson, you should understand how a complete performance setup fits together. Performance design is the culmination of the course: the final layer where all the knowledge gained in the previous modules comes together. When all of it works harmoniously and can be performed in real time, we call it a “Live Scene”.
A working (stable) scene is rarely a “mess” of plugins. It has a clear division of roles:
```mermaid
graph TD
    subgraph CONTROL[Performance Control & Sync]
        MIDI[MIDI Controller / Macro Knobs]
        CLOCK[Master Clock & Transport]
    end
    subgraph AUDIO[Audio Flow]
        VCV[VCV Rack Generative Engine]
        MIX[DAW Post-Effects & Mixing]
    end
    subgraph VISUAL[Visual Flow]
        VIS[Visual Engine]
        OUT((Screen Output))
    end
    CLOCK ==>|BPM / Play / Stop| VCV
    CLOCK ==>|Sync| VIS
    MIDI -.->|Macro CCs| VCV
    MIDI -.->|Effect Sends| MIX
    VCV ==>|Dry Multitrack Audio| MIX
    VCV -.->|OSC Data Mapping| VIS
    VIS ==>|Render| OUT
    classDef signal fill:#1A202C,stroke:#2D3748,stroke-width:2px,color:#E2E8F0;
    classDef accent fill:#2C7A7B,stroke:#319795,stroke-width:2px,color:#E6FFFA;
    classDef logic fill:#9B2C2C,stroke:#C53030,stroke-width:2px,color:#FFF5F5;
    classDef env fill:none,stroke:#4A5568,stroke-width:1px,stroke-dasharray: 2 2;
    class CONTROL,AUDIO,VISUAL env;
    class CLOCK,MIDI logic;
    class VCV,MIX signal;
    class VIS,OUT accent;
```
- A Single Synchronization Source (Master Clock): usually a DAW (e.g. Ableton) or a hardware sequencer sets the tempo (BPM) and sends Play/Stop commands to every other node (the visuals and the virtual modular).
- An Evolving Generative Engine (VCV Rack): a patch that generates the core of the track autonomously while still letting you manipulate its timbre.
- A Mixing and Effects Layer (Post-Effects in the DAW): audio channels return to the DAW and are distributed into subgroups (Kick, Bass, Textures) to balance volumes and apply compressors, delays, and reverbs.
- A Calibrated Visual Layer: receives OSC/CV signals from the sound core and reacts only where necessary, without overloading the graphics card.
- Performance Controls (the Performance Desk): a physical MIDI controller with a few “macro knobs” that steer the most prominent parameters of the system.
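The master clock's role above can be sketched in a few lines of Python. The `MasterClock` class and its listener callbacks are hypothetical names, standing in for whatever sync mechanism (MIDI clock, Ableton Link, OSC) your setup actually uses.

```python
def beat_interval_s(bpm: float) -> float:
    """Seconds per beat at a given tempo: 60 / BPM."""
    return 60.0 / bpm

class MasterClock:
    """Single sync source: fans tempo and beat events out to every node."""

    def __init__(self, bpm: float):
        self.bpm = bpm
        self.listeners = []  # e.g. the modular engine and the visual engine

    def subscribe(self, fn):
        """Register a callback receiving (beat_number, bpm) on every tick."""
        self.listeners.append(fn)

    def tick(self, beat: int):
        """Broadcast one beat to all subscribed nodes."""
        for fn in self.listeners:
            fn(beat, self.bpm)
```

The point of the pattern is that nothing except the clock owns tempo: the modular patch and the visuals both subscribe, so they can never drift apart.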
Without structured scene design, a patch can be interesting on paper yet unfit for a live performance.
In the dark of a club, under pressure, you won’t be able to open modules on the screen and connect virtual cables with a mouse. You must control the music like a ship’s captain: via a few grouped knobs (faders) that produce a powerful musical/creative result. This is called a “safe performance interface”.
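A “safe performance interface” boils down to a fan-out: one physical control drives several parameters at once. Here is a minimal Python sketch; the parameter names and ranges are illustrative assumptions, not a prescribed mapping.

```python
def scale(cc: int, lo: float, hi: float) -> float:
    """Map a 7-bit MIDI CC value (0-127) into a parameter range."""
    return lo + (cc / 127.0) * (hi - lo)

class Macro:
    """One physical knob driving several parameters at once."""

    def __init__(self, targets):
        self.targets = targets  # parameter name -> (lo, hi) range

    def apply(self, cc: int):
        """Turn one CC value into a dict of scaled parameter values."""
        return {name: scale(cc, lo, hi) for name, (lo, hi) in self.targets.items()}

# One "intensity" macro sweeping sound and visuals together
# (hypothetical parameter names):
intensity = Macro({
    "filter_cutoff_hz": (200.0, 8000.0),
    "delay_feedback":   (0.1, 0.9),
    "visual_bloom":     (0.0, 1.0),
})
```

Because each macro touches audio and visual parameters in one gesture, a handful of knobs is enough to move the whole scene.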
An audiovisual performance is an incredible stress test for a laptop: generative calculations hit the CPU (processor), while graphics load the GPU (video card). An on-stage crash means an interrupted show.
Rules for Playing Live:
- “Let me map the entire patch onto this 64-button panel.” On stage, under stress, you will forget 90% of your mapping. Use at most 4-8 physical controls, and make sure each of them (the macros) produces a tectonic shift in the sound or visuals.
- You crank the Delay Feedback and Distortion to the max (an epic drop!), but how do you return to a clean, pulsing rhythm? Beginners often forget where the knobs were originally set. In Ableton or your MIDI scripts, always configure a return to a “zero/quiet scene”, and bind an instant kill for aggressive effects to an “undo button”.
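The “zero/quiet scene” idea can be sketched as a parameter snapshot captured before the show and recalled instantly. The class and parameter names below are illustrative, not a real DAW API.

```python
class ZeroScene:
    """Snapshot of a 'clean' parameter state, recalled by the 'undo button'."""

    def __init__(self, safe_params: dict):
        self.safe = dict(safe_params)  # captured during soundcheck

    def panic(self, live_params: dict) -> dict:
        """Instantly overwrite the live state with the safe snapshot."""
        live_params.update(self.safe)
        return live_params
```

Binding `panic()` to a single button means recovery from any drop is one press, not a frantic search through knob positions.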
Your final assignment for this part of the course: Sketch the architecture of your future live audiovisual scene on a virtual whiteboard (or paper). Write down 4 blocks:
Generative Stress Test: Imagine the worst — your MIDI controller suddenly disconnects via USB. Will your patch “survive” and continue to play autonomously using generative algorithms for another 2-3 minutes while you restart the connection? If the music collapses into a single screeching note without the controller, the patch is too dependent on human hands. A good generative system always breathes on its own.
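The stress test can even be prototyped in code: below is a hedged Python sketch of a watchdog that hands a parameter over to an internal LFO when the controller goes silent. The class name, the 2-second timeout, and the sine LFO are arbitrary choices for illustration.

```python
import math

class AutonomyGuard:
    """Falls back to an internal LFO when the controller has been silent too long."""

    def __init__(self, timeout_s: float = 2.0):
        self.timeout_s = timeout_s
        self.last_cc_time = 0.0
        self.last_cc_value = 0.5  # neutral default before any input

    def on_cc(self, t: float, value: float):
        """Record a controller event at time t (seconds)."""
        self.last_cc_time = t
        self.last_cc_value = value

    def value(self, t: float) -> float:
        """Current parameter value: human input if recent, else generative LFO."""
        if t - self.last_cc_time < self.timeout_s:
            return self.last_cc_value  # human still in control
        # Autonomous fallback: a slow sine keeps the patch "breathing" on its own.
        return 0.5 + 0.5 * math.sin(0.25 * t)
```

If every performance-critical parameter has a fallback like this, an unplugged controller degrades the show gracefully instead of stopping it.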
Congratulations! You have mastered the core concepts of building modern audiovisual environments: from cables and oscillators to preparing a full-fledged multimedia live scene. Now it’s time to experiment!
Use the linked patch entries below as concrete repository anchors for this lesson track.
Adjacent lessons in the same track keep the topic progression coherent.
The first system diagram connects the modular engine, DAW layer, and visual output layer.