When I first saw this audio/visual generative performance piece by sandufi on Instagram, I was interested in how stylized image generation could be done in real time, so I started investigating.
Turns out it was done in TouchDesigner with a LOT of Python scripting. From my research, the system starts with setting up multiple cameras and computers for the various inputs and processing, since real-time image diffusion is an extremely resource-intensive task. He first collected motion-tracking data (body pose and face), then used it as ControlNet conditioning to control how accurately the humanoid forms (in this case, the cement buildings) come through in the image diffusion models. Some of his models were custom-trained to enable live deepfakes and body transformations, stylized and guided by text prompts. The rest of the high-level process, I believe, is handled between Chataigne and TouchDesigner.
I’m aware of a real-time image diffusion asset in TouchDesigner called StreamDiffusion, but sandufi opted for Chataigne, which, like ComfyUI, lets different AI tools interact with TouchDesigner.
https://editor.p5js.org/XXHYZ/sketches/Bx8nDj164
I set up a grid of particles, used Perlin noise to deform it over time, then added my index finger as an attraction field that moves the particles around.
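The full code is in the sketch link above, but here's a simplified sketch of the setup stage, using the same names (points, index, and the tuning constants) that the update loop below expects. The ml5.js handPose tracker, the grid dimensions, and the constant values here are stand-ins, not necessarily what the actual sketch uses.

let points = [];
let index = { x: 0, y: 0 }; // index fingertip, updated by the hand tracker
let handPose, video;

// Tuning constants (placeholder values)
let noiseScale = 0.01;
let noiseSpeed = 0.005;
let pointSpeed = 1.5;
let forceStrength = 5;

function preload() {
  handPose = ml5.handPose(); // ml5.js hand-tracking model
}

function setup() {
  createCanvas(600, 600, WEBGL);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  handPose.detectStart(video, gotHands);

  // Build a flat grid of points centered on the WEBGL origin
  for (let x = -200; x <= 200; x += 10) {
    for (let y = -200; y <= 200; y += 10) {
      points.push(createVector(x, y, 0));
    }
  }
}

function gotHands(results) {
  if (results.length > 0) {
    index = results[0].keypoints[8]; // keypoint 8 is the index fingertip
  }
}

function draw() {
  background(0);
  stroke(255);
  strokeWeight(3);
  // ...the per-frame update loop below runs here...
  for (let p of points) {
    point(p.x, p.y, p.z);
  }
}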
Mapping my 2D hand-gesture x/y locations into the sketch's 3D coordinate space was pretty counter-intuitive, so instead I just offset the values: in WEBGL mode the origin sits at the center of the canvas, so subtracting half the width and height recenters the fingertip position. I reused code I had for mapping mouse2D to mouse3D and just swapped in my index finger's x/y position:
let hand3D = createVector(index.x - width / 2, index.y - height / 2, 0); // recenter the fingertip into WEBGL coordinates

// Iterate over all points
for (let p of points) {
  // Generate movement direction based on Perlin noise
  let nX = noise(p.x * noiseScale, p.y * noiseScale, p.z * noiseScale + frameCount * noiseSpeed) * TWO_PI;
  let nY = noise(p.x * noiseScale + 100, p.y * noiseScale + 100, p.z * noiseScale + frameCount * noiseSpeed) * TWO_PI;
  let nZ = noise(p.x * noiseScale + 200, p.y * noiseScale + 200, p.z * noiseScale + frameCount * noiseSpeed) * TWO_PI;

  // Map Perlin noise to velocity vectors
  let velocity = createVector(cos(nX), sin(nY), cos(nZ));
  velocity.mult(pointSpeed); // Scale the velocity

  // Apply hand interaction force
  let force = p5.Vector.sub(p, hand3D); // Vector pointing from the hand to the point
  let distance = force.mag(); // Calculate the distance to the hand
  let maxDistance = 150; // Set a maximum range for the force field
  if (distance < maxDistance) {
    force.normalize(); // Normalize the force to get the direction
    let strength = map(distance, 0, maxDistance, 1, 0); // Force decreases with distance
    force.mult(strength * forceStrength * -1); // The negative sign flips the vector toward the hand, making this an attraction
    p.add(force); // Apply the force to the point's position
  }

  // Update point position
  p.add(velocity);
}
Over time the particles eventually clump into one place, and I haven’t figured out how to keep the interaction going without interruption. There were also significant performance issues.
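I suspect part of the clumping comes from the cos()/sin() mapping: p5's noise() output clusters around 0.5, so nX and nZ hover near PI and cos() of them is negative most of the time, giving every particle the same slow drift; the attraction force also never pushes points back out once they've been pulled in. One thing I could try (a sketch, not something I've tested) is mapping each noise sample to a zero-mean range instead of running it through cos()/sin():

// Possible drop-in replacement for the velocity calculation above:
// map each noise sample from [0, 1] to [-1, 1] so the field has no
// built-in bias dragging every point in the same direction
let velocity = createVector(
  map(noise(p.x * noiseScale, p.y * noiseScale, p.z * noiseScale + frameCount * noiseSpeed), 0, 1, -1, 1),
  map(noise(p.x * noiseScale + 100, p.y * noiseScale + 100, p.z * noiseScale + frameCount * noiseSpeed), 0, 1, -1, 1),
  map(noise(p.x * noiseScale + 200, p.y * noiseScale + 200, p.z * noiseScale + frameCount * noiseSpeed), 0, 1, -1, 1)
);
velocity.mult(pointSpeed); // Scale the velocity

As for performance, three noise() calls per particle per frame add up quickly, so lowering the grid resolution would probably be the easiest win.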