This week's prompt centers on ml5.neuralNetwork(): for example, capture body or hand poses to create a custom classifier that drives an interaction. How could regression be applied to modify continuous elements like color gradients or speed?

I remember coming across the book *Generative Design: Visualize, Program, and Create with JavaScript in p5.js* last year in my Creative Coding class, and a few of the shape-drawing examples from chapter 2 really caught my eye; they seemed like they might be intuitive to manipulate with body-gesture inputs and machine learning.
I picked the sketch P_2_3_4_01 as a reference for building my paint brush sketch, since I liked how the strokes varied according to the speed of my mouse, and physics and gesture seemed like they might go very well together.
Everything worked well: following the book, I built a drawing sketch that draws shapes from my external SVG file, scales the angle and size by comparing my previous and current mouse XY positions (i.e., the speed), starts drawing while I hold the mouse down, and changes brush type with the number keys. Now it's time to swap mouse inputs for hand inputs.
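For reference, here is a stripped-down sketch of that speed logic; it's a minimal approximation rather than my actual code, which draws shapes from the book's SVG files. pmouseX/pmouseY are p5's built-in previous-frame mouse coordinates.

let brushType = 1; // current brush, switched with the number keys

function setup() {
  createCanvas(648, 648);
  background(255);
}

function draw() {
  if (mouseIsPressed) {
    // speed = distance traveled since the last frame
    let speed = dist(pmouseX, pmouseY, mouseX, mouseY);
    // direction of travel, used to rotate the mark
    let angle = atan2(mouseY - pmouseY, mouseX - pmouseX);
    push();
    translate(mouseX, mouseY);
    rotate(angle);
    // faster movement draws a bigger mark (a stand-in for the SVG shapes)
    if (brushType === 1) ellipse(0, 0, speed, speed * 0.4);
    else rect(0, 0, speed, speed * 0.4);
    pop();
  }
}

function keyPressed() {
  if (key >= '1' && key <= '4') brushType = int(key);
}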
The first thing I have to figure out, of course, is remapping the XY values of my index fingertip to my actual canvas. The input XY values both have a range of 0-648 (my canvas size), but sometimes they get a little wonky, so for safety I mapped the values to my canvas, passing true as map()'s optional withinBounds argument so my fingertip XY values actually get clamped to the canvas width and height.
// remap hand keypoints from video coordinates to canvas coordinates;
// the trailing true asks map() to constrain (clamp) the output to the target range
fingerTipX = map(hand.index_finger_tip.x, 0, video.width, 0, width, true);
fingerTipY = map(hand.index_finger_tip.y, 0, video.height, 0, height, true);
thumbTipX = map(hand.thumb_tip.x, 0, video.width, 0, width, true);
thumbTipY = map(hand.thumb_tip.y, 0, video.height, 0, height, true);
Next, I compare the distance between my thumb tip and my index fingertip, using an arbitrary threshold to determine whether my hand is pinched.
function isPinched() {
  // distance between thumb tip and index fingertip, in canvas coordinates
  let distance = dist(thumbTipX, thumbTipY, fingerTipX, fingerTipY);
  if (distance < 30) {
    console.log("PINCHED!!!");
  }
  return distance < 30; // 30px is an arbitrary threshold
}
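To actually swap the mouse for the hand, the pinch replaces mouseIsPressed and the remapped fingertip replaces mouseX/mouseY. A rough sketch of that wiring, assuming ml5 v1's handPose detectStart() callback keeps a global hands array up to date; updateFingerTips() is just the four map() calls above wrapped in a helper, and paintAt() is a hypothetical stand-in for my brush code:

let hands = [];

// in setup(): handPose.detectStart(video, gotHands);
function gotHands(results) {
  hands = results; // keep the latest detections around for draw()
}

function draw() {
  if (hands.length > 0) {
    let hand = hands[0];
    updateFingerTips(hand); // the four map() calls from above
    // the pinch stands in for holding the mouse down
    if (isPinched()) {
      paintAt(fingerTipX, fingerTipY); // hypothetical: my actual brush drawing
    }
  }
}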
I used Teachable Machine to recognize various hand gestures, both to control my brush types and to increase or decrease the brush size.
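Loading a Teachable Machine image model in ml5 looks roughly like this; the model URL is a placeholder for the shareable link from the Teachable Machine export, and classifyStart() is the ml5 v1 API (older ml5 versions loop classify() in a callback instead):

let classifier;
let video;
let currentLabel = "";

function preload() {
  // placeholder: paste your own Teachable Machine model link here
  classifier = ml5.imageClassifier("https://teachablemachine.withgoogle.com/models/YOUR_MODEL_ID/model.json");
}

function setup() {
  createCanvas(648, 648);
  video = createCapture(VIDEO);
  video.size(648, 648);
  video.hide();
  classifier.classifyStart(video, gotResult); // classify the webcam feed continuously
}

function gotResult(results) {
  currentLabel = results[0].label; // results are sorted by confidence
}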
I later also added a "control" dataset with images of me with no hand present, and of just my background, so I wouldn't accidentally trigger events that change my brush type. But Teachable Machine just wouldn't upload my new changes to the server, so I had to stick with 1, 2, 3, 4, Lower, and Higher as my outputs. The Higher label actually serves as my control in this case, as it reliably detects camera images of me with no hand present with high confidence.
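Wiring those six labels to the brush could look something like the snippet below; brushType and brushSize are assumed sketch globals, and treating "Higher" as a no-op reflects the control behavior described above.

function applyGesture(label) {
  switch (label) {
    case "1":
    case "2":
    case "3":
    case "4":
      brushType = int(label); // switch brushes, like the number keys did
      break;
    case "Lower":
      brushSize = max(brushSize - 1, 1); // shrink the brush, never below 1
      break;
    case "Higher":
      // effectively the control label (no hand in frame), so do nothing
      break;
  }
}

Calling applyGesture(currentLabel) in draw() keeps the brush in sync with the latest classification.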