1. Develop a p5.js sketch with ml5.js, using one or more of the models we’ve covered in the first half of the semester. Here are some ideas to get you started:
2. Document the process of making your p5.js sketch on the ml5.js Project Step 2 Wiki page. In your blog post, include visual documentation such as a recorded screen capture, video, or GIFs of your sketch.

Work Process

Inspiration

I remember coming across the book *Generative Design: Visualize, Program, and Create with JavaScript in p5.js* last year in my Creative Coding class, and a few of the shape-drawing examples from chapter 2 really caught my eye; they seemed intuitive to manipulate with body gesture inputs and machine learning.

[Images: shape-drawing examples from chapter 2 of Generative Design]

I picked the sketch P_2_3_4_01 as a reference for my paint brush sketch because I liked how the strokes seemed to vary with the speed of my mouse, and I thought physics and gesture might go well together.

Building my Paint Sketch

Everything worked well. Following the book, I built a drawing sketch that draws shapes from my external SVG file, scales their angle and size by comparing my previous XY with the current mouseXY (i.e., the mouse speed), starts drawing when I hold down the mouse, and switches brush type with the number keys. Now it's time to swap the mouse inputs for hand inputs.
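Before the swap, here is a minimal, simplified reconstruction of the mouse-speed brush mechanic (not the book's exact code and not my full sketch): the filename brush.svg is a placeholder for my external shape file, and the mapping ranges are rough guesses.

let brush;

function preload() {
  // placeholder path for the external SVG shape file
  brush = loadImage("brush.svg");
}

function setup() {
  createCanvas(648, 648);
  imageMode(CENTER);
  background(255);
}

function draw() {
  // only draw while the mouse is held down
  if (mouseIsPressed) {
    // speed = distance between the previous and current mouse positions
    let speed = dist(pmouseX, pmouseY, mouseX, mouseY);
    // angle of movement, so the shape follows the stroke direction
    let angle = atan2(mouseY - pmouseY, mouseX - pmouseX);
    // faster strokes draw larger shapes
    let size = map(speed, 0, 50, 5, 80, true);

    push();
    translate(mouseX, mouseY);
    rotate(angle);
    image(brush, 0, 0, size, size);
    pop();
  }
}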

Mapping my Hand XY values

The first thing I had to figure out, of course, was remapping the XY values of my index fingertip onto my actual canvas. The input X and Y should both fall in the 0–648 range (my canvas size), but sometimes the values get a little wonky, so for safety I remapped them from the video dimensions to my canvas, passing true as the last argument to map() so the finger XY values are clamped to the canvas width and height.

// Remap handPose keypoints from video coordinates to canvas coordinates.
// The final `true` argument clamps the result to the output range.
fingerTipX = map(hand.index_finger_tip.x, 0, video.width, 0, width, true);
fingerTipY = map(hand.index_finger_tip.y, 0, video.height, 0, height, true);
thumbTipX = map(hand.thumb_tip.x, 0, video.width, 0, width, true);
thumbTipY = map(hand.thumb_tip.y, 0, video.height, 0, height, true);
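For context, that mapping runs once handPose has reported a hand. Here is a minimal version of the surrounding setup, following the standard ml5.js handPose example pattern rather than my exact code:

let handPose;
let video;
let hands = [];

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(648, 648);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // run detection continuously and collect results in the callback
  handPose.detectStart(video, gotHands);
}

function gotHands(results) {
  hands = results;
}

function draw() {
  background(255);
  if (hands.length > 0) {
    let hand = hands[0];
    // the remapping lines above go here, reading hand.index_finger_tip, hand.thumb_tip, etc.
  }
}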

Pinch to Start Drawing

I compare the distance between the tip of my thumb and the tip of my index finger against an arbitrary threshold to determine whether my hand is pinched.

// Pinch detection: distance between thumb tip and index fingertip,
// checked against an arbitrary 30-pixel threshold.
let distance = dist(thumbTipX, thumbTipY, fingerTipX, fingerTipY);
if (distance < 30) {
  console.log("PINCHED!!!");
}
return distance < 30;
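Wrapped into a helper, this check can then gate the brush the same way mouseIsPressed gated the mouse version. In the sketch below, drawBrushStroke() is a placeholder standing in for my existing brush code:

function isPinching() {
  // arbitrary 30-pixel threshold between thumb tip and index fingertip
  let distance = dist(thumbTipX, thumbTipY, fingerTipX, fingerTipY);
  return distance < 30;
}

function draw() {
  // ...update fingerTipX/Y and thumbTipX/Y from handPose first...
  if (isPinching()) {
    // only lay down strokes while the pinch is held
    drawBrushStroke(fingerTipX, fingerTipY);
  }
}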

Training Hand Gestures for Brush Types with Teachable Machine

I used Teachable Machine to recognize various hand gestures that control my brush type and increase or decrease the brush size.

[Video: gesture.mp4]

I later also added a “control” dataset with images of just me with no hand present, and of just my background, so I wouldn’t accidentally trigger events that change my brush type. But Teachable Machine just wouldn’t upload my new changes to the server, so I had to stick with 1, 2, 3, 4, Lower, and Higher as my outputs. The Higher label actually serves as my control in this case, since it consistently detects camera images of me with no hand present with high confidence.
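Roughly, the classification results drive the brush like the sketch below. The model URL is a placeholder, the 0.9 confidence cutoff is an arbitrary assumption, and only the label names (1–4, Lower, Higher) come from my actual training:

let classifier;
let video;
let brushType = 1;
let brushSize = 20;
// placeholder URL for the exported Teachable Machine image model
const modelURL = "https://teachablemachine.withgoogle.com/models/.../";

function preload() {
  classifier = ml5.imageClassifier(modelURL + "model.json");
}

function setup() {
  createCanvas(648, 648);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // classify the webcam feed continuously
  classifier.classifyStart(video, gotGesture);
}

function gotGesture(results) {
  // results are sorted by confidence; ignore uncertain predictions
  let top = results[0];
  if (top.confidence < 0.9) return;

  if (["1", "2", "3", "4"].includes(top.label)) {
    brushType = int(top.label); // switch brush type
  } else if (top.label === "Lower") {
    brushSize = max(5, brushSize - 1); // shrink the brush
  }
  // "Higher" ends up acting as the control / no-hand state, so it changes nothing
}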