Collaborators: Carlie (Yutong) Zhang and Anastasis Genmanidis

Above: Sketches made in Processing and later translated for OpenFrameworks. These gestures were based on our original hand-drawn sketches; currently, they are being drawn at the 'discretion' of the computer.

Above: Anastasis controlling the X-Carve with Python commands through Terminal.
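A minimal sketch of what driving the X-Carve from Terminal can look like. The X-Carve's controller accepts plain G-code over a serial connection, so a Python script only needs to build and stream move commands. The port name, feed rate, and pen Z-heights below are illustrative assumptions, not values from our setup.

```python
# Hypothetical sketch of scripting an X-Carve-style drawing machine:
# build G-code strings for pen strokes, then stream them over serial.

def move_to(x, y, feed=1000):
    """Build a G1 (linear move) command for the given XY position (mm)."""
    return f"G1 X{x:.2f} Y{y:.2f} F{feed}"

def pen(down, z_down=-1.0, z_up=3.0):
    """Build a Z move that lowers or lifts the pen (assumed heights)."""
    return f"G1 Z{(z_down if down else z_up):.2f}"

def stroke(points):
    """Translate a polyline (list of XY tuples) into one pen stroke:
    travel to the start, drop the pen, draw each segment, lift."""
    cmds = [move_to(*points[0]), pen(True)]
    cmds += [move_to(x, y) for x, y in points[1:]]
    cmds.append(pen(False))
    return cmds

if __name__ == "__main__":
    # Actually sending the commands would use pyserial, roughly:
    #   import serial
    #   port = serial.Serial("/dev/tty.usbmodem1411", 115200)  # assumed port
    #   for cmd in stroke([(0, 0), (40, 0), (40, 40)]):
    #       port.write((cmd + "\n").encode())
    for cmd in stroke([(0, 0), (40, 0), (40, 40)]):
        print(cmd)
```

Keeping the G-code generation as pure functions like this makes it easy to preview a drawing's command stream in Terminal before committing pen to paper.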
Above: (left to right) positioning tests with the robot, example code from Processing, and the polystyrene sheeting that cracked under the stress of the drawing machine (it was later replaced with vinyl, which gave the pen the proper amount of spring).
Above: Sketches made by the robot through Terminal, with more control.

Above: (left to right) Rhino model of the pen holder, the fabricated pen holder with 3D-printed and lasercut parts, and the pen holder drawing (showing the spring mechanism from the side).
Left: Facial recognition running on the camera feed in OpenFrameworks. The camera recognizes the drawing area via the QR-style marker in the bottom right corner of the drawing. We used ofxAruco and ofxFaceTracker to recognize the face and the drawing plane.

Above: Bot running smoooothly.
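The marker in the corner is what lets the camera and the machine agree on coordinates: once the drawing area is located in the image, face positions can be remapped from camera pixels into drawing-plane units. This is a simplified illustration of that mapping, not our OpenFrameworks code; the pixel rectangle and plane dimensions are made-up example values.

```python
# Illustrative sketch: remap a point from the detected drawing-area
# rectangle in camera pixels onto the physical drawing plane (mm).

def remap(value, in_min, in_max, out_min, out_max):
    """Linearly remap a value from one range to another."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def camera_to_plane(px, py, cam_rect, plane_size):
    """Map a camera pixel (px, py) inside the detected drawing-area
    rectangle cam_rect = (x, y, w, h) onto a drawing plane of
    plane_size = (width_mm, height_mm)."""
    x, y, w, h = cam_rect
    return (remap(px, x, x + w, 0, plane_size[0]),
            remap(py, y, y + h, 0, plane_size[1]))

# Example: a tracked face center at pixel (320, 240), drawing area
# detected at (100, 60) with size 440x360 px, 300x300 mm plane.
print(camera_to_plane(320, 240, (100, 60, 440, 360), (300, 300)))
```

In the real pipeline the marker also corrects for perspective (a full homography rather than this axis-aligned remap), but the idea is the same: the marker anchors camera space to machine space.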