Process Post 2: Making Artwork with ml5

March 9, 2022

Last week, I made this post about learning how to use ml5js to create interactive artwork. Since then, I found the answers to all my questions, redefined my concept, and successfully completed my first interactive display. 

Here are some of the things I've learned since making my last blogpost:

1. How to make a full screen display 

My friend and classmate Jose showed me how to toggle between a minimized and fullscreen display on p5js. 

This is the full screen code:

function mousePressed() {
  // Toggle fullscreen when the display area is clicked
  if (mouseX > 0 && mouseX < displayWidth && mouseY > 0 && mouseY < displayHeight) {
    let fs = fullscreen();
    fullscreen(!fs);
  }
}

Thanks Jose!

2. How to incorporate audio into my code

This Daniel Shiffman video taught me how to create a looping soundtrack for my code.

My audio file consists of fire crackling over the soft tune of a piano.
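The basic looping pattern in p5.sound boils down to a few lines. Here's a minimal sketch of it (the filename is a placeholder, not my actual file):

```javascript
// Minimal p5.sound looping sketch; 'fire-piano.mp3' is a placeholder filename
let soundtrack;

function preload() {
  // load the audio before setup() runs so it's ready to play
  soundtrack = loadSound('fire-piano.mp3');
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  soundtrack.loop(); // repeats the track for as long as the sketch runs
}
```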



3. How to place customized emitters on key points of the body

I created an emitter for the key points of the PoseNet skeleton. I didn't need emitters for the eyes and ears, so I only made emitters for 13 of the 17 skeleton keypoints.
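In code, skipping the eyes and ears works out to a small filter over PoseNet's keypoint array. This is roughly how I'd express it (the helper name is mine, but the part names follow PoseNet's output format):

```javascript
// Keypoints that don't get emitters; part names follow PoseNet's output format
const SKIPPED = ['leftEye', 'rightEye', 'leftEar', 'rightEar'];

// Hypothetical helper: keep the 13 of 17 keypoints that get emitters
function emitterKeypoints(pose) {
  // pose.keypoints is PoseNet's array of {part, position: {x, y}, score}
  return pose.keypoints.filter(kp => !SKIPPED.includes(kp.part));
}
```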


I want to share some of the notes I made after the first showing of my piece. 

Strengths of Fireflection


a) The presentation was considered and tidy

  • The display was full screen and the code was hidden
  • The lighting setup, audio, and projected display created an immersive viewer experience 
  • The p5js code worked properly and fulfilled its intended purpose

b) My work successfully communicated my concept. Viewers were able to understand and interact with the work on their own without me having to explain it. I found this out during my Digital Interventions 201 class critique.


c) The work engaged viewers. After my critique, some of my classmates danced, jumped around, exercised, and played their own music in front of the display. I also posted my work on my Instagram, and I received positive feedback.


Plans for future iterations of Fireflection

  • Simplify code to make it run faster (it was lagging a little during the first presentation)

  • Reconfigure the emitters so that they are part of an array but individually customizable

  • Make the audio responsive to the viewer. Perhaps the fire crackle sound gets louder when the fire burns brighter and the viewer moves faster.

  • Use more red lights in the gallery to illuminate viewers (this will help PoseNet recognize viewers and prevent glitching; luckily, I don’t need to retrain the model)
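For the emitter-array plan above, here's one possible shape I'm considering (all names and defaults are placeholders, not final code):

```javascript
// Sketch of the planned refactor: one array of emitters,
// each with its own settings (names and defaults are placeholders)
class Emitter {
  constructor(part, options = {}) {
    this.part = part;                          // PoseNet keypoint to follow
    this.sparkCount = options.sparkCount ?? 5; // particles per frame
    this.sparkSize = options.sparkSize ?? 8;   // base particle size
  }
}

const emitters = [
  new Emitter('nose', { sparkCount: 10 }),     // denser sparks at the face
  new Emitter('leftWrist'),
  new Emitter('rightWrist', { sparkSize: 12 }),
];
```

Keeping them in one array means one loop can update and draw everything, while the per-emitter options preserve individual customization.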



The Final Artwork

This piece is titled Fireflection and it generates blazing sparks in response to users' movements.





A group viewing Fireflection


A close up of the Fireflection projection

 
Fireflection - as the name suggests - is a reflection made of fire. 

As in a real mirror, the viewer's body is reflected back almost in real time thanks to a camera system linked to a computer. Fireflection explores the boundary between digital and physical space by using webcam footage to simulate fire, one of the most elemental natural materials. As the digital particles burn, a soft tune and the sound of crackling fire fill the space. The viewer, bathed in red light, becomes an integral part of the work, which transforms from inert surface to constantly renewing portrait. Through the fire mirror, I invite viewers to reflect on perception itself, bringing together contrasting symbols to personify the relationship between human, machine, and nature.



