Phase II Updates
Approach: The issues from last week put me somewhat behind my proposed timeline.
Choices Made: I started small by routing the numbers (after they had been scaled to a frequency) into a 'cycle~' object. Instead of my original approach with pure MIDI tones, this gives a broader, continuous range to the incoming data values, partially eliminating the 'jumping' between notes. I also realized that even when nothing was 'playing' on the screen, the patch was still attempting to output a MIDI note -- listening closely, I could hear consistent 'clicks' coming from it, meaning it was still playing even though I had told it not to play until it recognized an object in front of it.

This week I also went back to basics in building the patch and mapped the size of a blob to the amplitude it gives off (I had found this method of getting the size in previous weeks, but I couldn't get it to work until now). I tested this with a growing/shrinking circle whose pixel coverage on the screen mapped 1:1 to the output amplitude, and I threw in a couple of other circles to make sure it still recognized multiple objects on screen.

I also investigated a patch called HS Flow that detects movement between subsequent frames. I want to build on this so that a version of my work only updates/turns on the digital-to-analog converter (dac~) object when it notices movement.

I eventually got color tracking working as well, but its actual use doesn't seem beneficial to me yet. I wouldn't know exactly what to map it to, as color doesn't play a big part in what I am doing. Part of the issue with this patch is that it has to be 'set up' to find certain colors in an area; it doesn't automatically match the colors in a scene. I think it will find its use somewhere down the line, but for now I don't imagine it helping me as much as I thought it would.
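The two mappings described above -- scaled value to a continuous frequency instead of discrete MIDI notes, and blob size to amplitude with silence when nothing is detected -- can be sketched outside Max as plain functions. This is an illustrative sketch, not the actual patch; the frequency range, reference screen size, and function names are my own assumptions:

```javascript
// Map a normalized tracking value (0..1) to a continuous frequency,
// rather than snapping to discrete MIDI notes -- this is what smooths
// the 'jumping' between pitches. 110-880 Hz is an assumed range.
function valueToFrequency(v, lo = 110, hi = 880) {
  const clamped = Math.min(Math.max(v, 0), 1);
  return lo + (hi - lo) * clamped;
}

// Map blob area (in pixels) to amplitude 0..1. When no blob is
// detected (area 0), return 0 so the output stays silent instead of
// clicking. maxArea is an assumed full-frame reference (640x480).
function blobToAmplitude(areaPx, maxArea = 640 * 480) {
  if (areaPx <= 0) return 0;            // nothing on screen: no output
  return Math.min(areaPx / maxArea, 1); // 1:1 with pixel coverage
}

// Example: a mid-range value with a blob covering a quarter of the frame
const freq = valueToFrequency(0.5); // 495 Hz
const amp = blobToAmplitude(76800); // 0.25
```

In the patch itself the equivalent wiring would be a scaling object feeding cycle~'s frequency inlet and a multiplier on its signal for amplitude; the point of the sketch is just the shape of the two mappings.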
Another issue is that lighting conditions differ constantly from setting to setting, so a dimly lit space won't yield the same results as a brightly lit one.

Inspirational Sources:
*Using Color and x/y Position as pitch on/off music tests*
List of Interactive ways to create Instruments
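The HS Flow idea mentioned earlier -- only turning on the dac~ when movement is detected between subsequent frames -- can be approximated with a simple frame difference. A minimal sketch, assuming grayscale frames as flat arrays of 0-255 values and an arbitrary threshold (HS Flow itself computes optical flow, which is more sophisticated than this):

```javascript
// Mean absolute difference between two grayscale frames of equal
// length. A small result means the scene is essentially static.
function frameDifference(prev, curr) {
  let sum = 0;
  for (let i = 0; i < curr.length; i++) {
    sum += Math.abs(curr[i] - prev[i]);
  }
  return sum / curr.length;
}

// Gate: only enable audio output (the dac~ equivalent) when the
// difference crosses an assumed threshold, so a static scene
// produces no sound.
function shouldEnableAudio(prev, curr, threshold = 5) {
  return frameDifference(prev, curr) > threshold;
}

// Example: identical frames keep audio off; a changed frame turns it on
const a = [10, 10, 10, 10];
const b = [10, 10, 10, 10];
const c = [10, 60, 10, 60];
shouldEnableAudio(a, b); // false
shouldEnableAudio(a, c); // true
```

The threshold would need tuning per setting, for the same reason noted above: different lighting conditions change how much pixel values fluctuate even in a static scene.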
During the class discussion, it was mentioned to me:
Questions Raised & Needs:
Next Steps: When I talk to Jessica and Janet, I want to remind them that while I am interested in the idea of these projects becoming useful in the medical field, the work was first created as a tool for the artist, and that's what I'll be focusing on. For the patch, I'll incorporate more steps into the patcher itself regarding blob elongation/direction. I've also been given some Kinect patches from Matt that I can experiment with over the summer. I'll be investigating more current audio/visual readings and examples, and I've begun using p5.js for coding sounds as well.

-Taylor Olsen