Sam Sanderson
CS444
  
This semester I divided my programming efforts between creating various modes of interface for manipulating audio playback and for manipulating video display. Before entering this course I had completed a project in an independent study with Professor Greg Pond that used Nintendo’s “wiimote” as the interface for a synthesizer built in the Pure Data programming environment. The synthesizer is capable of playing several octaves of a pentatonic scale, and it also includes sampling and delay functions. The delay function controls the amount of time that elapses between the first instance of a triggered pitch and its second, delayed occurrence. The sampling function allows the musician to record short, 5-10 second samples for playback.
  
To connect the wiimote to the digital instrument, the Bluetooth signal the wiimote broadcasts first had to be converted into a MIDI signal that can be used in PD. This was accomplished with the program “Osculator”, which interprets the data broadcast by a Bluetooth device and assigns that data a value on a specific MIDI channel. Once the Bluetooth signal has been converted into useful MIDI values, the control object “ctlin” in Pure Data can read the signal from its assigned channel, and that data can then be assigned to various elements of the Pure Data interface. The pitch data from the wiimote’s accelerometer was assigned to control movement up and down the pitch values of the synthesizer, and the roll data was assigned to control the delay function. The wiimote’s buttons were configured to either record or play back samples, and two final buttons were assigned to control the audio output so that the synthesizer could be turned off.

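Since the patch itself is graphical, a rough sketch in code may help show the kind of mapping involved. The Processing sketch below quantizes a 0-127 controller value (the sort of value Osculator would send from the wiimote's pitch data) onto steps of a pentatonic scale; the scale degrees, base note, and two-octave range are assumptions for illustration, not values taken from the actual patch.

<code java>
// Hypothetical mapping sketch: quantize a 0-127 controller value (as Osculator
// would emit from the wiimote's pitch data) onto two octaves of a pentatonic
// scale. The scale degrees and base note are assumptions for illustration.
int[] pentatonic = {0, 2, 4, 7, 9};   // major-pentatonic degrees, in semitones
int baseMidiNote = 48;                // assumed starting pitch (C3)

int controllerToMidiNote(int cc) {
  int steps = 2 * pentatonic.length;                    // two octaves of scale steps
  int step = constrain(cc * steps / 128, 0, steps - 1);
  int octave = step / pentatonic.length;
  int degree = step % pentatonic.length;
  return baseMidiNote + 12 * octave + pentatonic[degree];
}

void setup() {
  // Print the mapping so the quantization can be inspected.
  for (int cc = 0; cc <= 127; cc += 16) {
    println("cc " + cc + " -> MIDI note " + controllerToMidiNote(cc));
  }
}
</code>
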
Working on this project left me with some questions. The wiimote instrument described above can manipulate synthesized notes, but what about pre-recorded .wav files? How could they be incorporated into a live musical interface within Pure Data? In other words, I became interested in building a graphical user interface for the live mixing and manipulation of pre-recorded sound files during playback. My first foray into this area was a program that automatically manipulates the playback of a sound file. The program, pdPI2 (short for Pure Data Performance Interface 2), allows the user to play two pre-recorded sound files and manipulate the left-to-right panning of those files during playback. In addition to controlling the pan from speaker to speaker, the user can trigger a continuous looping function in the program. Once triggered, a five-second sample is recorded from the second sound file and placed in an array. As soon as that sample has been recorded and placed into its array table, a five-second sample from the first sound file is recorded and placed into a separate array; simultaneously, the first sample (from the second file) plays back over the two original files. This creates a delay in the overall sound of the two files, as each file repeats a new sample of itself every five seconds while progressing through playback.

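The looping behavior may be easier to follow as a timeline. The Processing sketch below is a structural illustration only: it performs no audio, the five-second interval and the starting file come from the description above, and the println calls stand in for the table recording and playback done inside the Pure Data patch.

<code java>
// Structural sketch of pdPI2's looping function, for illustration only: every
// five seconds the program finishes recording a sample from one file, starts
// playing it back over the two originals, and begins recording from the other file.
int SAMPLE_MS = 5000;     // sample length, per the description above
int lastSwitch;
int recordingFrom = 2;    // sampling starts with the second sound file

void setup() {
  lastSwitch = millis();
  println("recording five seconds of file " + recordingFrom);
}

void draw() {
  if (millis() - lastSwitch >= SAMPLE_MS) {
    // The finished sample plays back over the mix while recording switches
    // to the other file for the next five seconds.
    println("playing back the sample of file " + recordingFrom + " over the mix");
    recordingFrom = (recordingFrom == 2) ? 1 : 2;
    println("recording five seconds of file " + recordingFrom);
    lastSwitch = millis();
  }
}
</code>
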
While this program was interesting sonically, it did not allow for interactivity comparable to what I achieved with the wiimote, so I began searching the Pure Data libraries and forums for ways to achieve greater interactivity in the playback of sound files. My second patch is a rewrite and modification of Alberto Zin’s Pure Data patch “Tuner”. Zin’s patch uses the grid object from the “unauthorized” library in PD to control the fading in and out of eight distinct sound files. Files 1-4 (and likewise files 5-8) are each assigned a corner of the grid. For example, if the red locator dot is placed in the top left corner of the grid, the only file in the outgoing mix is file 1; bottom left, file 2; bottom right, file 3; top right, file 4. If the dot is placed in the center, all four files are mixed equally. The mix between files 1-4 and files 5-8 is controlled by a horizontal slider below the grid: positioned all the way to the left, only files 1-4 pass through to the final mix; all the way to the right, only files 5-8; in between, the patch sends a mix of all eight files weighted by the position of the slider. Zin’s patch also allows the user to call up new files using PD’s “openpanel” function. While working with Zin’s patch, I found that most of the sound files I tried to use played back at twice the speed of the original file. To correct this, I attached a vertical slider to each of the eight files, which allows me to change the playback speed of each file on the fly. The resulting program allows the user to create interesting and engaging mixes of sound or music.

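I cannot say exactly what fade curves the grid object computes internally, but one weighting scheme consistent with the behavior just described is a bilinear blend of the four corners, crossfaded between the two groups of files by the slider. The Processing sketch below illustrates that scheme as an assumption, not as the unauthorized library's actual math.

<code java>
// One possible weighting for the grid mixer described above (an assumption, not
// the unauthorized library's actual implementation): each of four files owns a
// corner of a unit square, and a slider crossfades between the 1-4 and 5-8 groups.

// gx, gy: position of the locator dot, 0..1 (gx left to right, gy top to bottom).
// slider: 0 = only files 1-4 in the mix, 1 = only files 5-8.
float[] gridGains(float gx, float gy, float slider) {
  float[] corner = {
    (1 - gx) * (1 - gy),  // top left     -> file 1 (or 5)
    (1 - gx) * gy,        // bottom left  -> file 2 (or 6)
    gx * gy,              // bottom right -> file 3 (or 7)
    gx * (1 - gy)         // top right    -> file 4 (or 8)
  };
  float[] gains = new float[8];
  for (int i = 0; i < 4; i++) {
    gains[i]     = corner[i] * (1 - slider);  // group of files 1-4
    gains[i + 4] = corner[i] * slider;        // group of files 5-8
  }
  return gains;
}

void setup() {
  // Dot in the center, slider halfway: all eight files weighted equally.
  float[] g = gridGains(0.5f, 0.5f, 0.5f);
  for (int i = 0; i < g.length; i++) {
    println("file " + (i + 1) + ": " + g[i]);
  }
}
</code>
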
After working with interactivity and interface in relation to live and pre-recorded sound, I became interested in how a similar interactivity could be applied to live and pre-recorded video. This switch from sound to video also required a switch in programming environments, from Pure Data to Processing. My first effort at interactive video was a program that used the “Capture” object of Processing’s video library to juxtapose the feed from the computer’s web-cam with a pre-recorded video. While this satisfied my requirement of using both live and pre-recorded inputs, it offered little real interactivity. Upon discovering Andre Sier’s “flob” library, I started to rethink the ways in which live and recorded video might interact with each other within Processing. Sier’s library performs “fast fill, multi-blob detection”: it tracks solid, moving shapes in the computer’s video feed. For example, if you wave your hand in front of the camera while running a flob program, the program tracks each of your fingers and your palm separately. In my final program for this semester, the number of “blobs” (moving shapes) detected in the video feed is used to trigger an event. By attaching a .mov file to the if statements governing an integer, “gain”, I was able to use the amount of motion in the video feed to trigger the pre-recorded video. Once “gain” reaches 100 or greater, the pre-recorded video is displayed over the computer’s video feed; if “gain” drops below 90, the video stops playing and the feed is visible again. At the same time, a variable “mt” (for movie time) is set to the position of the playback head in the .mov file, which lets the movie start up again where it left off once “gain” reaches 100 again. In contrast to my programs focusing on sound, this program does not allow the user to manipulate files in any manner; instead, it forces the viewer to interact with it if he or she wishes to view the movie file in its entirety.

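A minimal Processing sketch of that trigger logic, under stated assumptions, is below: the blob count is a placeholder for what the flob library would report, the rule by which “gain” rises and falls is a guess, and the movie file name is hypothetical; only the 100/90 thresholds and the “mt” resume behavior come from the description above.

<code java>
// Minimal sketch of the flob-trigger logic described above. The blob count is a
// placeholder for what the flob library would report, the gain update rule is an
// assumption, and "prerecorded.mov" is a hypothetical file name. The 100/90
// thresholds and the "mt" resume variable follow the description above.
import processing.video.*;

Capture cam;
Movie mov;
int gain = 0;                 // rises with motion in the camera feed, decays otherwise
float mt = 0;                 // "movie time": where playback left off
boolean showingMovie = false;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  mov = new Movie(this, "prerecorded.mov");
}

void captureEvent(Capture c) { c.read(); }
void movieEvent(Movie m)     { m.read(); }

// Placeholder for flob's multi-blob detection: returns how many moving shapes
// were found in the current camera frame.
int countBlobs() {
  return 0;
}

void draw() {
  gain = max(0, gain + countBlobs() - 1);   // assumed accumulate-and-decay rule

  if (!showingMovie && gain >= 100) {
    mov.jump(mt);                           // resume where the movie left off
    mov.play();
    showingMovie = true;
  } else if (showingMovie && gain < 90) {
    mt = mov.time();                        // remember the playback position
    mov.pause();
    showingMovie = false;
  }

  if (showingMovie) {
    image(mov, 0, 0, width, height);        // movie displayed over the feed
  } else {
    image(cam, 0, 0, width, height);        // live camera feed
  }
}
</code>
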
Here are several versions of the audio mixers as they evolved.

[[http://arthur.sewanee.edu/sandesr1/IndependentStudy/IndyStudyPD/pdInstruments/stillis.aup.pd|AudioSynthesizer]]
{{picture_8.png}}

[[http://arthur.sewanee.edu/sandesr1/IndependentStudy/IndyStudyPD/pdInstruments/pdPI2.pd|pdPI2]]
{{picture_9.png}}

[[http://arthur.sewanee.edu/sandesr1/IndependentStudy/performanceInterface.pd|PerformanceInterface]]
{{picture_10.png}}

Here is the link to the flob trigger program in Processing.

[[http://arthur.sewanee.edu/sandesr1/IndependentStudy/Processing/FlobTrigger/flobtrigger/flobtrigger.pde|FlobTrigger]]

You'll need to download and install Andre Sier's [[http://s373.net/code/flob/flob.html|Flob library]].