
Daily Log and Progress Report

July

July 5-8

Many

Alright, so in the past few days I've put together a series of examples of how to use the UGen framework in Minim to complete several of the assignments for the 276 course. They are posted at the minim_vs_ess page. Mostly these focus on the easiest way to complete each assignment, but they make things a little too easy in some ways and a little too hard in others. I'm working on ways to get a bit closer to the intent of the assignments, though Minim makes some things very easy regardless; I have to salute the developers of the library. I'll be spending some more time working on a delay algorithm, and I'll keep working on the draw stuff project.
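For my own reference, the delay I'm after boils down to a circular buffer. Here's a bare-bones sketch of the algorithm using plain float arrays rather than Minim's UGen classes, so it's an outline of the idea, not finished Minim code.

  // A bare-bones feedback delay line: each output sample is the input
  // plus an attenuated copy of whatever was written delaySamples ago.
  float[] delayBuffer;
  int writeIndex = 0;
  int delaySamples;
  float feedback = 0.4;

  void setupDelay(int sampleRate, float delaySeconds) {
    delaySamples = int(sampleRate * delaySeconds);
    delayBuffer = new float[delaySamples];
  }

  float processSample(float input) {
    float delayed = delayBuffer[writeIndex];               // sample from delaySamples ago
    float output = input + delayed;                        // mix dry and wet
    delayBuffer[writeIndex] = input + delayed * feedback;  // write back with feedback
    writeIndex = (writeIndex + 1) % delaySamples;          // advance the circular buffer
    return output;
  }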
-Nels

July 1

9:00-5:00

So I've put together a test page comparing the Minim pitch shifting with the Ess pitch shifting at the pitchShifting page. It's a mock-up at the moment, though both pieces of code are functional. If that is a good format, I'll keep doing them that way. I've also spent a fair bit of time going over Generative Gestaltung this afternoon. That is a lot of fun, even if it is in German. It's a great examination of some of the more complex uses of the Processing environment and is much more focused on design and technique.
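For context, the crudest way to shift pitch is just to resample: read through the samples at a rate other than 1.0 and interpolate between them (which changes duration along with pitch, and is part of what makes the two libraries interesting to compare). This fragment is the textbook idea, not code from either library.

  // Naive pitch shift by resampling: reading faster raises the pitch,
  // reading slower lowers it (and changes the length of the sound).
  float[] resample(float[] source, float rate) {
    int outLength = int(source.length / rate);
    float[] out = new float[outLength];
    for (int i = 0; i < outLength; i++) {
      float pos = i * rate;                  // fractional read position
      int index = int(pos);
      float frac = pos - index;
      int next = min(index + 1, source.length - 1);
      // linear interpolation between neighbouring samples
      out[i] = source[index] * (1 - frac) + source[next] * frac;
    }
    return out;
  }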
-Nels

June

June 30

9:00-5:00

Today was somewhat productive on the Mercurial front again: out of interest in the process, I've forked a version of the sphereGen/audio generation project from Professor Carl's bitbucket account. I've made some changes that are viewable at bitbucket. The process is very straightforward, and I left some notes about possible source distribution techniques as a message to Professor Carl. When we pick a good approach, I'll write up instructions for that methodology on the Mercurial tutorial page. This afternoon I had a meeting with Greg Pond about a combination of projects, including the setup of a three-dimensional printer (of the Plastruder/MakerBot variety). He also shared with me some of the demonstration pieces from Marius Watz's Shakerag lectures and workshop, so I can peruse some of those. More as it comes.
-Nels

June 29

9:00-5:00

I started on some ideas with the DrawYerself concept, and they are in a zip here. It should have everything it needs to run on the 1.0 version of Processing. It takes advantage of the access that Processing gives you to its frame object, and it gives you a multi-use file dialogue: all the things you need in a little toy graphics program. I'd post a screenshot, but that would take all the fun out of it.
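The core of it is smaller than it sounds; roughly this shape, assuming Processing 1.0's selectInput(), which as I recall hands the chosen path back as a String, plus the frame object Processing exposes when a sketch runs as an application.

  // Rough sketch: retitle the frame and load an image through the file dialogue.
  PImage picture;

  void setup() {
    size(400, 400);
    frame.setTitle("DrawYerself");      // the AWT Frame Processing gives you
  }

  void draw() {
    background(255);
    if (picture != null) image(picture, 0, 0);
  }

  void keyPressed() {
    if (key == 'o') {
      String path = selectInput();      // multi-use file dialogue (Processing 1.x style)
      if (path != null) picture = loadImage(path);
    }
  }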
-Nels

June 28

9:00-5:00

I've got a few things finished up using the Minim library as a replacement for Ess, particularly audio playback and sample manipulation. I'm not entirely sold on the effect methods used by Minim: when an effect is applied, it is only applied against the current buffer, which makes it a little awkward to get at the whole track. I'm still working on ways around that, but I've made an amusing video of the results here. There isn't any audio, but the effect applied is a normalization filter. When it is applied you can see a definite change in the activity of the visualizer; the zip of this sketch is here.
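To make the complaint concrete, here's what a custom effect looks like as I understand Minim's AudioEffect interface (signatures from memory, so check the docs): process() is handed one buffer at a time and never the whole track, which is exactly why real normalization is awkward. The file name is a placeholder.

  import ddf.minim.*;

  Minim minim;
  AudioPlayer player;

  void setup() {
    size(200, 200);
    minim = new Minim(this);
    player = minim.loadFile("groove.mp3", 1024);   // placeholder file in the data folder
    player.addEffect(new HalfGain());              // the effect only ever sees one buffer
    player.play();
  }

  void draw() {
    background(0);
  }

  // A trivial effect that halves the volume of whatever buffer it is handed.
  class HalfGain implements AudioEffect {
    public void process(float[] samp) {
      for (int i = 0; i < samp.length; i++) samp[i] *= 0.5;
    }
    public void process(float[] sampL, float[] sampR) {
      process(sampL);
      process(sampR);
    }
  }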
-Nels

June 21-23

Many irregularly spaced intervals

I've been a little out of routine the past couple of days, but I've got a few bits and pieces that I think are worth sharing. Since I'm now looking at swapping Ess for Minim in the audio examples for the 276 class, I've started a couple of projects that use Minim. The first is an example of loading files, playing them, and generating visualizers. The sketch can be found here. It actually reuses some of the techniques I used to build the animations of stock data in the videos posted below. I'll post more as it comes.
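The skeleton of that first sketch is roughly this (the file name is a placeholder for whatever sits in the data folder):

  import ddf.minim.*;

  Minim minim;
  AudioPlayer player;

  void setup() {
    size(512, 200);
    minim = new Minim(this);
    player = minim.loadFile("groove.mp3", 512);   // placeholder file name
    player.play();
  }

  void draw() {
    background(0);
    stroke(255);
    // draw the left channel of the current buffer as a simple waveform
    for (int i = 0; i < player.bufferSize() - 1; i++) {
      float y1 = map(player.left.get(i),     -1, 1, height, 0);
      float y2 = map(player.left.get(i + 1), -1, 1, height, 0);
      line(i, y1, i + 1, y2);
    }
  }

  void stop() {
    player.close();
    minim.stop();
    super.stop();
  }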


-Nels

June 18

10:00-4:00

I've got a system down that I like for generating animations of Processing sketches. There is an AVI encoding method that simply uses JPEG frames, and the rendering tool I have for Windows is working well. My other thought is that the QuickTime tools available in Processing may work better once the process is simplified. I'll write up a sketch this weekend to handle generating QuickTime movies from stored images; hopefully it will work a bit better than the real-time capture stuff we did with the webcams. I've got another video of RIMM stock data here.
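For the record, the frame-dumping half of the pipeline is just saveFrame() with a numbered pattern; the #### becomes the frame number and the resulting JPEGs are what the encoder stitches together.

  // Minimal example of the frame-dumping step: saveFrame() writes a numbered
  // JPEG every time draw() runs, and the frames/ folder is what gets fed to
  // the AVI encoder afterwards.
  void setup() {
    size(320, 240);
  }

  void draw() {
    background(0);
    fill(255);
    ellipse(frameCount % width, height / 2, 20, 20);   // stand-in animation
    saveFrame("frames/frame-####.jpg");                // #### becomes the frame number
  }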
-Nels

June 17

9:00-5:00

In order to better display the work I did relating to visualizations of large data sets, I've posted a video to YouTube. Since their server has to preprocess the footage before general release, I'll have to wait to link it, but it's worth it (maybe). I managed to get good video out of Processing by using the saveFrame() function and then parsing the images into a single video file. It took about 4 hours to render the frames, with 2 hours spent postprocessing the images into an AVI file. With any luck it will work on YouTube, or I'll have to postprocess again with a different container. Anyway, the tool I use is Windows-centric, which makes me a bad person, but it's simple and effective. It is called JPGVideo, and I have no idea where/when/how I found it. I assume I got it when I was doing video in Processing in November, but I'm not entirely sure. Since I got it from Google (more than likely) you can too, and I'm sure it's not the only one of its kind. Anyway, I'll link the video as soon as it becomes available.
-Nels
– Update
It's here

June 16

9:00-5:00

After seeing Marius Watz's lecture last night at Shakerag, I decided that it would be both fun and potentially useful to examine the process of using large data sets as a seed for large three-dimensional visual systems. I pulled a large CSV sheet off of Yahoo's trading data site (specifically I used RIMM, but that's not really relevant since none of the data was analyzed with a specific purpose beyond the aesthetics). I've hosted the images I generated separately here. I also completed a much more functional version of the Drum Machine concept over at jmusic. I solved the issue of only one instrument per beat, and it now works rather well. I think I'd like to expand the functionality a little more and allow the end user to select which instruments are on which row, but we'll see about that. I think I'll also start exploring the controlP5 library so that I don't have to spend as much time handling my control elements; writing a graphic slider is fun, but it takes time. I'm planning to incorporate some of these graphic elements into a library as an exercise sometime soon. When that goes up I'll make sure a note is made here. As a basic list, I'll be incorporating a random color palette selector (so a user can supply the palette as a PImage), and I'll include my buttons, sliders, and things like that. The plan right now is to build it all with ant, so I'll also take the opportunity to learn a bit about setting up XML build files. That's it for today.
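The data-wrangling part is pretty short in Processing; something like this, where I'm assuming Yahoo's usual date,open,high,low,close,volume column order, so the column index may need adjusting.

  // Load a CSV of daily quotes and plot the closing prices as a line.
  float[] closes;

  void setup() {
    size(800, 300);
    String[] rows = loadStrings("rimm.csv");       // placeholder file name
    closes = new float[rows.length - 1];           // skip the header row
    for (int i = 1; i < rows.length; i++) {
      String[] cols = split(rows[i], ',');
      closes[i - 1] = float(cols[4]);              // assuming close is the fifth column
    }
  }

  void draw() {
    background(255);
    stroke(0);
    for (int i = 0; i < closes.length - 1; i++) {
      float x1 = map(i,     0, closes.length - 1, 0, width);
      float x2 = map(i + 1, 0, closes.length - 1, 0, width);
      float y1 = map(closes[i],     min(closes), max(closes), height, 0);
      float y2 = map(closes[i + 1], min(closes), max(closes), height, 0);
      line(x1, y1, x2, y2);
    }
  }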
-Nels

June 15

12:00-5:00

I spent most of the afternoon working on how to put together a drum machine in Processing using SoundCipher as the playback engine. So far it looks promising, but there are some issues with using the same SoundCipher object to handle playback without using the SCScore system of organization. I'm going to build a version using the SCScore class. I tried using multiple SoundCipher objects to handle the playback, but that caused a fault with the MIDI device on my machine. The current best version is posted over at jmusic.
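The timing core of the drum machine doesn't depend on the playback library at all, so here's a stripped-down version of the step-advance logic. The playNote() call is commented out and its (pitch, dynamic, duration) signature is from memory, so treat that line as a sketch.

  // A 16-step, 4-row grid that advances one column every 150 ms.
  boolean[][] grid = new boolean[4][16];
  int[] pitches = { 36, 38, 42, 46 };     // kick, snare, closed hat, open hat (GM drums)
  int step = 0;
  int lastStepTime = 0;

  void setup() {
    size(320, 80);
    for (int i = 0; i < 16; i += 4) grid[0][i] = true;   // a steady pulse on row 0
  }

  void draw() {
    // advance one step every 150 ms and trigger the active cells
    if (millis() - lastStepTime > 150) {
      lastStepTime = millis();
      for (int row = 0; row < 4; row++) {
        if (grid[row][step]) {
          // sc.playNote(pitches[row], 100, 0.25);   // SoundCipher call, signature from memory
        }
      }
      step = (step + 1) % 16;
    }
    // draw the grid with the current column highlighted
    background(255);
    for (int row = 0; row < 4; row++) {
      for (int col = 0; col < 16; col++) {
        fill(grid[row][col] ? 0 : 230);
        if (col == step) stroke(255, 0, 0); else stroke(0);
        rect(col * 20, row * 20, 20, 20);
      }
    }
  }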
-Nels

June 11

9:00-4:00

We've made a shift, but SoundCipher seems to be pretty good, especially when paired with oscP5. More to come.
-Nels

June 10

9:00-5:00

This was another day working between jMusic and SoundCipher. I'm starting to get a feel for how to manage both of them, and I found the golden method for handling playback from inside the draw() method in Processing: the playNote/playChord pair provided by SoundCipher. There is an example of this over at jMusic. The oscP5 library also makes it easy to manage data from multiple sources, and multiple types of data can be transmitted. I'm going to remove all of the audio generation tools from the Grid Music application and set up a general MIDI receiver, so that several tools can access playback at once. I'm also looking at implementing the same playNote and playChord methods from SoundCipher with the tools provided in jMusic. This may require updating the way jMusic interacts with the JavaSound API. My suspicion is that the “deprecated API” warning I'm receiving when I build jMusic is caused by deprecations in JavaSound. If I can track down that error and confirm it, I'll note it more explicitly. More later.
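As a sketch of the receiving side, assuming oscP5's usual event callback and, again, a playNote(pitch, dynamic, duration) signature written from memory (the package name in the import is also from memory):

  import oscP5.*;
  import arb.soundcipher.*;     // SoundCipher's package, if I have it right

  OscP5 osc;
  SoundCipher sc;

  void setup() {
    size(200, 200);
    osc = new OscP5(this, 12000);      // listen for OSC messages on port 12000
    sc = new SoundCipher(this);
  }

  void draw() {
    background(0);                     // nothing to see; playback happens in oscEvent()
  }

  // oscP5 calls this for every incoming message
  void oscEvent(OscMessage msg) {
    if (msg.checkAddrPattern("/note")) {
      int pitch = msg.get(0).intValue();
      sc.playNote(pitch, 100, 0.5);    // pitch, dynamic, duration (beats)
    }
  }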
-Nels

June 9

9:00-5:00

I've narrowed down the problem areas with jMusic a bit further. It still isn't working right, but the number of places where I know it isn't working right has gotten smaller. With that in mind I've started working with SoundCipher. SoundCipher is a MIDI composition library written by Andrew Brown specifically for Processing. The tools available in SoundCipher are roughly comparable to those in jMusic, and the organization of the library is similar but overall more concise. I have some example sketches posted: RandWalker and CipherToy. I'll post more as it comes.
-Nels

June 8

9:00-5:00

In the interests of figuring out why the GridMusic application is such a memory hog, I've set it up in objectdraw. The sad thing is that it is no better; in fact the performance is even more degraded and erratic. I'm going to continue troubleshooting the Java version, because there are still a few things not quite right with the graphic organization, and it's making things a little funky in the MIDI sequencing. I've also been going through the whole project and commenting out print statements. In Processing, the System prints from jMusic go nowhere; on the console they take over. Now that those are mostly gone, I've tried a few different startup configurations with the thread stack size, but it hasn't made a noticeable difference. I'm going to finish ironing out the graphical kinks, then start afresh mañana.
-Nels

June 7

9:00-5:00

I've got a much better version of my MIDI grid posted, and it's starting to take shape a little. I did include a little easter egg add-on to the visual component; I just couldn't resist the joy of keypress functions. When you press 'r', a randomly selected 10 percent of the available notes become active. It's nowhere near as linear or as good to listen to as the constrained random walk, but it does make for an interesting listen. I've posted some other updates at the jmusic page, so check there. I've also set up revision control on the GridMusic application and it is available at http://bitbucket.org/cleverwhorl/gridmusic

-Nels

An example of the random generation function
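Roughly, the function clears the grid and then switches cells on until a tenth of them are active; something like this, assuming the notes live in a boolean grid (the grid size here is just illustrative):

  boolean[][] noteGrid = new boolean[8][16];

  void draw() {
    // normal grid drawing and sequencing happens here
  }

  // Activate a randomly selected 10 percent of the notes in the grid.
  void randomizeGrid(boolean[][] grid) {
    int rows = grid.length;
    int cols = grid[0].length;
    // start from a clean slate
    for (int r = 0; r < rows; r++) {
      for (int c = 0; c < cols; c++) grid[r][c] = false;
    }
    // then turn cells on until a tenth of them are active
    int target = (rows * cols) / 10;
    int active = 0;
    while (active < target) {
      int r = int(random(rows));
      int c = int(random(cols));
      if (!grid[r][c]) {
        grid[r][c] = true;
        active++;
      }
    }
  }

  void keyPressed() {
    if (key == 'r') randomizeGrid(noteGrid);
  }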

June 6

I've posted an updated algomuse build at the Google Code repository, and I've included the console output from an ant build with xlint enabled. It is not very interesting to read, but most of the warnings are caused by the absence of any adherence to the Java generics update in Java 1.5. Other warnings tend to be related to redundant typecasts and unchecked operations that can be avoided by using Java generics. I'm going to see if this still works in Processing if I add in the parts to get generics up and running, since I know you can't use them in Processing. Other than that I'm still playing with how things are put together in MIDI formats, and I think as a side goal I'll be examining the internal structure of MIDI files.
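For the curious, the warnings mostly boil down to pre-1.5 collection usage. This isn't lifted from the algomuse source; it's just the general pattern xlint complains about and the generics version that silences it.

  // Not from the algomuse source, just the general pattern xlint flags.
  void oldStyle() {
    // Pre-1.5 collections: add() is an "unchecked" operation and the
    // cast coming back out is the kind of redundant cast xlint reports.
    java.util.Vector notes = new java.util.Vector();
    notes.add(new Integer(60));
    int pitch = ((Integer) notes.get(0)).intValue();
  }

  void genericsStyle() {
    // With generics the element type is checked at compile time,
    // so the warning and the explicit cast both disappear.
    java.util.Vector<Integer> notes = new java.util.Vector<Integer>();
    notes.add(60);                 // autoboxed to Integer
    int pitch = notes.get(0);      // no cast needed
  }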

-Nels

June 4

9:00-5:00

This is the result of the afternoon's work on an overlay MIDI rendering utility. It takes advantage of the JMC since it's there. I'll post more as I get more. -Nels

June 3

9:00-6:00

Today I managed this! I'm intending it to be an example of practical interaction between jMusic and Processing. I have a mashup between Minim and a Conway-style game of life cellular automata thing, but it sucks and I'd just as soon throw it away and start afresh. I'll post more later. It's time to start cooking dinner.
-Nels

June 2

9:00-5:00

I finally got the jMusic source set up in my directory structure so I can build the jar myself! Yay. I've also switched operating systems for a while: I'm switching back and forth between Ubuntu Studio Lucid Lynx and OS X. I've found that Processing for Linux doesn't have the best interface, so I'll probably keep using OS X for that. I've set up a source code repository at googlecode and it's linked in pretty well now. I may have started on that a bit early, but with how often I change machines, I think it will be useful to be able to get at my recent stuff more easily. The current project title is algomuse, since our end goal is an algorithmic music composition library. I've taken a look at the dev.processing page on libraries, and I'll set up a formatted download that matches their specifications as part of the code hosting. As I get examples working I'll start including them as well.

I'm also debating a bit on the jMusic library itself. I think I finally understand the purpose of the rendering process that is making it difficult to handle the playback thread with the PApplet. Since all of the tasks related to the play thread must complete before draw() is called, I'm not sure how best to manage the fix so Processing runs correctly. The whole library is geared towards handling a whole bunch of setup ahead of time and then just running, whereas in Processing we want to be able to get at the playback objects over time. My first thought is that I want to either

  • set playback (the call to the Play object) up as an event or thread that is called after draw() initializes, or
  • incorporate the functionality of the play thread into PApplet, or vice versa.

That may not be exactly how I want to phrase my intentions, but I didn't have great luck the last time I tried to get threads to play nice inside Processing, when I attempted it with JOGL. I'm still refining the details as I go through the jMusic source, so we'll see where this goes.
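The first option would look something like this from the Processing side; a rough sketch only, built with the usual jMusic Score/Part/Phrase classes and the boolean on Play.midi() that keeps the Play thread from killing the sketch when playback finishes.

  import jm.JMC;
  import jm.music.data.*;
  import jm.util.Play;

  Score score;

  void setup() {
    size(200, 200);
    // build a trivial score: one phrase of quarter-note middle Cs
    Phrase phrase = new Phrase();
    for (int i = 0; i < 8; i++) phrase.addNote(new Note(JMC.C4, JMC.CROTCHET));
    Part part = new Part();
    part.addPhrase(phrase);
    score = new Score();
    score.addPart(part);

    // hand playback to its own thread so draw() keeps running;
    // the false tells Play not to exit when playback completes
    new Thread(new Runnable() {
      public void run() {
        Play.midi(score, false);
      }
    }).start();
  }

  void draw() {
    background(frameCount % 255);   // proof that the animation thread is still alive
  }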
-Nels

June 1

9:00-5:00

Command line kung fu is a somewhat underappreciated discipline, and today I was reminded that I need to exercise mine more often. Between setting up a Mercurial repository in a Windows environment, an Ubuntu Linux environment, and OS X, I feel like I've gotten a fair workout on the command line. There is a fairly minimal instruction set that outlines my basic learning path with Mercurial here. I've tried to be complete and post my readings on that page. I found the materials provided by bitbucket and at Mercurial's own website to be the most helpful. I can't emphasize enough that step one in learning revision control is learning why you want to do it; otherwise it tends to be a bit like voodoo at first (ok, it's kind of voodoo anyway). After working with sphereGen this past semester, and after trying to navigate my sketchbook from the open file dialogue in Processing, I know I need to find a better way to manage my source code. I'm committing myself to incorporating revision control into my workflow. The Mercurial installation is by far the easiest part, and managing local revision work is very intuitive. I'm still not a big fan of TortoiseHg, but it works and installs well on Windows XP. I haven't seen a good graphical version for OS X or Ubuntu, but that doesn't mean there isn't one.

On a lighter, more research-based note, I've started delving deeper into the source code behind jMusic. It is a bit baffling how much data storage overhead is incorporated into the class structure for that library. In doing that I did find some resolutions to the problems that I've been having with jMusic in Processing. The basic Play class starts its own thread and manages it according to an internal time frame, which conflicts with Processing's thread. It's not really a bad thing, but it means that a call to play should come after the canvas has been initialized. The other problem I'd been encountering was that when playback completed, the sketch would close. This is fixed with a boolean option to Play.audio or Play.midi that tells the Play class whether or not you want the Play thread to terminate the parent thread when playback completes. In the process of finding all of this out I also found an interesting forum thread about the intended end use of jMusic and jmEtude here. I'm still pondering the last post of the thread, but I think I'm beginning to see the difference between Processing and jMusic. Processing is all about real-time interaction, whereas jMusic is more about an initial setup and then a complete terminal run without a lot of meaningful interaction from other interfaces. I don't see this as a bad thing, but it certainly makes communication between a running PApplet thread and a running instance of Play more difficult. I'm thinking that I'm going to put my copy of the jMusic library under revision control and start playing with the source some. I'd like to change some of the things that make it harder to use jMusic with Processing before I start wrapping up the class any further. Not only that, but it'll be good practice :-P

May

May 28

9:00-5:00

So today was a break from the Arduino project, and I've started in on the jMusic library. So far I'm still working out the kinks of using the library from inside the Processing IDE, but for the most part jMusic has been cooperative. There are certainly a few quirks that will have to be dealt with before it will be useful as a library. The sketches are posted over at jmusic, and I'll post the same warning there that I post here: because of the issues between the PApplet and the thread attached to the Play class in jMusic, if you start the applet, the only (convenient) way to stop the playback is to close out your browser at the application level. That's one of the kinks I'd like to work out sooner rather than later. Other than that, I have found that there is a considerable amount of bulk attached to jMusic. Talk about a lot of stuff: there are facilities for virtually all aspects of music composition/playback/management/etc. I'm going to have to sit down and figure out dependencies so I can strip out the things I don't need. I think my baseline version would ideally contain the facilities to manage MIDI playback and just the components for composition, allowing me to strip out all of the graphics objects and some of the extraneous classes. From there I think it will get a lot easier to figure out how to make jMusic playback “play nice” inside PApplet. Either way, I have to say that this library is very intuitive for MIDI composition. I'll have to play with jmEtude to see how it compares. -Nels

May 27

9:00-5:00

This will, I hope, be the longest I spend on interfacing Processing and Arduino during this project. The issues I've tried to address today are posted over on the bug log. It may just be me being overly picky, but it's causing issues when the Arduino has to handle multiple messages across multiple serial lines. It has started to drop characters when pushed too hard, which so far has been all the time, so the commands being sent to the VDIP1 are getting garbled. I switched to this evented style this afternoon to try to solve the problems I had with relying on the available() function to determine the contents of the buffer, without much success. This also makes moot, for the day, the solutions I was considering for handling data on the Arduino. I think the problems may go away as I start doing more on the Arduino side and reduce my reliance on sending messages via the serial connection. With that in mind, I think I will just set up the Arduino to write a single value to disk in a loop. Baby steps and all that ^_^ -Nels

May 26

9:00-5:00

Alright, so things are going well over at the Arduino page, and I finally got a circuit diagram posted for connecting the VDIP1 and the Arduino.

Feel free to skip this:
<Digress>
Another interesting side note is the name of the firmware we are using as the master controller for the USB flash drives. The name Vinculum is of Latin origin and in this case would easily translate to chain, which I find oddly appropriate for the context in which we are working, as this device literally chains together a controller and a slave storage device.
</Digress>

Moving along with the spirit of the Arduino project, the next step from using the Arduino as an interpreter is to start writing programs that will do the work without a user to talk to. With that in mind, the next component will be written in both Arduino and Processing, so that the Arduino has an external source of data to collect and write to the newly available flash drive. The best thing to do will be to store a log on the Processing side and use that as a comparison against the data collected by the Arduino and stored to the flash disk. This has been made much easier by the creators of the NewSoftSerial library for Arduino, and I owe them a debt of gratitude. You can find more of their work here. Looking forward a bit, it seems that the Arduino project will be slowing pace after this test, to make room for jMusic. With that in mind, I'll be starting on Making Music with Java by Andrew R. Brown. That should provide the background to start getting into the classes provided by jMusic for algorithmic music composition. The goal is to get those classes and the support structure they need fully encapsulated as a Processing library. That also means I'd like to extend the syntax highlighting for the library and produce some fairly decent documentation. So that's the stuff in the pipes for the rest of the week. -Nels
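The Processing half of that test can stay tiny: push a value out the serial port on an interval and write the same value to a local log for later comparison. Something along these lines, with the port index and baud rate obviously machine-specific.

  import processing.serial.*;

  Serial arduino;
  PrintWriter log;
  int lastSend = 0;

  void setup() {
    size(200, 200);
    println(Serial.list());                          // find the right port index first
    arduino = new Serial(this, Serial.list()[0], 9600);
    log = createWriter("sent-values.txt");           // the comparison log
  }

  void draw() {
    if (millis() - lastSend > 1000) {
      lastSend = millis();
      int value = int(random(1024));                 // stand-in for real data
      arduino.write(value + "\n");                   // same value goes to the Arduino...
      log.println(millis() + "," + value);           // ...and to the local log
      log.flush();
    }
  }

  void keyPressed() {
    log.close();                                     // tidy up before quitting
    exit();
  }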

May 25

9:00-5:00

I finally got the basic proof of concept worked out for the Arduino flash drive. The code for the REPL is posted here. The next step is, of course, to make the steps modular so that data collected on the Arduino side can be written to a log file. With that in mind, I'm going to attach an analogue sensor to the 'nove soon so that I can get an idea of what data collection looks like. Don't judge me too hard on the aesthetics of the REPL; the VDIP1 doesn't return newlines, so it can get to be a real mess.
-Nels

May 24

10:00-5:00

So this was another day of banging against the Arduino project. So far I am a little miffed at the difficulties I'm having. The circuit is/should be complete, and the Vinculum chip is receiving power. However, I'm having some difficulty getting the Arduino to correctly signal the Vinculum. The rub lies in the fact that the Arduino has a single serial pair for transmission/reception. More can be added and managed by additional libraries, but their functionality is a bit limited. I've found some other code examples using similar hardware here. I intend to further explore the NewSoftSerial library tomorrow as part of this project, in order to debug the results I'm getting without resorting to walking down the hall and borrowing an oscilloscope. By using multiple serial ports, I'm also hoping, as a proof of concept, to be able to directly control the Vinculum via the Arduino serial interface. So instead of hardcoding commands for the Vinculum onto the Arduino chip, I can just use the Arduino as an interpreter for commands sent from the serial monitor running on the Mac. The result should be similar to a very user-unfriendly terminal. The initial goal is to be able to read error messages as they are passed from the Vinculum. The full version will encapsulate the commands in an internal structure for managing sensor data, but I think that initially the goal should be full communication. Here's hoping…

May 21

9:00-5:00

This was much more of a head-scratcher kind of day. This morning I completed a third version of a random music generator based on the pentatonic scale. It is much more appealing to listen to than the earlier versions; still not good music, but not bad for pseudo-random number generation. The program maps a range of random values into six ranges partitioned based on the upper bound of the random number generator. These six ranges correspond to the A C D E G pentatonic scale, and the map includes the octave. The early problems with the Minim sound library are starting to disappear. They appear to be related to the rate at which the frequency of an oscillator is changed. In the initial versions it was possible for the duration of the sound to be set to zero, which caused problems with audio playback. Since a sine function is used to convert the positive values fed to the pentatonic mapper into the balanced range -1:1, the range of durations generated was shifted so that it still constitutes a full period of a sine wave, keeping the probability even but eliminating the possibility of sending a zero value as the duration of the sound.
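Spelled out, the mapping stage looks something like this; a simplified version of the idea with the pitches written as MIDI numbers, not the actual sketch.

  // Map a random value into one of six equal ranges: five pick a note
  // from the A C D E G pentatonic scale, the sixth bumps the octave.
  int upperBound = 600;
  int[] scale = { 57, 60, 62, 64, 67 };   // A3 C4 D4 E4 G4 as MIDI pitches
  int octaveShift = 0;

  int nextPitch() {
    float r = random(upperBound);
    int bin = int(r / (upperBound / 6.0));       // which of the six ranges
    if (bin >= 5) {
      octaveShift = (octaveShift + 12) % 24;     // the sixth range toggles the octave
      bin = int(random(5));                      // and still picks a note
    }
    return scale[bin] + octaveShift;
  }

  // Durations come from a sine of the random stream, shifted so a full
  // period is still covered but a zero-length note can never happen.
  float nextDuration() {
    float phase = random(TWO_PI);
    return 0.1 + abs(sin(phase)) * 0.9;          // always at least 0.1 of a beat
  }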

The afternoon turned back to the Arduino project. It appears that the process of actually connecting the VDIP1 USB device to the Arduino is not going to be as difficult as I thought, but it took a couple of hours perusing the documentation for both the Arduino board and the VDIP1 to get all of the right information about how to make it work. The common ground for both chips is the serial UART standard, which stands for Universal Asynchronous Receiver/Transmitter. This standard will allow direct communication between the two devices. The VDIP1 also supports a binary mode of access, which will reduce the load on the Arduino in the final version. So the project listed on the Arduino forum which describes this process makes a world more sense.

May 20

10:00-5:00

Today I got oriented! I'm starting on Arduino, and have so far tested two variants, the Duemilanove and the Diecimila. Both are cooperating with the Arduino development environment in OS X, with further tests to be pursued in other operating systems (but on the same hardware) later. I also examined the Processing serial interface to allow direct communication between Processing and the micro-controller. This afternoon also led to a slight detour into examining fractal music, which resulted in a pair of Processing sketches on random noise. I spent a fair amount of time crawling through the Arduino playground reading about several projects that utilize the VDIP1 USB device. It appears to have a simple setup, hasn't changed substantially since 2008, and is apparently stable. My understanding of the process is that the Arduino chip sends a serial message to the VDIP1, and the controller onboard the VDIP1 interprets the message. Since the VDIP1 handles the bulk of file I/O, it should take some of the work out of managing the file system. The command structure for the VDIP1 is well documented, and I believe (don't hold me to this) it works with the FAT16 file system. More mañana.
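For completeness, the Processing side of that direct communication is just the Serial library; a minimal listener that echoes whatever the board sends, with the port index and baud rate being whatever the actual setup uses.

  import processing.serial.*;

  Serial port;

  void setup() {
    size(200, 200);
    println(Serial.list());                        // list the available serial ports
    port = new Serial(this, Serial.list()[0], 9600);
  }

  void draw() {
    while (port.available() > 0) {
      int inByte = port.read();                    // echo whatever the Arduino sends
      print(char(inByte));
    }
  }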
-Nels
