This week’s discoveries

- If you haven't read it yet, this interview with Bartholomäus Traubeck, who received an honorary mention at the Prix Ars Electronica in 2012 for his piece "Years".

- This video he mentions in the same interview:

- A great guide from Amanda Ghassaei to Max/MSP on Instructables. Max/MSP is very similar to Pure Data (or rather, Pd is very similar to it, since Miller Puckette, the original author of Max, later built Pd) but has a better interface, imho. An essential tool for creating interactive works and much more.

- The concept of a Harmonic Table. This caught my attention, since it would be a great basis for building generative soundscapes and artworks. Coding this structure in Python would be a nice idea; I will keep that in mind. Why Python? Two reasons: first, it is very easy to write. Second, you can do lots of stuff inside Python and greatly extend the scope of your code by sending the data to other platforms, frameworks and programs. So go Python!


Image: "Harmonic Table"


- This guide on encryption by Lifehacker. Don't forget to check the other links in the article if you find it interesting!

- This guide on the concept of a Virtual Machine. What is a Virtual Machine and why use one? Basically, it is software that lets you run another OS (say, Windows) inside your current one (say, OS X) without needing to restart your system, so you can easily benefit from multiple OSes.
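Since the harmonic table above is the structure I'd like to code in Python someday, here is a minimal sketch of the idea. I'm assuming the usual harmonic-table arrangement (each step up a column adds a perfect fifth, each step across adds a major third); the function names and the 4x4 grid size are just my own choices for illustration.

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def harmonic_table_note(col, row, base_midi=60):
    """MIDI note at hex coordinate (col, row), starting from C4 (MIDI 60).
    Moving across a column adds a major third (+4 semitones);
    moving up a row adds a perfect fifth (+7 semitones)."""
    return base_midi + 4 * col + 7 * row

def note_name(midi):
    """Human-readable name, e.g. 60 -> 'C4'."""
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

# Print a small 4x4 patch of the table, top row first
for row in range(3, -1, -1):
    print("  ".join(f"{note_name(harmonic_table_note(c, row)):>4}" for c in range(4)))
```

From any cell you can then walk fifths or thirds to generate consonant note sequences, which is exactly what makes the layout attractive for generative soundscapes.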


ReHearSal featured on ID-Mag


It's always enjoyable to see your work getting featured somewhere. My friends and I worked on this project for a couple of months and I'm glad to see it appreciated.

The feature on ID-Mag's website can be reached directly from here.


An Online Tutorial For Creating Interactive Artworks


Hey all,

It has been a busy couple of months, and my two friends and I managed to exhibit an interactive work that we named Rehearsal. For detailed information and a handsome video, do check here.

This will be a tutorial for creating an interactive project with Pure Data and Ableton Live on Windows.

In a couple of sentences: our project gathers user input from Kinect sensors, processes the data in vvvv and sends floats/integers to Pure Data (Pd); Pd receives the data, maps it to MIDI notes and sends them over LoopBe1 to Ableton Live; lastly, Ableton Live plays them out loud. At the last step I also used a synthesizer VST plug-in to create an appealing sound. It seems intimidating when you read it like that, but trust me, once you know what to do it isn't that hard.

The exhibition consisted of four computers, each with its own Kinect sensor, and each of the four computers ran vvvv, Pd, Ableton Live and a synthesizer VST plug-in. A fast computer would be nice, since all of this software demands a lot from the processor. We had four fast computers, but you can run vvvv and the rest of the software on different machines, since the connection between vvvv and Pd is done over TCP.
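Because that vvvv-to-Pd link is plain TCP, any program can stand in for vvvv when you are testing. Here is a rough Python sketch that formats floats the way Pd's netreceive expects (the FUDI text protocol: space-separated atoms ended with a semicolon) and sends them to port 9922. The host, port and function names are illustrative, and it assumes a patch with netreceive 9922 is already running.

```python
import socket

def fudi_message(values):
    """Format values as a FUDI message for Pd's [netreceive]:
    atoms separated by spaces, terminated by a semicolon and newline."""
    return " ".join(str(v) for v in values) + ";\n"

def send_to_pd(values, host="localhost", port=9922):
    """Open a TCP connection to a running [netreceive 9922] and send one packet."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(fudi_message(values).encode("ascii"))

# Seven floats, like the Kinect coordinates in our patch:
print(fudi_message([0, 1, 2.5, 3, 4, 5, 6]), end="")
```

With Pd running, `send_to_pd([0, 1, 2.5, 3, 4, 5, 6])` delivers one packet of seven values, exactly the shape the patch below unpacks.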

I was responsible for the sound design part, so I worked with Pure Data, Ableton Live and the synthesizers.

Don't mind the fancy title; I was just trying to catch your attention. Although I can tell you that if you are into interactive works, you will definitely benefit from this tutorial. I will also provide my Pure Data patches to make everything easier for you. Please note that Pd is free and open-source software, and LoopBe1 is also free; however, Ableton Live is a product that you would need to buy (it has a one-month trial period, though).

First, if you haven't already installed Pure Data on your computer, you should download it. It is a must if you are interested in DIY sounds, computer-generated (algorithmic) music, computer-aided live shows and, most definitely, interactive projects. I recommend the extended version, since it comes with lots of additional libraries. Dr. Rafael Hernandez's Pd tutorials are also great for getting started. Don't feel like these videos are the ultimate guides to learning Pd, though. After a couple of videos, I figured that the help files that come along with Pd, plus googling, are the best way to find your way around. You can also ask me anything about Pd, and I will do my best to reply to you quickly. I know what it feels like to get stuck at a point and never find a solution to your problem.

So anyway, you should have installed Pd-extended by now. You are ready to roll. Download the Pd file I have provided with this post and let's investigate it step by step: open "Interactive MIDI.pd". Two windows will show up; Pd-extended is the main window of Pure Data, and Interactive MIDI.pd is the file you opened with it. Pure Data is a node-based, visual programming language. You will see a bunch of blocks connected to each other with lines, kind of like a block diagram. The structure is easy to understand: the hierarchy goes from top to bottom. The output of the first block is connected to the next block's input (let's assume this is entirely correct for now; I will show other cases as we move on).

You can see that netreceive 9922 is the first block we have. Netreceive opens port 9922 on your computer and listens to the incoming messages on that port. Right-click on it and go to Help to read more about it. This Interactive MIDI.pd file was meant to listen to the incoming messages on port 9922, to which the Kinect data was sent. I was getting seven floats, which were x and y axis coordinates on a canvas. When you went in front of the Kinect and waved your hand from left to right, one of the floats I was getting would change, say, from 0 to 8. Since I received these seven floats in a packet, I used the unpack function as the second node. The third set of blocks is just numbers, to visualize the floats that I get. Next, we have the select function. This redirects the float received at an instant to the related output. It receives seven floats and repeats this process for each input.

Next we have buttons, or 'bang's as they are called in Pure Data. These are again used for visualization purposes. When you receive a series of floats simultaneously, it is harder to see what's going on without them. Think of them as LEDs in a circuit, showing the path taken by a signal at an instant.

Now, this part looks a bit crowded, but it's very simple to understand what is done here. Up to this point, I receive some coordinates from the Kinect sensor as floats, so I have signals that change as someone interacts with the sensor. What I want to do is map these signals to some beautiful audio output. Heck, I just want to produce some cool sound. So, what do I do?

My idea was to choose a musical scale, take the notes of that scale and connect them to the signals I receive, so that whenever a signal changes, it plays a note in that scale. No matter what signal I get, the played note will be in a specific scale, and it will be consonant. So, what's with the numbers in the blocks, right? They are the frequencies of each note in that scale. For example, if I want to use F3, I type in 174.61 - or simply 174. I just googled the list of these frequencies and found this table. Use it! It is wonderful, and you are totally free to select whatever note/scale you want. You can also create a dissonant structure and use notes that are off the scale. Or you can use totally random notes - just go ahead and try it out.
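If you would rather compute that table than look it up, every equal-tempered frequency comes from a single formula: f = 440 * 2^((n - 69) / 12), where n is the MIDI note number and A4 (MIDI 69) is 440 Hz. A small Python sketch; the C major scale is just an example choice:

```python
def note_freq(midi_note):
    """Equal-tempered frequency in Hz (A4 = MIDI 69 = 440 Hz)."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# F3 is MIDI note 53 - this reproduces the 174.61 typed into the patch
print(round(note_freq(53), 2))  # -> 174.61

# Semitone steps of a major scale, applied from C3 (MIDI 48)
C_MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11, 12]
print([round(note_freq(48 + s), 2) for s in C_MAJOR_STEPS])
```

Swap in any root note and step pattern to generate the frequency list for whatever scale you want to wire into the patch.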

All these frequencies are connected to another number block, which changes whenever a signal is transmitted to it. If the signal comes from 131 at time t, it becomes 131 until the next signal comes from another block. Then it takes the number of that block, and so on. It then passes its value to the ftom function, which is short for frequency-to-MIDI. Pretty straightforward: this converts the frequency we get into a MIDI note, a note that you could play on a keyboard. You can see this MIDI note in the next block.
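That ftom step is just the inverse of the frequency formula. A one-function Python equivalent, assuming the standard A4 = 440 Hz reference:

```python
import math

def ftom(freq):
    """Frequency (Hz) to MIDI note number, like Pd's [ftom] object."""
    return 69 + 12 * math.log2(freq / 440.0)

print(round(ftom(174.61)))  # F3 -> MIDI note 53
```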

The rest is the MIDI connection. The makenote block takes 'pitch', 'velocity' and 'duration' as input and gives 'pitch' and 'velocity' as output. The pitch is what we get from the ftom block; you can manually change the other values, though. Makenote is connected to the noteout block, which transmits the MIDI notes. So as you see, we don't use Pd to create the sound. We could do that, but using a DAW (Ableton Live in this example) and a synthesizer is a much more elegant way to produce the sound. So, I chose to transmit the MIDI notes to the synthesizer in Ableton Live.
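Under the hood, what goes over the wire at this point is just three-byte MIDI channel messages: status byte 0x90 is note-on, 0x80 is note-off. A hypothetical Python sketch of those bytes (the function names are mine, not part of any library):

```python
def note_on(pitch, velocity, channel=0):
    """Three-byte MIDI note-on message, like [noteout] transmits."""
    return bytes([0x90 | channel, pitch & 0x7F, velocity & 0x7F])

def note_off(pitch, channel=0):
    """Matching note-off; [makenote] schedules one after 'duration' ms."""
    return bytes([0x80 | channel, pitch & 0x7F, 0])

print(note_on(53, 100).hex())  # F3 at velocity 100 -> '903564'
```

The makenote/noteout pair saves you from managing these note-off messages by hand, which is why it is the idiomatic choice in Pd.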

Do not forget to go to the main Pd-extended window and tick the DSP box, which stands for Digital Signal Processing. As long as this box is unticked, Pd will do the computation but will not produce any sound. This is where LoopBe1 comes into play.

LoopBe1 is an internal MIDI port for Windows, which virtually connects the output of one piece of software to the input of another. So when you want to take the output of Pd and feed it into Ableton Live, you need it. On a Mac, you have other options, such as JACK OS X. There is also LiveOSC, a widely used interface for this kind of connection, but for me LoopBe1 was the easiest and most user-friendly tool I could find.

Don't forget: these tutorial patches are made for incoming data on port 9922, but you can always change that. Try deleting the netreceive 9922 block and inserting a randomizer: create a bang, connect it to a 'metro 500' object, and connect 'metro 500' to a 'random 20' object. Then connect the 'random 20' object to 'unpack f f f f f f f'. When you click on the bang (with edit mode off, so the cursor is an arrow), you will see your patch working. Also check the ftom output - it changes as the signals come through, right? These are the different notes played by your computer. Good work there!
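The same metro 500 -> random 20 test rig can be sketched in Python, if that makes the timing logic clearer. The function name and parameters are my own invention, not anything from Pd:

```python
import random
import time

def random_note_stream(count=5, interval=0.5, note_range=20):
    """Emulate [metro 500] driving [random 20]: yield one random
    integer in [0, note_range) every 'interval' seconds."""
    for _ in range(count):
        yield random.randrange(note_range)
        time.sleep(interval)

# Fire three "bangs", half a second apart, like metro 500 would
for note in random_note_stream(count=3):
    print(note)
```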

It has been a long tutorial, but I tried to explain the methodology for mapping incoming data to sound. Next time, I will show how to do things on the DAW side.

If you really read this whole post up to here and did everything properly, you have great patience. This was my first tutorial post, and I apologize for any poor explanations. Feel free to ask me anything regarding this tutorial and I will try to answer your questions.

Until next time!