Music Control, Interactive Music Systems, Physical Computing, Natural User Interface, Tangible Computing, OSC, MIDI, Max/MSP, TUI/NUI, Interactive Sculpture, Processing, ChucK, Arduino, FTIR, Audicle, Monome 40h, DIY, Open Source, Reaktor 5, Granular Synthesis, Analog Synthesis, Analog Sequencers, Touch Control, Haptics, Xenome, The Stribe
what is soundwidgets.com?
It's a blog where I post cool stuff I find on the web. I try to post projects which more or less relate to the above topics. Sometimes I just post random stuff.
This also acts as an informal project blog for a music control device I'm designing and building called the Stribe.
I also occasionally post clips and info relating to experimental electronic music I make under the name phineus.
do you sell stuff?
Actually, yes! You can support The Stribe Project by buying kits from CuriousInventor.com, by buying Stribe.org T-Shirts, or by participating in the Stribe Project Forum.
You can support Phineus by ordering the Compleat Works of Phineus on USB hard-drive for $25 including shipping. Send e-mail to order.
what does "stribe" mean?
It means "stripe" or "striped cloth" in Danish.
A couple of things occurred to me during Jeff Han's multi-touch interface demo (see it here).
1) Add sound. During the part where he's zooming in and out of a cloud of dots and talking about data modeling, I wanted to hear whooshing sounds and changes in background hum to give me additional information about my relative location within the 3D space.
2) Use sound to simulate haptic (touch) information. Can you fool the brain with sound? As you drag your fingers around on a virtual surface, if you represent changes in surface texture as changes in sound, do you kind of feel it? Not to mention efficiency: you can type really fast if you can touch-type, not having to look at the screen to see where your hands are. What if instead you could "hear/feel" where they are? Keep the eyes free for detail work.
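Since this is just a thought experiment, here's a back-of-the-napkin sketch of what a texture-to-sound mapping might look like. The `roughness`/`speed` inputs and the mapping constants are entirely invented for illustration:

```python
def friction_sound(roughness, speed):
    """Sonify a 'virtual texture': rougher surfaces and faster finger
    drags produce louder, brighter friction noise. The constants here
    are made up for illustration, not from any real haptics system."""
    # Louder when you drag fast over rough stuff, capped at full scale
    amplitude = min(1.0, roughness * speed)
    # Brighter (higher filter cutoff) for rougher textures
    brightness_hz = 200.0 + 4000.0 * roughness * min(1.0, speed)
    return amplitude, brightness_hz
```

A smooth glass surface (roughness near 0) stays nearly silent no matter how fast you drag, while sandpaper at speed pins the level at full scale.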
3) What about eye-tracking, too? Hands can do one thing while eyes look somewhere else.
4) Multi-user. Multiple people could look at a screen and create something: images and music based on where they're each looking, highlight certain data, meanwhile you can track what each is doing with their hands. If you had enough processing power (how about multiple autonomous CPUs sending synced OSC info to a server which runs the display) you could track a whole crowd's responses in real-time. Latency would be kind of okay because it would make people focus on one spot until the changes began - might be less chaotic, too.
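Part of why the many-CPUs-to-one-server idea is plausible is that OSC is dead simple on the wire. Here's a minimal hand-rolled encoder per the OSC 1.0 spec (the `/crowd` address and port in the comment are invented for the example; a real tracker node would just fire these as UDP datagrams):

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message: padded address, padded typetag string
    starting with ',', then big-endian int32/float32 arguments."""
    typetags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)
        else:
            raise TypeError("only int/float shown in this sketch")
    return osc_pad(address.encode()) + osc_pad(typetags.encode()) + payload

# A tracker node could then send its blob position to the display server:
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/crowd/1/pos", 3, 0.5), ("localhost", 9000))
```

Every field lands on a 4-byte boundary, which is what makes OSC so cheap to parse on the receiving end.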
Giant Hand was made with Reaktor 5. The ambient tracks use granular synths, each song is one short waveform chopped and replayed through filters. The rhythmic stuff is done by flipping multiple samples.
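For the curious, the basic granular trick - chop one short waveform into enveloped grains and scatter them across an output buffer - fits in a few lines. This is a generic illustration of the technique, not how Reaktor's granular synths actually work:

```python
import math, random

def granulate(source, out_len, grain_len=256, density=200, seed=0):
    """Granular resynthesis sketch: overlap-add Hann-windowed grains,
    each pulled from a random spot in one short source waveform and
    dropped at a random spot in the output buffer."""
    rng = random.Random(seed)
    out = [0.0] * out_len
    for _ in range(density):
        start = rng.randrange(0, max(1, len(source) - grain_len))
        pos = rng.randrange(0, out_len - grain_len)
        for i in range(grain_len):
            # Hann window so grains fade in/out instead of clicking
            env = 0.5 - 0.5 * math.cos(2 * math.pi * i / grain_len)
            out[pos + i] += source[start + i] * env
    return out

# One short waveform (a 440 Hz sine snippet) smeared into a 1-second cloud
src = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(2048)]
cloud = granulate(src, out_len=44100)
```

Run the result through filters, as the Giant Hand tracks do, and the pitched source dissolves into ambient texture.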
This page is the source of much knowledge: mtg.upf.edu/reactable/?related It's got links to videos and explanations and people behind a wide variety of tangible music interfaces e.g. soundwidgets of the highest order.
How about a plaza, with each bench, bush, etc a tangible object on a huge reacTable? Add in some humans, dogs, pigeons, and track them as tangibles, too, based on blob shape. An aerial view of a park becomes a sound canvas. Now pipe the sounds in over xmitters in the park, sending to optional headsets. Let people make their own soundtrack - quiet and reflective for reading a paper on a bench as runners zizz by. Picking up more as the commute thickens. Quiet at night, except in cases of sudden movement.
So Sparkfun sells these button pads like the monome has, but bigger. Plus they have room for 3-color LEDs. Make or buy a monome logic board or two (schematic is on their site - they MIGHT be selling built logic boards and kits soon) and buy some Sparkfun pads, PCBs, 3-color LEDs and make your own tri-cube-omial or something.
SynthEdit is a Modular Synthesiser for Windows XP and Vista. With SynthEdit you can design your own Synth from the ground up. Drag and drop modular components, connect them with virtual "patch cords". - www.synthedit.com
I've been meaning to post about SynthEdit, because it is so cool, and because it's free. I loaded it onto my computer and it worked immediately with my basic soundcard (Echo Mia MIDI). Right away I was making sounds with a Moog-like synth and then an ARP 2600. I got to see the layout and get an idea of what these might sound like in real life. Definitely cured me of the expensive urge to buy vintage mono-synths for their unique sounds, especially since most of those are out of tune all the time due to neglect. Not that an out-of-tune synth is necessarily a bad thing. ANYhoo, it's very cool, and if you're looking for free synth sounds, well, there you go. As far as I know SynthEdit doesn't speak OSC, yet.
In the mid-90's I met a guy named Tim Anderson who had built a painting machine and was controlling it with Max running on a Mac. (check out Tim's recent homemade wooden hydrofoil video - and his Robot Art video). He explained that Max was a new kind of intelligent music software for sending MIDI, but it could control anything, really. Soon my plans for the Sound Square included the idea of sending the gesture output through Max for processing before playing it audibly.
Around the same time I read a book called Interactive Music Systems, written by a guy named Robert Rowe (then at MIT, now at NYU Steinhardt School). He had built (and includes on CD) a Max patch called Cypher, which consisted of two horizontal rows of objects. The top row held "listener" objects, and the bottom row "player" objects; you programmed them, then patched them together... just like all this "new" stuff where you build synths and stuff inside a GUI. (I recently tried to read the disc but my XP machine spit it out, and my Mac Powerbook wasn't quite sure what to do with it, either.)
So really Max is the Grand-Mac-Daddy of them all. Somewhere along the way the product became commercial and is now sold and updated by Cycling 74 along with some other cool stuff like Jitter (for video). Interesting recent news is that Ableton, makers of Live, and Cycling 74 are getting hitched in some sort of merger. A great feature of Max is the huge community and tons of existing patches, and Live has a rabid following... Could mean great things for both products or the beginning of the War for Middle Earth.
I've been running through the tutorials that come with the 30-day eval copy and they are short and easy to get through, and definitely get you past the "duh" factor and up and running.
I'm only up to chapter 8 so far - there are a total of 53 chapters... yawn.
So, I've really enjoyed playing with my monome 40h this last week, even if the play was often punctuated by long bouts of downloading and configuring. After the first couple of days I pretty much had the hang of what needs to be running for it to work, and was able to play with some of the neat pre-built apps available free on monome's site.
balron asks, and then answers the essential question: "What is the best way to map a 12-tone scale to an 8x8 grid?" Essentially it's a tonal chord generator. There's a huge drop-down menu with various tonal mappings to choose from. Then it lets you set 4 levels of delayed and shifted responses to what you play, which can create some crazy sequencer-like music as well as contemplative jazz-ish compositions, depending how you tune and play it. The responses display on the leds as they play, which is really cool because then you can follow the response lights with similar patterns of your own - like jamming.
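For flavor, here's one plausible answer to balron's question - not necessarily the mapping balron actually uses. Treat the rows like guitar strings: columns step by semitones, rows by a fourth (5 semitones), so every chord shape is movable around the grid:

```python
def grid_to_midi(row, col, root=48, row_interval=5):
    """Map an 8x8 button grid to MIDI note numbers: columns step by
    one semitone, rows by a fourth (5 semitones), like strings on a
    guitar. Intervals and root are illustrative, not balron's actual
    mapping."""
    if not (0 <= row < 8 and 0 <= col < 8):
        raise ValueError("off the 8x8 grid")
    return root + row * row_interval + col

# Bottom-left button = C3 (MIDI 48); one row up, same column = F3 (53)
```

The nice property is that a fingering learned in one corner of the grid transposes anywhere, which is exactly what makes a drop-down menu of alternate tonal mappings so interesting to explore.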
flin is like having 8 vertical 8-step sequencers. Think of led raindrops. You can push buttons on the monome to set the drops to be short and fast or long and slow. 8 virtual on-screen keyboards let you assign the notes. The droplets trigger MIDI which I send out to my synth. Hours of mindless fun right there.
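The raindrop mechanic is easy to picture as code. A hypothetical sketch of the idea (not flin's actual implementation): each column carries one falling drop at its own speed, and a note fires each time a drop passes the bottom row and wraps:

```python
def flin_step(positions, speeds, height=8):
    """Advance each column's 'raindrop' by its per-column speed; when
    a drop falls past the bottom row it wraps around and reports a
    trigger. Sketch of the flin idea, not its real code."""
    triggers = []
    for col in range(len(positions)):
        positions[col] += speeds[col]
        if positions[col] >= height:
            positions[col] -= height
            triggers.append(col)  # this column's note fires this tick
    return triggers
```

Call it once per clock tick; pressing a button high or low in a column would just set that column's entry in `speeds`, and each returned column index gets sent out as a MIDI note.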
flip is 4 "pages" of an 8-beat sequencer sampler. You drag .wav samples onto 4 pads and then use the buttons to arrange each sample. Holding the space-bar on your computer puts the monome into control mode, where you can use button groupings to mute and un-mute, switch which sample you're working on, and set relative volume levels. Hitting the ~ (tilde) key scrambles the sample randomly. Shift lets you reset the sample to normal again. Again, hours of mindless fun.
mlr is similar to flip in that it allows you to load up samples and then control them with the monome. It maps a sample horizontally to each row and then you can control the begin- and end-points with the monome. Good for "glitching" e.g. re-starting the sample from different points. You can also set individual samples to play in reverse (right to left).
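The row-to-sample mapping boils down to simple arithmetic. A sketch of the idea (illustrative, not mlr's actual code): button column `c` restarts that row's sample `c/8` of the way through, and reversed rows count from the other end:

```python
def mlr_offset(col, sample_len, ncols=8, reverse=False):
    """mlr-style mapping sketch: convert a button column into a
    playback start frame. Column c jumps c/ncols of the way into the
    row's sample; a reversed row plays right-to-left, so its start
    frame is mirrored."""
    frame = (col * sample_len) // ncols
    return sample_len - 1 - frame if reverse else frame
```

So on an 800-frame loop, mashing column 4 mid-playback "glitches" the sample back to its halfway point, which is the whole appeal.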
There are more apps for Chuck, Pd, Processing, Reaktor 5, etc, but by now I had started looking for a good tool to chop up some songs so I could play with mlr and flip some more (being tired of the same 20 clips I downloaded off the internet). This sent me on a trip down beat-slicing lane.
My basic desire was an app that would slice a song into 64 roughly equal parts based on the beat structure of the song. Then map each part to a button on the monome. Turns out this takes some pretty fancy algorithms to do automatically. I found a lot of tools, but they all had pretty hefty price tags, so I spent some time looking at free or opensource options, per some suggestions on the monome site. After various frustrations with VSTs in Cubase and/or interfaces that were baffling to me, I started to think that perhaps a commercial app would be the way to go. At least there would be documentation and a forum and some Youtube vids and I could get some support if/when totally lost.
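To see why the automatic version needs fancy algorithms, consider the naive version, which is trivial: if you already know the tempo and trust the loop to sit exactly on the grid, 64 slices is just arithmetic (this assumed-steady-tempo shortcut is mine; real slicers detect the beat structure from transients):

```python
def slice_starts(bpm, sample_rate=44100, n_slices=64):
    """Naive beat slicing: cut a 4-bar loop into 64 sixteenth-note
    slices, one per monome button. frames per sixteenth =
    sample_rate * 60 / bpm / 4. Assumes a known, perfectly steady
    tempo - the hard part real slicers solve is when you don't."""
    frames_per_16th = sample_rate * 60.0 / bpm / 4.0
    return [round(i * frames_per_16th) for i in range(n_slices)]

# At 120 BPM / 44.1 kHz a sixteenth note is 5512.5 frames long
starts = slice_starts(120)
```

The moment the song has a human drummer or a tempo ramp, those fixed boundaries drift off the beat, and you're back to needing onset detection.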
After considering Max/MSP and Ableton Live 6, I ended up choosing Reaktor. I'll probably eventually get both Max/MSP AND Live, but for now I think Reaktor will give me the most bang for my buck.
Reaktor looks deep, in terms of having a lot of ready-to-use apps (ensembles) but also a sophisticated and accessible core for building my own tools.
AND it has a couple of different beat-slicing tools built right in. I found it for $224.50 shipped from Nova Musik. There are already a few apps up on monome.org for Reaktor, and it speaks OSC, so I'm psyched.
My first crack at making some music with the monome 40h. I used 2 of the free apps that are available on monome's website, "balron" and "flin", to send MIDI to my Alesis Micron synth. Just ad hoc experiments, but sort of cool.
"The brainchild of Plogue Art et Technologie, Bidule is a cross-platform application that is gaining recognition world-wide as the new standard in modular music software" - www.plogue.com
This tool is not free but is currently only $75 US. At first it seemed very promising, same paradigm as Buzz but more evolved. However, I installed it on my Windows XP machine at home and again at work on my XP-64 machine and the interface seems kind of buggy. Clicking the "Start" button shifts the whole workspace and moves some modules off-window, with no way to retrieve them. Very frustrating. Hopefully they'll work out some of these quirks as I'd really like to try it out.
"Buzz is the first ever "easy to use" free modular software based synthesizer. What this means is that the entire system is based on objects, which may be routed in a modular fashion, giving you the freedom to be as creative as you want. For example, if you wish to run 3 Physical Modelling synths and a Drum Machine through 2 seperate Stereo Delays, into a Mixer, through a Compressor and Parameteric EQ, and finally out to your speakers - no problem. Lay down your synths, connect the wires and you're done." - www.buzzmachines.com