Where’s the music II?
The great link dump continues. Part two is a couple of older gig recordings…
Where’s the music I?
Time was, there was a whole load of music on this site. Until I get round to designing a way to present all that again (or just dump it all on Soundcloud), I thought I’d throw up the links as posts.
Thoughts on OpenNight #4
Rob Munro was kind enough to upload a recording of the set I played at The FleaPit last Thursday. Download it and have a listen if you weren’t able to make it along.
I was pleased with how it went on the whole (and was offered another gig afterwards, so it can’t have sounded too bad). The main problem was that it took me a couple of minutes to settle into a groove, which isn’t ideal when you’re trying to hook in people who might be thinking about popping to the bar.
OpenNight #4
I’ll be playing at The Fleapit, 45 Columbia Road, E2 on April 15th. Come along!
Reasons to love SuperCollider
When I first started to get into the nuts and bolts of sound generation (rather than tweaking someone else’s plugins), I did what a lot of other people do and downloaded Pure Data. It’s a great environment to start playing around in: connect an oscillator block to a filter block to an output block and you’ve got a little subtractive synth, and it’s clear how the signal flows through it.
Evidence
I made a decision a while back to focus on playing live rather than recording. I’ve spent a lot of time hunched over Cakewalk Sonar tweaking tracks, and I daresay I’ll go back to that mode eventually (or, if I’m lucky, find a compromise), but right now I want to play. The downside is that it can be a bit tricky to explain to people what the hell my music actually sounds like.
Gig at The FleaPit - 11 February 2010
I’ll be playing at The FleaPit on Columbia Road, London, E2 on February 11th as part of one of OpenLab’s OpenNights.
More camera simulation
I’m making slow but steady progress with the camera simulation. The blurring effect I mentioned at the end of the first post on this is now in there, using the low-pass filter code that I posted yesterday. I’m really happy with how that panned out, so I thought I’d post another video and the code. I’ve used a bit of off-screen drawing to do this, which I don’t recall being covered in the Processing documentation, so I’ll write that up in another post.
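Until that write-up appears, here’s roughly what off-screen drawing in Processing looks like with a PGraphics buffer. This is a minimal sketch of the technique rather than the actual project code; the scene contents and blur amount are just placeholders.

PGraphics buffer;  // off-screen surface to draw the scene into

void setup() {
  size(640, 480);
  buffer = createGraphics(width, height);
}

void draw() {
  // Draw the scene into the off-screen buffer instead of straight to the screen.
  buffer.beginDraw();
  buffer.background(0);
  buffer.fill(255);
  buffer.ellipse(width/2, height/2, 100, 100);
  buffer.endDraw();

  // Blit the buffer to the screen, then blur the result; offsetting the
  // image() position each frame is where a camera-wobble effect would go.
  image(buffer, 0, 0);
  filter(BLUR, 2);
}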
A low pass filter in Processing
In trying to further humanise my wobbly camera project, I needed some of the parameters to change in a linked way, but with one of them responding at a slower rate. To do that you need a low-pass filter, which will be familiar to anyone who’s spent any time making electronic music (anyone else should follow the link). There are a few ways to implement such a filter, but for my purposes a quick moving-average version was good enough.
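For the curious, a moving-average low-pass filter in Processing can be as simple as the sketch below. The class name and window size are my own, for illustration, not the code from the post.

// Minimal moving-average low-pass filter: averages the last N inputs.
class MovingAverageFilter {
  float[] buffer;
  int index = 0;

  MovingAverageFilter(int windowSize) {
    buffer = new float[windowSize];
  }

  float filter(float input) {
    buffer[index] = input;
    index = (index + 1) % buffer.length;
    float sum = 0;
    for (int i = 0; i < buffer.length; i++) {
      sum += buffer[i];
    }
    return sum / buffer.length;
  }
}

A larger window makes the output respond more slowly, which gives exactly the linked-but-lagging behaviour described above: feed the fast parameter through the filter and use the result as the slow one.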
Un-Steadicam
I was watching videos of neurons growing this morning (it’s nearly work, at least) and it occurred to me that it might be fun to get Processing to draw something similar. I had a whole plan worked out with branching, making the strands repel one another, maybe a bit of L-system action. Not unusually when experimenting with Processing, I did a little bit of the plan, saw something pretty, and got side-tracked, in this case by the trails of particles with Brownian motion.
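To give a rough idea of the kind of thing that distracted me, a sketch along these lines (particle count, step size and fade are assumptions, not the actual code) draws particles that take a small random step each frame and, because the background is never cleared, leave trails behind them.

int numParticles = 50;
float[] x = new float[numParticles];
float[] y = new float[numParticles];

void setup() {
  size(640, 480);
  background(0);  // cleared once only, so the trails persist
  for (int i = 0; i < numParticles; i++) {
    x[i] = random(width);
    y[i] = random(height);
  }
}

void draw() {
  stroke(255, 40);  // faint strokes so the trails build up gradually
  for (int i = 0; i < numParticles; i++) {
    // Brownian motion: a small random step each frame.
    float nx = x[i] + random(-2, 2);
    float ny = y[i] + random(-2, 2);
    line(x[i], y[i], nx, ny);
    x[i] = nx;
    y[i] = ny;
  }
}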