Posted 6 months ago

Here is a tutorial from Michael Dessen on how to use the JackTrip software to stream high-quality audio between computers for rehearsal or telematic performance. I think I am going to try this very soon.

Posted 6 months ago

### SuperCollider Tweets: Background / Tips

Updated 11/20/2012: Added 7 new tips courtesy of Nathaniel Virgo

I have previously made two blog posts showing some of my SuperCollider tweets (Part 1 and Part 2). I posted them without any real explanation, and I wanted to spend a little time talking about why and how I have been…

Here are Schemawound’s tips for writing SuperCollider code that fits within the length of a tweet.

Posted 7 months ago

### Using WiiMotes with AbelSim Change Ringing Software

I have recently started learning about change ringing again after briefly exploring it a few years ago. I have been looking for a way to practice between sessions, and one of the other ringers alerted me to Mabel, the Mac OS X version of the AbelSim change ringing software. This software provides an interface for practicing different change ringing methods on simulated tower and hand bells. On Windows there is a way to hook up handbell-simulator sensors to the computer, but there is no such option for the OS X version. I have figured out a workaround by using the Mabel software in combination with WiiMotes and the OSCulator software.

The basic process is to assign the accelerometer function of each WiiMote to the keystroke that triggers the appropriate bell in the Mabel program. With two WiiMotes you can use one for each bell to practice ringing, which feels slightly more realistic than pressing keys on your keyboard. The main trick is to tune the accelerometer smoothing value in the WiiMote settings. I have found that a value of 55 tends to offer a good balance in sensitivity: it is easy to trigger the bell, but not so sensitive that you accidentally ring when you don’t want to. You will have to experiment to find a setting that works for you.

Posted 8 months ago

(Source: okdubu)

Posted 8 months ago

### IRIS in Cambridge, MA

I’m performing my solo bass and electronic music from IRIS in quad sound next week in Cambridge, MA. The performance is a part of the Ampersand series which is a collaboration between WMBR and the List Visual Arts Center.

I will also be playing some new pieces at this performance: one from last summer, and a new work in progress that incorporates aggressive panning with freezing sound elements. Please tell friends who might be interested. Full details are at http://wmbr.org/ampersand/

Posted 8 months ago

Okay, so I am getting into change ringing again, and this performance is pretty incredible.

Posted 8 months ago

### free-programming-books/free-programming-books.md at master · nashvillebrigade/free-programming-books · GitHub

A huge index of free resources

An index of free programming books.

Interested in the mathematics section, which I really need…

Reblogging for later

Posted 9 months ago

### Using SuperCollider as an Oscillator and Gate Sequencer with a Modular Synth

Tonight my goal was to interface SuperCollider with the modular synth a bit more. I still don’t have a proper oscillator module in my Eurorack synth, so I have been looking for ways to send sound into the modular externally from the computer, Game Boys, etc. I have also been wanting to experiment with using the modular to shape sequenced sounds coming from SuperCollider.

The concept is that SuperCollider generates a constant stream of audio (like an oscillator module), and can be sequenced using the built-in Pattern library. Every time a note changes, SC sends out a gate trigger signal through one of the audio outputs which is then used to trigger an envelope controlling a VCA.

Here is the solution I came up with today. You will need SuperCollider, a DC-coupled audio interface (a MOTU or similar), a floating-ring cable, and a modular synth.

The signal flow is as follows:

MOTU OUT 3 (SC audio) —> Filter —> VCA —> Delay —> Speaker
MOTU OUT 8 (SC gate using floating ring cable) —> envelope Trigger In —> VCA CV In
LFO Module Signal —> Filter Cutoff CV In

SynthDef(\oscwithGate, {
	|freq=440, t_envgate=0, dur=1, amp=0.8, audioOut=0, gateOut=7|
	var osc, gater;
	osc = LFSaw.ar(freq, mul: amp);
	gater = Trig1.kr(t_envgate, 0.1).range(0, 5); // 0-5 pulse, hot enough for hardware trigger inputs
	Out.ar(audioOut, osc);
	Out.ar(gateOut, K2A.ar(gater)); // convert the control-rate gate to audio rate for the interface output
}).add;

Pdef(\osc,
	Pmono(\oscwithGate,
		\octave, Prand([2, 3, 4, 5], inf),
		\root, 0,
		\degree, Prand([0, 1, 2, 3, 4, 5, 6, 7], inf),
		\dur, Pseq([Pn(0.25, 8), Pn(1/3, 6), Pn(1/5, 5)], inf),
		\t_envgate, 1,
		\audioOut, 2,
		\gateOut, 7,
	)
);

Pdef(\osc).play;
Pdef(\osc).stop;

This code creates the simple oscillator and takes care of the gating at the same time. Note that the pattern key must be \t_envgate, matching the SynthDef’s trigger-argument name. Every time the Pmono chooses a new note, \t_envgate sends a 1, which fires the Trig1.kr UGen, and the resulting pulse goes to the audio output specified by the gateOut argument. I had to scale the Trig1.kr output up to a range of 5; otherwise the signal was not hot enough to actually fire the Trig input on the Maths module.
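To sanity-check the gate level before patching it into hardware, you can plot the scaled trigger on its own. This is a quick sketch; the Impulse.kr here just stands in for the note changes coming from the pattern:

{ Trig1.kr(Impulse.kr(4), 0.05).range(0, 5) }.plot(1); // four 0-to-5 pulses over one second

Each pulse holds at 5 for 0.05 seconds and then drops back to 0, which is the shape the Trig input on the envelope module sees.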

Posted 9 months ago

### Using the Scope to Visualize Multi-Channel Audio and Control Signals in SuperCollider

I was working tonight on some random multichannel-expansion sound design in SuperCollider, and I wanted to make sure that my synth was expanding correctly, so I needed a way to visualize it. I recently realized that it is possible to include multiple Out UGens in a SynthDef, and I decided to use this capability to visualize first the audio waveform and then the envelope control signal.

First, here is the synth that I have been working on:

SynthDef(\triBank, {
	|lo=20, hi=440, gate=1, timeScale=1, out=0, scopeOut=0|
	var sound, env, mix;
	sound = LFTri.ar({ExpRand(lo, hi)}!5, {Rand(0, 4)}!5, 0.2); // five triangle waves with random freqs and phases
	env = EnvGen.kr(Env.asr({Rand(1, 4)}!5, 1, {Rand(1, 4)}!5, -4), gate, timeScale: timeScale, doneAction: 2);
	sound = sound * env;
	mix = Mix.new(sound);
	Out.ar(out, mix);
}).add;

SynthDef(\reverb, {
	|in=16, out=0, size=1, damp=0.1|
	var input = In.ar(in, 1);
	var reverb = FreeVerb.ar(input, 0.5, size, damp);
	Out.ar(out, Pan2.ar(reverb, 0));
}).add;

~reverb = Bus.audio(s, 1); //assign a reverb bus

~reverbSynth = Synth.new(\reverb, [\in, ~reverb, \out, 0, \size, 1], s, \addToTail); //start reverb synth

x = Synth(\triBank, [\out, ~reverb, \timeScale, 1, \lo, 60, \hi, 440]); //start drone

x.set(\gate, 0, \timeScale, 1); //free drone

It is a triangle-oscillator synth modeled on the Snazzy FX Drone Bank Eurorack module. I also have a simple reverb synth to make everything sound a little better. Then there is some code to start and stop the synths. Change the argument values and listen to the results; you can change timeScale to shorten or lengthen the attacks and releases.
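For example (the values here are just illustrative), you can start a second drone with a different random range and stretch its release when freeing it. Note that lo and hi only matter at creation time, since ExpRand picks its frequencies once when the synth starts:

y = Synth(\triBank, [\out, ~reverb, \lo, 200, \hi, 2000, \timeScale, 0.5]); // a higher drone with faster envelopes

y.set(\gate, 0, \timeScale, 4); // release it four times slower than written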

So in my debugging work, I wanted to make sure I was actually creating five different oscillators with five different envelope shapes. I decided to use the scope to make sure everything was working. First, let’s create a plotTree and a scope to visualize the audio server and the audio waveforms.

s.plotTree;
s.scope(7, 0, rate: \audio);

Now let’s change the SynthDef so that, in addition to outputting the mixed audio, we write the five individual channels to a separate run of audio busses.

SynthDef(\triBank, {
	|lo=20, hi=440, gate=1, timeScale=1, out=0, scopeOut=0|
	var sound, env, mix;
	sound = LFTri.ar({ExpRand(lo, hi)}!5, {Rand(0, 4)}!5, 0.2);
	env = EnvGen.kr(Env.asr({Rand(1, 4)}!5, 1, {Rand(1, 4)}!5, -4), gate, timeScale: timeScale, doneAction: 2);
	sound = sound * env;
	mix = Mix.new(sound);
	Out.ar(out, mix);
	Out.ar(scopeOut, sound); // the five pre-mix channels, written starting at scopeOut
}).add;

x = Synth(\triBank, [\out, ~reverb, \timeScale, 1, \lo, 60, \hi, 440, \scopeOut, 2]);

x.set(\gate, 0, \timeScale, 1);

Now when we run that code, we can see the Mixed audio on audio busses 0 and 1 and then the five individual triangle waves on busses 2 through 6.

I also wanted to visualize the envelope signals. To do this, I just have to change the SynthDef again. The second Out UGen needs to be set to Control rate and it needs to output the env signal. Also remember to change the Scope to control rate.
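A control-rate scope showing the five envelopes (assuming they are written starting at control bus 0, as in the Synth call below) might look like this:

s.scope(5, 0, rate: \control); // five channels, starting at control bus 0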

SynthDef(\triBank, {
	|lo=20, hi=440, gate=1, timeScale=1, out=0, scopeOut=0|
	var sound, env, mix;
	sound = LFTri.ar({ExpRand(lo, hi)}!5, {Rand(0, 4)}!5, 0.2);
	env = EnvGen.kr(Env.asr({Rand(1, 4)}!5, 1, {Rand(1, 4)}!5, -4), gate, timeScale: timeScale, doneAction: 2);
	sound = sound * env;
	mix = Mix.new(sound);
	Out.ar(out, mix);
	Out.kr(scopeOut, env); // the five envelope signals, at control rate
}).add;

x = Synth(\triBank, [\out, ~reverb, \timeScale, 1, \lo, 60, \hi, 440, \scopeOut, 0]);

x.set(\gate, 0, \timeScale, 1);

Now we can watch the different envelopes rise and fall at different rates.

This technique is primarily useful for troubleshooting and learning more about what is happening with your audio and control signals. I have found that it has been very useful to me when designing new sounds. Let me know if there are any questions or clarifications needed.

Posted 10 months ago

peter speer at the mca

Looks great! But is he trying to hide the fact that there is a laptop there?