Recorded on December 22nd, 2016
While pressing NanoStudio into service today, I realized that it doesn’t allow any control over velocity curves. A NanoStudio organ sample I was playing from the Alesis Micron sounded weak, because real pipe and electric organs don’t respond to velocity. If only I could filter out velocity messages from reaching NanoStudio.
Midiflow does this and way more! I remember 30 years ago reading a review of the Axxess MIDI Mapper in Keyboard Magazine. It allowed for a zillion ways to reprogram and reroute MIDI notes, channels, and controllers into any way you cared to twist them. Well now you can do this same stuff in an iOS app, and it’s much easier to program than the Mapper.
Midiflow is like AudioBus for MIDI. You can string together inputs, modifiers, and outputs with a different configuration for each song. And NanoStudio has a good enough MIDI implementation that you can force it to “listen” to Midiflow’s virtual MIDI outputs and ignore the system MIDI bus. Midiflow can also respond to MIDI program change commands, and it can even send commands (like its own program changes) to a destination app when you load a song setup. It has a MIDI monitor so you can see the actual numbers flowing out of a controller if you need to debug something. You can also set it to filter out some or all notes from reaching a sound generator. It’s just too great to even believe. If you have any kind of MIDI rig going with hardware controllers and soft synths in an iPhone or iPad, you need Midiflow.
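Flattening velocity is a simple per-message rewrite. As a rough sketch of the idea (not Midiflow's actual implementation), here is what the transform looks like on raw three-byte MIDI channel messages, with the fixed velocity value being an arbitrary choice:

```python
# Sketch of velocity flattening, the kind of per-message rewrite a
# MIDI router can apply. Messages are (status, data1, data2) tuples
# of raw bytes; FIXED_VELOCITY is an arbitrary choice for this demo.

FIXED_VELOCITY = 100  # organs don't respond to velocity, so pin it

def flatten_velocity(message):
    """Return a MIDI message with any note-on velocity pinned.

    Status bytes 0x90-0x9F are note-on (one per channel). A note-on
    with velocity 0 means note-off, so that case passes through.
    """
    status, data1, data2 = message
    if 0x90 <= status <= 0x9F and data2 > 0:
        return (status, data1, FIXED_VELOCITY)
    return message

# A soft C4 note-on on channel 1 comes out at full strength:
#   flatten_velocity((0x90, 60, 12)) returns (0x90, 60, 100)
# Note-offs and other messages pass through untouched:
#   flatten_velocity((0x80, 60, 64)) returns (0x80, 60, 64)
```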
Recorded on December 21st, 2016
I found a better iPhone MIDI-compatible sampler app! NanoStudio to the rescue, for the insanely low price of $6.99. The screenshots in the App Store do not do justice to what’s under the hood with this thing. It’s quite a bit less immediately friendly than GarageBand, but it’s also way more powerful.
Like so many apps these days, it has tons of effects built in, as well as very flexible sound generators, either sample- or synth-based. The sampler lets you edit waveforms right in the app, although I haven’t been able to get quite as seamless a loop as I can in GarageBand. You can set individual instruments to receive separate MIDI channels, so if your keyboard controller can split its output, you have a multi-timbral synth/sampler.
It feels like its latency isn’t quite as low as GarageBand’s, but I can live with a few more milliseconds of delay between a key press and hearing a sound start. The trade-off is that it is much faster at switching between patches. It’s worth it just to have the rest of the band not waiting on me to switch sounds all the time.
And it can respond to MIDI program changes! This means I can select a patch to play on the Alesis Micron and NanoStudio will pull up the correct corresponding patch as long as I have everything in the right slots. This is hugely helpful!
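For reference, a MIDI Program Change is a two-byte message: a status byte of 0xC0 plus the channel number, followed by the program number. A quick sketch (the function name is my own) that builds one, including the 0-based vs. 1-based numbering that makes "the right slots" easy to get wrong by one:

```python
def program_change(channel, program):
    """Build the 2-byte MIDI Program Change message.

    channel: 0-15 (hardware usually displays these as 1-16)
    program: 0-127 (patch slots are often displayed as 1-128, so
             watch for an off-by-one when lining up patch slots)
    """
    if not (0 <= channel <= 15 and 0 <= program <= 127):
        raise ValueError("channel must be 0-15, program must be 0-127")
    return bytes([0xC0 | channel, program])

# Selecting displayed patch slot 5 (program 4) on displayed
# channel 1 (channel index 0):
print(program_change(0, 4).hex())  # prints 'c004'
```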
Unlike GarageBand, though, it doesn’t seem to work with the iRig Pro for recording samples, but I believe I can get around that with AudioCopy.
There’s also a full sequencer and a whole separate interface for triggering sounds from “pads” on-screen. And all this is just what I’ve been able to do on the iPhone. I haven’t even tried it with our old iPad because I don’t have a MIDI adaptor that works with a 30-pin connector. But from what I’ve seen, the iPhone version will do fine.
Recorded on August 13th, 2016
Today I was trying to get the KX76 mod wheel to control the LFO depth on the Micron. No matter what I did, the wheel was just dead. I figured the previous owner had mapped it to some other MIDI control value. Sure enough, I quickly found this Yamaha FAQ page, KX76/KX88 Assigning a Preset Controller Code to a Programmable Control, and it fixed me up. I learned that you want to assign the KX76 mod wheel to Preset Controller Code 11 (which it wasn’t before), and then set a modulator on the Micron with a source of MIDI CC1 (the standard MIDI mod wheel controller number — I don’t know why it’s not 11) and assign it to control LFO depth, filter cutoff, filter resonance, or whatever you want.
To get the warbly end-of-phrase accent part of “Love is a Battlefield” correct, I added modulators (all with CC1 as a source) to map the mod wheel to control LFO 1 amplitude, LFO 1’s effect on OSC 1 pitch, Envelope 1 Release time (inverted), and Envelope 1 Sustain time (inverted). The effect is that when I temporarily crank the mod wheel on the KX76, the formerly lush pad sound transforms into a spazzy chord with a quick release. If I hold the chord, it dies out slowly. And then when I pull the mod wheel back down to zero, the lush, warm pad sound returns. I know just exactly enough about MIDI controller codes to be dangerous.
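Under the hood, every mod wheel move arrives as a Control Change message on CC1: a status byte of 0xB0 plus the channel, then the controller number, then the value. Here is a small sketch (function name mine) of the three bytes the Micron's modulators are listening for:

```python
def control_change(channel, controller, value):
    """Build a 3-byte MIDI Control Change message.

    channel 0-15, controller 0-127, value 0-127. CC1 is the
    standard mod wheel controller number.
    """
    for name, v in (("channel", channel if channel <= 15 else 128),
                    ("controller", controller), ("value", value)):
        if not 0 <= v <= 127:
            raise ValueError(f"{name} out of range")
    return bytes([0xB0 | channel, controller, value])

MOD_WHEEL = 1  # CC1: the source all four Micron modulators share

# Cranking the wheel to full, then pulling it back to zero,
# on channel 1 (channel index 0):
print(control_change(0, MOD_WHEEL, 127).hex())  # prints 'b0017f'
print(control_change(0, MOD_WHEEL, 0).hex())    # prints 'b00100'
```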
Recorded on August 11th, 2016
Who knew that GarageBand on OS X had a whole window tab dedicated to turning recorded software instrument parts into musical notation? You can even quantize and tweak notes, so your sloppy playing can look orderly. This is part of what I just printed to a PDF right from GB. Very handy!
Recorded on August 10th, 2016
Now that I have the Yamaha KX76 and the Alesis Micron on speaking terms, I need to be able to play one Micron voice from the KX76 and the other from the local keyboard of the Micron itself. The only practical way seems to be to set separate key ranges for Parts A and B in a Micron setup. Doing that is easy, but sometimes the zones overlap for the separate parts you want to play, so I also need to be able to transpose the notes I’m playing on the KX76 rather than play everything in the bottom octave of keys on that controller.
I spent more time than I’d like to admit before I broke down and looked at the KX76 manual. Thankfully, there was an example of how to set Toggle Switch 1 (TS1) to do just that. Intuitive it is not.
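The combination of key zones and transposition amounts to simple note routing. Here is a rough sketch of the idea, with made-up zone values for illustration, not the KX76's or the Micron's actual settings:

```python
def route_note(note, zones):
    """Route a MIDI note number to a part by key zone, with transpose.

    zones: list of (low, high, part_name, transpose) tuples; the
           values below are hypothetical, purely for illustration.
    Returns (part_name, transposed_note), or None if no zone matches
    or the transposed note would leave the 0-127 MIDI note range.
    """
    for low, high, part, transpose in zones:
        if low <= note <= high:
            shifted = note + transpose
            if 0 <= shifted <= 127:
                return (part, shifted)
    return None

# Example: Part A takes the bottom of the keyboard shifted up an
# octave (12 semitones), Part B takes the rest untransposed.
ZONES = [(36, 59, "Part A", 12), (60, 96, "Part B", 0)]
print(route_note(40, ZONES))  # prints ('Part A', 52)
print(route_note(72, ZONES))  # prints ('Part B', 72)
```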
Recorded on August 8th, 2016
Why have I not been using my MIDI controller all this time?