Cool! Nice sense of space. The panning and spatial placement are really effective in creating a flow from one element to another.
I’ve taken @futureaztec’s advice about stereo imaging to heart. I was actually using a Moogerfooger phaser to create pseudo mid/side processing.
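For anyone curious, mid/side is just sum-and-difference math under the hood. Here’s a minimal numpy sketch of the textbook encode/decode (an illustration only, not what the Moogerfooger chain is literally doing):

```python
import numpy as np

def ms_encode(left, right):
    """Convert L/R stereo to mid (sum) and side (difference) signals."""
    mid = (left + right) / 2.0
    side = (left - right) / 2.0
    return mid, side

def ms_decode(mid, side):
    """Convert mid/side back to L/R stereo."""
    return mid + side, mid - side

# Classic widening trick: boost the side signal before decoding.
left = np.array([1.0, 0.5])
right = np.array([0.5, 1.0])
mid, side = ms_encode(left, right)
wide_left, wide_right = ms_decode(mid, side * 1.5)
```

Putting a phaser on one path before recombining gets you a blurrier, frequency-dependent version of the same sum/difference idea.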
@stschoen Thanks! I was definitely trying to suggest a story with the song; that’s basically how I try to structure songs since I have given up on the pop song format. Ironically, I have been really enjoying super repetitive techno lately, so I should stick up for sterile repetition, which definitely has its place.
Really great tune!
…or a DAW.
I am interested to know how you are working, @robertsyrett. Personally, I feel drawn to the flow of having my machines and programs all clocked and ready for a transport cue. Then I record live. I haven’t gotten around to working in ‘song mode’ with the Elektron stuff. However, my dad gave me his mixer, so I am enjoying using Korg Gadget as a sort of backing band that carries some instrumentation through organized changes. Being able to sync with Ableton Link, then add that voice to the outboard mixer, is really fun to me.
As the Gadget DAW runs, I can then respond with my other gear. I don’t feel very competent at it, but it helps me get past the overthinking that can keep a song from getting done after two full days of messing around on an idea.
I want to stress that once more. In terms of Eurorack productivity, Gadget seems to complement it well in that you can work really fast and automate later. I think it just has that immediacy/limitation factor going for it.
Ableton Live was definitely used to add the piano and other overdubs, but the base tracks were the Digitakt controlling the Minilogue and Volca Sample, as well as playing back samples of those and other sources. I recorded it live until I got a take I thought I could edit properly, mostly focusing on changing patterns and muting parts, as well as occasionally tweaking knobs on the Moogerfooger or the “analog isolator” on the Volca Sample.
The Digitakt is my main songwriting tool, as I like to make many different versions of particular patterns and not worry about the linearity of the sequence. Theoretically, Ableton’s Session View could serve this purpose, but I’m getting pretty close to maxing out my Digitakt knowledge and the immediacy is there. I still have much to learn about p-locking external gear via MIDI, but so far so good.
I actually need to reintegrate Gadget (and the iPad generally) back into the flow of things. This is going to be a recording-intensive month (I hope), so maybe now is the time!
I am trying to think of a way to get Gadget to control Audulus using Taipei. I am thinking a workaround might be to run Audulus on the iPhone and send MIDI over Bluetooth from Gadget on the iPad to Audulus on the iPhone. Then I wouldn’t hit that brick wall with the ES-8 in/out conflict when you try to run Audulus and a DAW on the same device. Might try that later today.
Has anyone floated the idea that the guy who made AUM might be able to get rid of this conflict with some simple changes to the program, or is it a hairier problem? That is, why can’t AUM open all of the ES-8’s ins and outs at once?
Even Ableton Live, with its clip-based workflow in Session View, encourages a somewhat repetitive approach. It’s a much different way of recording than a traditional DAW like Logic or Reaper. Not that I’m objecting; it was Session View and the tight integration with Push that encouraged me to switch from Reaper. In many ways, today’s tools are perfect for the solo musician and allow one to create pieces that would have been very difficult to produce in the “old” analog days.

I’m really criticizing my own approach more than anything. It’s so easy to lay down 16 bars of percussion, some chords, and a melody that sometimes I don’t spend enough time trying to make a statement with my work. It’s an embarrassment of riches in some ways. Between the soft synths, hardware, traditional instruments, and DAW software, we have access to a palette of sounds that earlier composers only dreamed of. It’s up to us to use the tools to their best advantage. @robertsyrett’s piece was a good example of using his resources to full advantage. I have to admit I’m envious of his skill.
As far as the ES-8 iPad issue is concerned, it seems to me that the biggest challenge at this point is the lack of multi-channel support for Audulus in IAA or Audiobus mode. I have strongly encouraged @Taylor to consider adding multi-channel support to the AUv3 version of Audulus 4, both for macOS and iOS. When you run Audulus using IAA or Audiobus, Audulus doesn’t communicate directly with the audio interface. The host application is responsible for talking to the hardware, and Audulus only provides a single stereo pair in and out. Unfortunately, iOS doesn’t allow more than one audio interface at a time.

On macOS I can set the ES-8 as the system audio interface and run Audulus standalone, and at the same time run Ableton Live using my Focusrite interface. That way I have access to all of the ES-8 channels for Audulus while still using my primary audio interface for my DAW. I have two channels of the ES-8 plugged into the Focusrite so I can get audio from Audulus into the DAW. Complicated, but it works pretty well. It would be much simpler if I could just run Audulus as an AU in Ableton and use the ES-8 for everything. Even better if Audulus allowed you to choose which interface to use (at least on macOS).
I will give this issue some serious attention tomorrow, but I think you can route Gadget into Audulus and then record externally with a field recorder or the Disting.
I think sometimes it is hard to see why these things are more fundamental than other features, until you get yourself into a situation with your gear and you hit a wall. I hit this wall last October. There is still so much to love, but this is my main wall.
Just thinking about using Gadget as a piano roll/modulation tracker, bringing that information into Audulus as MIDI (via Taipei), then driving knobs and notes in Audulus patches with that Gadget MIDI data. I am about to try that using MIDI over Bluetooth.
Okay, that took five minutes. I really just guessed around with the routing and got lucky, I think. What is pretty amazing is that I can play a note in Taipei on the iPad and it goes over Bluetooth to my iPhone running AUM and Audulus, where I get a MIDI note and gate which I can then wire to the ES-8 module, which can then control the Eurorack.
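For reference, the note-and-gate conversion at the end of that chain is simple scaling. A sketch, assuming the common 1V/octave standard with 0V at middle C (your module’s calibration may differ, and these helper names are made up for illustration):

```python
def midi_note_to_volts(note, ref_note=60):
    """Map a MIDI note number to a 1V/octave control voltage.
    ref_note is the note that lands at 0V (here middle C, MIDI 60)."""
    return (note - ref_note) / 12.0

def gate_from_velocity(velocity, high=5.0):
    """Simple gate: any nonzero velocity opens the gate at `high` volts."""
    return high if velocity > 0 else 0.0

# A above middle C (MIDI 69) is 9 semitones up -> 0.75 V
print(midi_note_to_volts(69))  # 0.75
```

The ES-8’s DC-coupled outputs are what let a patch emit these steady voltages directly to the rack.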
So, essentially, it is a workaround to get Gadget to sequence a Eurorack by using two iOS devices + AUM + Audulus. I wonder if this is of any use to @stevo3985?
Is there a strategy to grab the CC# knobs and XY Pad coordinates to modulate Audulus?
No, it can’t be that simple. Sure enough, I twist a knob in Taipei and it controls the pitch bend in Audulus on my iPhone. Amazing. Is there a way to grab the bare knob response and send it out to the ES-8?
Yes, under nodes.
Just MIDI CC learn that knob, then move one in Taipei. Hook the knob up to the ES-8. Cordless modulation.
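The knob side is the same sort of scaling: a 7-bit CC value gets mapped into the knob’s range, then out to volts at the ES-8. A hypothetical sketch of that math (my own helper names, not Audulus’s actual internals):

```python
def cc_to_knob(cc_value, lo=0.0, hi=1.0):
    """Scale a 7-bit MIDI CC value (0-127) into a knob range [lo, hi]."""
    return lo + (cc_value / 127.0) * (hi - lo)

def knob_to_volts(knob, volts_range=5.0):
    """Map a 0-1 knob position to a 0-5V modulation signal for the ES-8."""
    return knob * volts_range

print(cc_to_knob(127))                 # 1.0 (knob fully clockwise)
print(knob_to_volts(cc_to_knob(64)))   # ~2.52 V at mid-travel
```

One thing 7-bit CC implies: you only get 128 steps across the knob’s travel, so slow sweeps into a filter cutoff can audibly zipper unless something smooths them downstream.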
Let’s make a new Topic.
Forgot about him. What a ride we are on.
…because it keeps seeming to get heavier.
Nice. Since you brought up Noisia Radio, this track is from a good friend of mine (and former roommate). Check it out:
He’s really talented and currently signed to Teebee’s label. Here’s another one he did a few years ago: