Thoughts on Overhauling MIDI in Audulus?
  • Hey there! Just looking to get your thoughts on how to overhaul the MIDI system in Audulus, beyond the obvious step of just allowing MIDI to pass out of Audulus. What are your ideal implementations and what would they look like?

  • You probably don’t need to duplicate the custom coding capabilities of tools such as MidiFire, but it would be nice to build fancy MIDI LFOs, sequencers, etc. High-resolution MIDI as an option. Definitely MPE support. On iOS, the ability to select arbitrary MIDI in and out sources. Possibly even the ability to run as a MIDI AUv3 similar to Rozeta, but I must admit I’m not sure when I would choose that over audio AU mode.

    If you have MIDI learned controls, definitely send out the current CC values when changing patches, to allow knob positions of external controllers to stay in sync. See Model 15 for a great example. In general, Model 15’s MIDI implementation is top notch.
  • @orand - the basic idea of MIDI out would be that any CV within Audulus that you throw at it would be quantized into the 0-127 steps - so no need to create separate MIDI-specific LFOs (see the sketch at the end of this post).

    Audulus will be AUv3, and have MIDI in and out for that as well, so you could actually use it as a MIDI effect, like an arpeggiator or something.
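
    A minimal sketch of that CV-to-CC quantization, assuming a normalized 0-1 modulation signal; the function name is only for illustration:

    ```typescript
    // Quantize a normalized CV value (assumed 0..1) into a 7-bit MIDI CC value (0..127).
    function cvToCC(cv: number): number {
      const clamped = Math.min(1, Math.max(0, cv)); // guard against out-of-range CV
      return Math.round(clamped * 127);
    }

    // Any LFO built from ordinary Audulus nodes becomes a stream of CC values.
    console.log(cvToCC(0.5)); // -> 64
    ```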
  • I would like to see more routing possibilities for MIDI in as well as MIDI out. My Digitakt can send 8 MIDI CC signals in addition to the typical note, gate, pitch-bend, etc., and I think it would be fantastic to parameter-lock a ton of stuff in Audulus via the Elektron sequencer.

    Also, I hope Audulus can send MIDI CC per channel as well as program changes.
  • I've been thinking about this myself recently, so I'm glad you started this thread. We already have MIDI CC input pretty well covered with the knob node, although I would like to see support for high-resolution data both in and out. MIDI nodes should all be channel-aware, and it would be nice to display/change which CC message/channel a knob is programmed for (perhaps in the context menu). I would also love to have CC support as an alternative to MIDI notes for triggers/toggles, for instances where you're dealing with an on/off signal rather than a range of values. I think we should have a MIDI input node that receives PC messages, one for system messages, and one that receives MIDI clock(s).

    The keyboard node should be updated to support MPE and polyphonic aftertouch (you could use the existing poly approach to pitch bend etc. when in MPE mode and for aftertouch). For output, I think it would be most useful to have separate nodes for MIDI note out, CC out, PC out, SysEx out, clock(s), etc. Regardless of the details, it would be great to be able to send and receive the entire set of MIDI messages.
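
    To make those message families concrete, here is a rough sketch of the byte layout behind the proposed channel-aware output nodes (per the MIDI 1.0 spec; the function names are hypothetical), including 14-bit high-resolution CC:

    ```typescript
    // Channel is 0..15 internally (usually shown to users as 1..16).
    type MidiBytes = number[];

    const noteOn = (ch: number, note: number, vel: number): MidiBytes =>
      [0x90 | ch, note & 0x7f, vel & 0x7f];

    const controlChange = (ch: number, cc: number, value: number): MidiBytes =>
      [0xb0 | ch, cc & 0x7f, value & 0x7f];

    const programChange = (ch: number, program: number): MidiBytes =>
      [0xc0 | ch, program & 0x7f]; // PC messages carry only two bytes

    const clockTick: MidiBytes = [0xf8]; // system real-time, no channel

    // High-resolution (14-bit) CC: MSB on cc, LSB on cc + 32 (defined for CC 0..31).
    function highResCC(ch: number, cc: number, value14: number): MidiBytes[] {
      const v = Math.min(16383, Math.max(0, Math.round(value14)));
      return [controlChange(ch, cc, v >> 7), controlChange(ch, cc + 32, v & 0x7f)];
    }
    ```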
  • @stschoen - so MIDI learn associated with the knob's context menu is immediate and practical, but I would like to take the incoming signals from the Elektron sequencer and use them as signals, and having ghost-controlled knobs as signal sources is visually confusing. Knobs are already overloaded with functionality; I think there is room for a more comprehensive "keyboard" node.

    Another thing which would be interesting to see would be a MIDI transport control node, to send start/pause, stop, fast-forward, etc.
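
    For reference, a small sketch of what such a node would emit: start/continue/stop are single-byte system real-time messages, while fast-forward would (as far as I know) have to go out as MIDI Machine Control SysEx. The byte constants below are assumptions based on the MIDI 1.0/MMC specs:

    ```typescript
    // System real-time transport bytes: single bytes, no channel.
    const START = 0xfa;    // start playback from the top
    const CONTINUE = 0xfb; // resume from the current position (pause/continue)
    const STOP = 0xfc;

    // Fast-forward is not a real-time message; it lives in MIDI Machine Control
    // (MMC) SysEx: F0 7F <device-id> 06 <command> F7, where command 0x04 = fast-forward
    // and device-id 0x7F addresses all devices.
    const MMC_FAST_FORWARD = [0xf0, 0x7f, 0x7f, 0x06, 0x04, 0xf7];
    ```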
  • You make a good point. Maybe we need a dedicated CC input node as well as the knob. I’m hoping we will be able to send/receive all MIDI including transport functions.
  • I'm so excited to see this thread.

    There are lots of great ideas here.

    To echo (and build on) RobertSyrett's comment about routing:
    1) A general settings menu with a tab for MIDI-related stuff (think the MIDI tab in Ableton with options for ins/outs, clock preferences, etc., or the MIDI menus in Animoog or Model 15) would be the next logical step, in my opinion. I'm currently setting up my Bluetooth and wired MIDI controllers in other apps like Animoog, then switching over to Audulus with my fingers crossed, hoping things work. That's no fun!

    2) I would also love to see some sort of framework that would allow us to develop modules that manipulate MIDI: scale, transpose, quantize, reassign (e.g. "send CC1 data from 0-64 to channel 1; send CC1 data from 65-127 to channel 2"), strip (e.g. "strip all CC data from a stream", "strip all aftertouch data from a stream", etc.) - see the sketch after this list. You mention above that "any CV within Audulus that you throw at it would be quantized into the 0-127 steps - so no need to create separate MIDI-specific LFOs," so it sounds like you're already thinking along these lines.

    3) Though not technically MIDI, Ableton Link is a must. zMors has a handy Link module that I use all the time, if for no other reason than to send my ES-8 a pulse that's synced to everything my Link-enabled iOS sequencers are doing. I think I saw that this is already slated for Audulus implementation, though...
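
    As a rough sketch of the reassign/strip transformations mentioned in item 2 (working on raw status/data bytes; the message type and function names are just for illustration):

    ```typescript
    interface MidiMessage { status: number; data1: number; data2: number }

    // Reassign: route CC1 to channel 1 or 2 depending on its value, per the
    // example above (channels are 0-based here).
    function reassignCC1(msg: MidiMessage): MidiMessage {
      const isCC = (msg.status & 0xf0) === 0xb0;
      if (isCC && msg.data1 === 1) {
        const channel = msg.data2 <= 64 ? 0 : 1;
        return { ...msg, status: 0xb0 | channel };
      }
      return msg;
    }

    // Strip: drop every CC message from a stream, pass everything else through.
    const stripAllCC = (stream: MidiMessage[]): MidiMessage[] =>
      stream.filter(m => (m.status & 0xf0) !== 0xb0);
    ```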

  • If you do Ableton Link, definitely add their new Start Stop Sync!
  • I've been coding a bit of MIDI stuff myself lately, in JavaScript of all languages, but anyhow ... it got me thinking about Audulus and how we don’t really have the concept of “data” versus “signal” like in most other environments.

    Which is really cool, because it allows the Audulus way of plugging anything into anything and it’s all good. That’s the real joy of the platform to me.

    But then it got me thinking about how, at the end of the day, a MIDI message is just 3 unsigned integers ... writing lots of code to flip those around and parse them and stuff got me thinking, “how cool would it be to be able to write patches that deal with raw MIDI data like this in Audulus?”

    Maybe when it comes to modeling MIDI transformations in Audulus, something like an enhancement to the polyphony model would do the trick? For instance, if I could have cable-in-cable action - so if I could bundle an arbitrary number of quads inside a poly, then I could model MIDI messages as a single quad cable (byte1, byte2, byte3, and an extra?), do whatever I want with a little math, bust it back out through something like a quad-to-MIDI node, and then pipe it out an interface, or maybe INTO a keyboard node to get MIDI-to-Hz.
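
    Purely as a sketch of that quad idea (the “quad” layout and node names here are made up): once a message is a small tuple of bytes, the transformations really are just integer math.

    ```typescript
    // A channel-voice message modeled as the imagined quad:
    // [byte1 (status), byte2, byte3, spare].
    type Quad = [number, number, number, number];

    const midiToQuad = (bytes: number[]): Quad =>
      [bytes[0] ?? 0, bytes[1] ?? 0, bytes[2] ?? 0, 0];

    // Splitting the status byte into message type and channel is plain math.
    const messageType = (q: Quad): number => q[0] & 0xf0; // 0x90 = note on, 0xb0 = CC, ...
    const channel = (q: Quad): number => q[0] & 0x0f;     // 0..15

    // Example transformation: transpose note-on/off up an octave, then hand the
    // bytes back to a hypothetical quad-to-MIDI node (or a keyboard node for Hz).
    function transposeOctave(q: Quad): Quad {
      const type = messageType(q);
      if (type === 0x90 || type === 0x80) {
        return [q[0], Math.min(127, q[1] + 12), q[2], q[3]];
      }
      return q;
    }
    ```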
  • I noticed yesterday that the Trigger node toggle[ ] function doesn’t work with a MIDI note assigned to the Trigger. If the toggle is on, it works as expected with the mouse, but if a MIDI note-on is sent the trigger is switched on, and if a note-off is sent the trigger is switched off, so even with toggle on the trigger doesn’t latch over MIDI. I don’t know if this is the expected behavior, since the docs don’t mention toggle[ ]. I think it would be better if, when toggle is enabled, the trigger switched on with the first note-on, switched off with the second note-on, and ignored note-off, so that it would be consistent with the mouse (finger).

    Also, as I have mentioned previously, it would be great if triggers and knobs captured the MIDI channel. I have a Korg nanoPAD which works pretty well with triggers, unless I want to use a keyboard. Since the triggers respond to any channel, I have to pick notes for the triggers that I won’t play on the keyboard. I know I can use notes outside the keyboard’s normal range, but then I have to reconfigure the nanoPAD. It would be much easier if I could set the Korg to channel 2, the keyboard to channel 1, and have Audulus separate them.
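
    The latching behavior being asked for is a small state machine; a sketch, assuming a note-on with velocity 0 is treated as a note-off (as many devices send it):

    ```typescript
    // Latching toggle driven by MIDI notes: flip on each note-on, ignore note-off.
    class MidiToggle {
      private state = false;

      handle(status: number, note: number, velocity: number, assignedNote: number): boolean {
        const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0; // real note-on only
        if (isNoteOn && note === assignedNote) {
          this.state = !this.state; // first note-on switches on, second switches off
        }
        return this.state; // note-offs (and velocity-0 note-ons) leave the state alone
      }
    }
    ```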
  • @stschoen, I have been using different channels for triggers and just use the gate output from the MIDI keyboard node.
  • If you set the controller for the triggers to a different channel than the keyboard, the triggers won't interfere with the keyboard node; however, if you have a note assigned to a trigger and play it on the keyboard, you will activate the trigger. I usually use very high notes (C#-8 and up are pretty safe), but that means reconfiguring the nanoPAD. It would be preferable if I could just set a MIDI channel on the nanoPAD that is different from the keyboard's and have the triggers (and knobs) respond to just that channel.
  • Oh I agree! We definitely could use more flexibility; I was just proposing a stop-gap workaround. But thinking about it now, my solution might be more practical on the Digitakt than the nanoPAD.
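
    The per-channel filtering this exchange is asking for is cheap to express, since the low nibble of the status byte carries the channel; a sketch (function name hypothetical):

    ```typescript
    // Channel-aware trigger: only react to note-ons on the configured channel
    // (0-based), so a pad controller on channel 2 can't collide with a keyboard
    // on channel 1.
    function matchesTrigger(status: number, note: number,
                            triggerNote: number, triggerChannel: number): boolean {
      const isNoteOn = (status & 0xf0) === 0x90;
      return isNoteOn && (status & 0x0f) === triggerChannel && note === triggerNote;
    }
    ```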
  • Would be nice if trigger nodes could work with other MIDI CC too (not only keys).
  • +1 on triggers using CC as well as notes
  • I've been playing around with touchOSC, and it has a feature that allows onscreen knobs to receive MIDI as well as send it. If Audulus sent the MIDI CC from knobs configured for MIDI whenever they're moved, touchOSC and Audulus controls could stay synced. A knob that could send CC data would be one way to implement MIDI CC output. I would still like to see a complete MIDI interface, but if you could configure a knob to send a specific CC (preferably low- or high-res) on a specific channel, coupled with a note/velocity out node and pitch-bend out, you would cover a lot of common applications.
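
    A sketch of that knob-as-CC-sender idea, with optional 14-bit output; the class name is made up and `send` stands in for whatever MIDI output path Audulus ends up exposing:

    ```typescript
    // A MIDI-mapped knob that also transmits: whenever its value changes (from the
    // UI or from within a patch), it re-sends the mapped CC so an external surface
    // such as touchOSC can stay in sync.
    class CCKnob {
      constructor(
        private channel: number,             // 0..15
        private cc: number,                  // 0..31 if highRes, else 0..127
        private highRes: boolean,
        private send: (bytes: number[]) => void,
      ) {}

      setValue(normalized: number): void {
        const v = Math.min(1, Math.max(0, normalized));
        if (this.highRes) {
          const v14 = Math.round(v * 16383);
          this.send([0xb0 | this.channel, this.cc, v14 >> 7]);        // MSB
          this.send([0xb0 | this.channel, this.cc + 32, v14 & 0x7f]); // LSB
        } else {
          this.send([0xb0 | this.channel, this.cc, Math.round(v * 127)]);
        }
      }
    }
    ```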