Video Synthesis

So this unit here is a bit out of my range and out of stock. I’ve had some ideas for a while on how to integrate voltage-controlled video into performance. Most of my tracks are recorded while I perform them, and I just got a super cheap webcam, so I was thinking about how I might be able to do something internally.

It led me to wonder about internal CV. So we can obviously send CV out of an ES-8. But can we take CV out of Audulus and into the background of the computer to be worked with?

It seems to me, as I keep moving forward, that once Audulus is the home base of your production setup, one way to think about a small complementary rack is to outsource CPU-heavy tasks in order to keep Audulus free. So I am arranging my 54hp to leave the most room possible for exploring synthesis.

I have thought for a while about a small projector to host the signal, but I would also need a place to work with it.

I like the idea that I could take a webcam signal and ‘treat’ it somehow with live CV internally. There are Audulus’ modulation signals already present, on the one hand, and a USB camera providing video data, on the other.

There are also workarounds with MIDI, using Audulus with Korg Gadget. So I just want to start a practical performance conversation on synced video manipulation/synthesis.

By “place to work with” video, again, it is also a CPU/performance/reliability concern. This little guy, also from LZX Industries, caught my eye.

Again, the idea here isn’t so much about the products, but what can be done within the box.

In general, passing signals from Audulus to other programs isn’t a problem. I’m not sure how you would use the signals to modulate video from the webcam. One possible approach would be Jitter, the part of Max/MSP that supports video, but that would require a Mac and the software. I’m not very familiar with hardware options.
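
For the “treat webcam video with a signal” idea on a laptop, here’s a minimal sketch of the concept in Python, assuming OpenCV and NumPy are installed; the sine LFO is a stand-in for whatever CV you’d actually route out of Audulus:

```python
# Sketch: "treat" a live webcam feed with an internally generated signal.
# Assumes OpenCV (pip install opencv-python) and NumPy. The LFO below
# stands in for a CV stream coming from Audulus.
import time
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # default webcam
t0 = time.time()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # 0.1 Hz sine LFO, scaled to 0..1 like an Audulus modulation signal
    cv_sig = 0.5 + 0.5 * np.sin(2 * np.pi * 0.1 * (time.time() - t0))
    # Use the "CV" to scale brightness; convertScaleAbs clamps to 0..255
    treated = cv2.convertScaleAbs(frame, alpha=0.5 + cv_sig, beta=0)
    cv2.imshow("treated", treated)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```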

I think it’s pretty interesting when you start to think about breaking out the CV from Audulus internally. I mean, does AUM or something like it ‘host CV’? Should I be looking for some kind of video software/app that could take that output and modulate effects? Practically, how could I take a patch and steal some gate and CV, then use them to process a live video signal so that I could send it out to a projector or video screen for live visuals?

To the casual observer, sure, there are obvious routes. But the internal idea seems novel – or, at least, if it can be done, someone enlighten me.

I want to add some context here. I think this line of rave décor/visuals software control development needs to be present. Not as the exact route, but as an old vine that should be considered…

It’s interesting to think of CV as a standard for controlling everything, then to start imagining various ways Audulus could interface – but also how signals could be brought in as CV (the crowd out front affecting the synths via a signal derived from a video camera, grabbing information from that moving image to feed into Audulus, for example).
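
As a rough illustration of that camera-to-CV direction (a sketch only, again assuming OpenCV): frame-to-frame difference gives a crude ‘crowd motion’ value, normalized to 0..1, that could be routed into Audulus by whatever path you settle on:

```python
# Sketch: derive a control value from a camera. Frame-to-frame difference
# gives a rough "motion" amount, normalized to 0..1. Assumes OpenCV.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Mean absolute difference between successive frames, 0..1
    motion = float(np.mean(cv2.absdiff(gray, prev))) / 255.0
    prev = gray
    print(f"motion CV: {motion:.3f}")  # route this out however you like
```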

Further, incorporating CV into performance tools is happening in parallel…

I’ve done some testing with AUM and it passes “CV” signals just fine. Remember that from the perspective of iOS there is no difference between CV and audio. They’re both just a stream of numbers.
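
To make that concrete, here’s a tiny sketch (assuming NumPy and SciPy) that renders a 2 Hz ‘CV’ LFO into an ordinary mono WAV file, exactly the way you’d render audio:

```python
# A "CV" signal is just samples: render a 2 Hz LFO into a mono WAV
# exactly as you would any audio. Assumes NumPy and SciPy.
import numpy as np
from scipy.io import wavfile

sr = 48000                       # same sample rate as audio
t = np.arange(sr * 5) / sr       # 5 seconds
lfo = np.sin(2 * np.pi * 2 * t)  # 2 Hz -- far below the audible range
wavfile.write("cv_lfo.wav", sr, (lfo * 32767).astype(np.int16))
```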


BTW the ring is pretty cool.

Do you know if there is a CV-to-MIDI app, or something else that could move that CV into a domain of signals that iOS could make use of?
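
For what it’s worth, the mapping itself is simple to sketch in Python with the mido library, where the port name and CC number are assumptions: a 0..1 CV sample quantized to 7 bits and sent as a control change:

```python
# Sketch of CV-to-MIDI with mido. The output port name is an assumption
# (an IAC bus on macOS); the CC number is arbitrary.
import mido

def cv_to_cc(cv, control=1):
    """Quantize a 0..1 CV sample to a 7-bit MIDI control change."""
    value = max(0, min(127, round(cv * 127)))
    return mido.Message('control_change', control=control, value=value)

with mido.open_output('IAC Driver Bus 1') as port:
    port.send(cv_to_cc(0.75))  # sends CC1 = 95
```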


I was trying to get Lumen to respond to MIDI CCs from Korg Gadget’s MIDI out module, which is also controlling Audulus remotely. I can’t get MIDI over Bluetooth working between Lumen on the MacBook and Gadget on the iPad yet.


I am using the free app. I feel like I am one Bluetooth MIDI configuration away. I need something like AUM for macOS – or something like Lumen for iOS.

If you need a MIDI tool for macOS, I use an app called MidiPipe:
http://www.subtlesoft.square7.net/MidiPipe.html
It’s freeware and lets you do just about anything with MIDI on macOS. I use it to route MIDI from my UltraNova to the Behringer synths when I’m using a DAW. It can change MIDI channels, filter MIDI messages, delay, script, etc. It’s a very flexible tool. You create pipes that chain together MIDI operations to take an input and convert it into an output.
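
MidiPipe itself is a GUI app, but the ‘pipe’ idea it implements is easy to sketch in Python with mido (the port names here are assumptions): read from one port, filter and remap, write to another.

```python
# A MidiPipe-style "pipe" sketched with the mido library: read from one
# port, drop clock messages, force everything onto channel 1, and send
# it on. Port names are assumptions.
import mido

with mido.open_input('UltraNova') as inp, \
     mido.open_output('Behringer') as out:
    for msg in inp:
        if msg.type == 'clock':
            continue                   # filter: drop MIDI clock
        if hasattr(msg, 'channel'):
            msg = msg.copy(channel=0)  # remap to channel 1 (0-indexed)
        out.send(msg)
```
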
Another tool I find useful is Studiomux/MIDImux. It uses a Lightning connector between the iPad and the Mac to send MIDI and audio between the devices.
https://zerodebug.com/studiomux
You can buy just the MIDImux part, but I find the audio routing useful as well. The latency is pretty good and the hardwired connection makes it pretty reliable.


It is quickly spilling out into the familiar computer mess. One of the issues is that both programs are a bit blind, so you can’t trace signals visually very well. In my experience I could spend all day “troubleshooting,” but I am going to let it sit for a while.

With MIDImux, one thing I missed was that I needed the “server” installed on the MacBook. That detail is easy to overlook on the website.

Walking away for a bit :no_mouth:

On a side note, I find Lumen a bit unexciting. I’m wondering if there is an alternative video synthesis simulator, or what other approaches there might be. I can mix between a built-in cam and a USB cam, and modulate some colour, which is interesting…

I was looking at what might be going on in VCV Rack. Found this on Reddit:

"VCV Rack’s engine tops out at 192 kHz, which is way too low to send significant amounts of video signal without compression. Component video of the kind of stuff you could find in an LZX module requires at the very least 1 MHz: an RS-170 signal follows the NTSC convention of a ~15 kHz horizontal line rate, so you’d have something like 12.8 samples to represent one whole line in Rack (192/15).

Because it’s all emulated, you could “cheat”, ignore real-life constraints, and get up to 32 bits of data per tick; at 192 kHz that’s 6.144 Mbps. Definitely enough to stream good-quality video in a standard format like H.264. But your signal is now pure noise if you look at it in the frequency domain, so while you could make video modules interoperate, you’re basically moving away from the neat thing about modular synthesis, which is that everything is a signal and you can process it with everything else."
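
For what it’s worth, the arithmetic in that quote checks out:

```python
# Checking the quote's arithmetic:
sample_rate = 192_000           # Rack's maximum engine rate, Hz
line_rate = 15_000              # NTSC line rate (nominally ~15.734 kHz)
print(sample_rate / line_rate)  # 12.8 samples per scan line

bits_per_tick = 32
print(sample_rate * bits_per_tick / 1e6)  # 6.144 Mbps if you "cheat"
```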

Yeah, the server provides the “bridge” between the iOS app and macOS. It’s a bit confusing to set up, but works pretty well for me. MidiPipe has a module called List that will display MIDI messages. Very handy for seeing what’s going on while you set things up. You can stick it in the chain at any point and monitor the message traffic.
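
The same monitoring idea also works in script form; a minimal mido sketch (the port name is an assumption):

```python
# Print every message arriving on an input port (mido; port name is
# an assumption) -- a scriptable equivalent of MidiPipe's List module.
import mido

with mido.open_input('IAC Driver Bus 1') as port:
    for msg in port:
        print(msg)  # e.g. control_change channel=0 control=1 value=95
```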

I teased out two problems.

  1. a) I downloaded MIDI Monitor and could read the MIDI info a bit more easily. Then I grabbed a hardware MIDI controller (it mapped to Ableton).

    b) I don’t ‘see’ MIDI data coming from Korg Gadget over USB. I can see my hardware MIDI controller, but not Gadget over USB.

  2. Because the hardware controller maps to Ableton reliably, I tried to map it to Lumen. It didn’t work.

So I have an issue getting Korg Gadget to send MIDI over USB, and I have an issue with Lumen itself.

One nice thing, though: I was able to map transport functions from the test controller into my iPad through the MacBook. So I can control Korg Gadget with hardware that is not directly connected. Nice. It will start and stop, and control mutes and knobs in Ableton and in Gadget simultaneously.

Looks like Lumen has a free demo. I might play around with it if I get a chance. From what I’ve read you should be able to control it with MIDI, but often “the devil is in the details”.

Did your hardware controller appear as a choice in the Lumen MIDI source selection window? Also, are you using MIDImux to connect the iPad to the Mac?

Lumen seems a bit unremarkable. I don’t have Max. Video synthesis in one sense is an open question in terms of what parameters could be broken out for modulation or complex algorithmic manipulation…

Yes, my controller appeared as a choice but would not map. I wonder if Lumen won’t map in the demo version. Yes, I was using MIDImux between Gadget on the iPad and my MacBook. Neither “watch” app produced MIDI data for the Gadget > MacBook link through MIDImux.

TouchViZ is an iOS app that can use your device’s camera and movies along with various video effects, all of which can be controlled via MIDI or OSC. It supports external screens and projectors.
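
If you go the OSC route, sending control data from a laptop is only a few lines. A sketch with the python-osc package, where the iPad’s IP, port, and the OSC address are all assumptions (TouchViZ’s actual address space isn’t documented here):

```python
# Sending OSC to an iOS app from a laptop with python-osc. The IP,
# port, and OSC address are all hypothetical placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('192.168.1.50', 8000)  # iPad's IP and port
client.send_message('/effect/1/amount', 0.5)    # hypothetical address
```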


I was able to successfully send MIDI from Gadget running on my iPad (using Taipei) via MIDImux to my iMac and then on to my UltraNova. I tested both MIDI notes and CCs. I haven’t fooled with Lumen as yet.


I am happy you own Gadget. For some reason I think it’s one of the best products as far as getting to finished tracks. Then, with Taipei, you can sequence hardware, Audulus, whatever.

I actually bought Gadget based on posts from you and @robertsyrett . I waited a while to see if it would go on sale, but I finally decided to bite the bullet and pay full price. The sequencer and the Taipei gadget were the deciding factors. I have only had it for a few days and haven’t had a chance to explore it fully, but it seems to be pretty well thought out, with lots of nice-sounding virtual synths. I’m hoping they will add AUv3 support at some point; that would really be icing on the cake. I was never very happy with BM3 as an iOS DAW. It’s pretty flexible, but not very well thought out from a UI perspective. I lucked out and got it for free; otherwise I don’t think I would have ever bought it.
