Video Synthesis

There's a side to having a closed system that becomes productive because you are, in a sense, stuck in one giant synthesizer. It's digital, but that is yet another nice limitation. The fact that it has Ableton Link is also fun because, production-wise, all transport and BPM info is synced automatically, so you can start in Gadget or add it to other gear late in building a track. You are not just adding a synth; you are actually working with a speedy mini-DAW. I have been using the send on my mixer to pass Gadget through a compressor and fuzz pedal, which can give it the punch and character of the Elektron Analog Heat.

I was thinking it would be nice to be able to use other AUv3 plug-ins within Gadget. I have a few already and of course I’m looking forward to Audulus V4. The ability to host plug-ins would make Gadget a pretty decent iOS DAW.

1 Like

I have been using Gadget to augment a DAWless setup. In this style, the daw becomes a slave to a generative patch, hooked up to hardware. Once I hit record, I scramble to grab instruments and turn knobs. With Gadget in the background it can introduce changes. This frees me up to play a stringed instrument live. But then I have Ableton at the end just recording audio. If the audio recording could use some more detail, it is easy to just add some loop clips. This isn’t to say what I am doing isn’t flawed. The main thing is it is fun and I feel like there are some bright moments in every so so track.

1 Like

TouchViZ also looks a bit underwhelming for the price. This one has Ableton Link and seems authentic.

The K Machine looks pretty sophisticated. The documentation was interesting. It would take a bit of work to really figure it all out. Seems reasonably priced. No updates since 2018, but pretty regular before that.

1 Like

Hey dude, have you considered the OP-Z with the CV I/O module for your use cases? It occurs to me that this may also be in the realm of 'not in price range', but I have both of the items mentioned, and even though the two together came to $750, I love the Z workflow.

Its ability to integrate with A3, Gadget, and modular hardware, and also work with video using the VideoLab studio stuff you can find on GitHub, makes it one of my favorite pieces of gear out of everything I own, software and hardware both considered.

Some devs have done some REALLY dope stuff with the SDK, which makes it possible to link video, CV, MIDI, and other sequencer actions, with live performance in mind. I don't know if this is exactly what you are looking for, but it stayed firmly planted in my mind as I read this whole thread. Also, Happy New Year, btw! :smiley:

2 Likes

Yes, happy New Year. The OP-Z duplicates too many things I can already do. I think just slowly finding my way into some unique ways to modulate video parameters without it becoming a gear thing is the road I am on. I have been able to not buy gear for a while now. Audulus saved me. I feel so grateful for what I have and I do not feel incomplete — just taking my time.

I have a nice little modular synth rack but I sold the key module. So I have a hole in the middle of my rack. I might try to put a complex oscillator in there but I am not sure.

2 Likes

@futureaztec (Edited in an effort to condense my replies to this thread)

Ok, yeah I get that. I just figured I would mention it cuz I have it, and it is the only thing I am currently aware of that has the capability. No sense buying a repeat, though, unless you planned to replace something old with the new thing. It sounds like you are doing good. Lemme know what you find if you do figure out a solution, and I will do the same! :slightly_smiling_face:

Also, this is not a solution, but a neat explanation of how video and audio transmission works over broadcast, which I found while searching. More about it here.

I have been thinking about this all day since I read the thread yesterday. It seems to me that if you wanted to do some sort of video-based integration in your projects, you would probably need to figure out a way to send multiple video inputs (I would think stock-footage clips of neat-looking streams, looping) inbound on different channels.

From there, you would probably want to send it into a multiplexer, and have a modulation signal changing the stream source, based on random or planned signal events. This would then change the ‘channel’ at the output, and provide a switching effect that could be coordinated with your performance.
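To make the multiplexer idea concrete, here is a minimal Swift sketch of just the switching logic, assuming you have some way to represent the looping sources and a control signal in the 0 to 1 range. `VideoSource`, `VideoMultiplexer`, and the clip names are all made up for illustration, not any particular app's API:

```swift
import Foundation

// A minimal sketch of the multiplexer idea: N looping video sources,
// with a modulation signal picking which one reaches the output.
// `VideoSource` is a stand-in for whatever your video app actually exposes.
struct VideoSource {
    let name: String
}

struct VideoMultiplexer {
    let sources: [VideoSource]

    // Map a control signal in 0.0...1.0 (an LFO, an envelope, a random
    // event) onto a source index, i.e. a "channel change" at the output.
    func select(signal: Double) -> VideoSource {
        let clamped = min(max(signal, 0.0), 1.0)
        let index = min(Int(clamped * Double(sources.count)), sources.count - 1)
        return sources[index]
    }
}

// Usage: three looping stock clips, switched by a random signal event.
let mux = VideoMultiplexer(sources: [
    VideoSource(name: "stream-loop-1"),
    VideoSource(name: "stream-loop-2"),
    VideoSource(name: "stream-loop-3"),
])
let active = mux.select(signal: Double.random(in: 0...1))
print("now showing: \(active.name)")
```

Driving `signal` from a slow LFO or a trigger sequence would give the planned or random channel changes described above.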

I bet there is a way to do this with A3 and some other video application. I may be stating the obvious here, but I wanted to share what came to me just a few minutes ago as an "Aha!" type of realization lol. I hope this helps a bit in your consideration. :thinking:

2 Likes

Edit: I forgot that your original post was regarding two hardware pieces from LZX, but this one might be newer, as it mentions being a successor to the ‘Vidiot’. Anyway, you may already have seen this, but I wanted to post as it felt relevant to the discussion from the other day:

You aren’t the only one thinking about video synthesis, it seems. Check out this new rack mount component from LZX! Way over budget, but still dope AF!

1 Like

I have both K Machine and TouchViZ; they're like apples and oranges when it comes to their approaches and results. K Machine relies on vertex shaders with variables that react to the sound input. This means using the shaders provided or creating your own, which does have a learning curve.
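For anyone wondering what "variables that react to the sound input" means in practice: the host app typically measures the audio each frame and hands the result to the shader as a uniform. Here is a minimal Swift sketch of that host-side handoff, with names I made up for illustration (not K Machine's actual API):

```swift
import Foundation

// Sketch of the host side of a sound-reactive shader: each frame the app
// measures the audio input and publishes the result to the shader as a
// variable (a "uniform"). The names are illustrative, not K Machine's API.
struct ShaderUniforms {
    var time: Float       // seconds since the animation started
    var soundLevel: Float // smoothed input level, 0...1
}

func nextUniforms(time: Float, inputPeak: Float, previousLevel: Float) -> ShaderUniforms {
    // Light one-pole smoothing so the visuals don't flicker on every transient.
    let smoothed = previousLevel + 0.2 * (inputPeak - previousLevel)
    return ShaderUniforms(time: time, soundLevel: min(max(smoothed, 0), 1))
}

// Usage: a render loop would upload this struct to the GPU each frame;
// the shader then reads `soundLevel` to scale, rotate, or color geometry.
var level: Float = 0
let frame = nextUniforms(time: 1.25, inputPeak: 0.6, previousLevel: level)
level = frame.soundLevel
print("soundLevel uniform this frame: \(frame.soundLevel)")
```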

TouchViZ allows you to combine two video clips at a time along with video effects, which can be controlled via MIDI. It would have been nice if they'd added some more parameters. It doesn't have any audio-reactive component and relies on you controlling the app in a way that creates a relationship between the sound and video. There are apps that can analyze audio input and turn it into MIDI output, which can then be used to control TouchViZ (e.g. FAC Envolver). You can also use MIDI-controlled timing changes related to the music to create these effects.
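For a rough sense of what that audio-to-MIDI step is doing (the general idea only, not FAC Envolver's actual implementation), here is a short Swift sketch of an envelope follower mapped onto a MIDI CC; `sendCC` is a hypothetical placeholder for a real MIDI output:

```swift
import Foundation

// Rough sketch of an envelope follower feeding a MIDI CC: the general idea
// behind audio-to-MIDI control, not FAC Envolver's actual implementation.
// `sendCC` stands in for whatever MIDI output you use (e.g. CoreMIDI).
final class EnvelopeToCC {
    private var envelope: Double = 0
    private let attack = 0.3   // smoothing per buffer, 0...1: fast upward
    private let release = 0.05 // slow downward, so decays feel musical

    func process(buffer: [Float], sendCC: (UInt8) -> Void) {
        // Peak level of this audio buffer.
        let peak = Double(buffer.map { abs($0) }.max() ?? 0)
        // One-pole smoothing with separate attack and release.
        let coeff = peak > envelope ? attack : release
        envelope += coeff * (peak - envelope)
        // Scale 0.0...1.0 onto the 0...127 MIDI CC range.
        sendCC(UInt8(min(envelope, 1.0) * 127))
    }
}

// Usage: feed it audio buffers, route the CC to a TouchViZ effect parameter.
let follower = EnvelopeToCC()
follower.process(buffer: [0.1, -0.4, 0.25]) { value in
    print("CC value: \(value)") // replace with a real CoreMIDI send
}
```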

The developer of SunVox has worked in this area and hopefully at some point he or people using his Pixilang source code will be able to develop more robust VJay apps.

Takete only supports external hardware MIDI, so I wouldn't be able to sequence it from other MIDI apps.

You can use video clips from K Machine as fodder for TouchViZ.

At this point I haven't found a VJay or music-video creation app that meets all of my needs. I prefer to cobble together several apps to create the results I'm after. The state of music video apps is similar to the early days of iOS music apps: there are a lot of holes.

Defining your expectations and use case for your look and sound will help to clarify which tools might be useful. Your set of tools is much wider if you just want to create music videos. The live, spontaneous side of VJaying is much more difficult to pull off, and the tools for it are scarcer.

4 Likes

@futureaztec I stumbled upon an open source video matrix mixer DIY module by way of Modular Grid today. I immediately thought of this thread, and I think you might be interested in checking it out. Also, with your skills in the 'build it yourself' way of doing things, this is the perfect project/price point/solution, imho. I hope you will like what I found. Here is the link! :slightly_smiling_face:

1 Like

This is certainly an avenue I could head down. It could become the center of my 54 hp case. At the same time, I find the footage of it less exciting; adjusting RGB settings seems a bit dull. Contextually, I would be spending quite a bit of money, then time, and then I would have just that one module for video-effect purposes. What you have succeeded in doing, though, is getting me to understand what that is all about. So I would now say that is not something I want to get into. I think I am more interested in some sort of generative or geometric synthesis.

You know, my intuition tells me this would be really smart for Audulus to get into. I also think that it is probably pretty untapped. I have no idea what could be possible with vectors… :smiley:

Thanks @stevo3985

2 Likes

You might want to check out VertexShaderArt, which is all about creating geometric animations that can also incorporate sound textures as parameters using OpenGL. Lots of games, music visualizers, video effects, animation tools, and other apps use OpenGL or OS-specific equivalents (e.g. Metal on iOS), as they're written to be processed by the device's GPU (graphics processing unit) rather than its CPU.
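To make that concrete, the per-vertex math behind one of those geometric animations is usually quite small. Here is a CPU-side Swift sketch of the idea: a ring of points whose radius is pushed around by an audio level. On VertexShaderArt the equivalent would run as GLSL on the GPU, and the names here are mine, not the site's:

```swift
import Foundation

// CPU-side sketch of what a sound-reactive vertex shader does per vertex:
// derive a position from the vertex ID, then displace it by an audio level.
// On VertexShaderArt this math runs in GLSL on the GPU; names here are mine.
struct Point { let x, y: Double }

func vertexPosition(vertexId: Int, vertexCount: Int, soundLevel: Double) -> Point {
    // Spread the vertices evenly around a circle.
    let angle = 2.0 * Double.pi * Double(vertexId) / Double(vertexCount)
    // Base radius, pushed outward by the current audio level (0.0...1.0).
    let radius = 0.5 + 0.3 * soundLevel
    return Point(x: radius * cos(angle), y: radius * sin(angle))
}

// Usage: 8 vertices reacting to a fairly loud moment.
let count = 8
for id in 0..<count {
    let p = vertexPosition(vertexId: id, vertexCount: count, soundLevel: 0.8)
    print(String(format: "v%d: (%.2f, %.2f)", id, p.x, p.y))
}
```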

2 Likes

Originally Audulus was based on OpenGL, but Apple deprecated support for OpenGL a while back, so @taylor moved the Audulus UI to the Metal framework in anticipation of losing OpenGL support.

3 Likes

Hopefully more developers will be able to transition to Metal internally without negatively impacting their users, as @Taylor and the developer of K Machine have.

The K Machine developer had some ideas about how to do this and recognizes the significance of supporting OpenGL code, since many of the largest and most interesting shader databases are written in it. So even though Apple will eventually drop OpenGL support, he sees the value of continuing to maintain an OpenGL code path for his users.

1 Like

There are already libraries available to run OpenGL code using Metal. The two APIs aren’t that different. Here’s one:

Personally, I think Audulus is the template for the future of software. The consumer becomes the developer. Interfacing is the new challenge, until greater standardization. But the trick is not to wait, but to see the opportunity in the curve.

Audulus is the new DAW, which is actually the brain for custom interfacing. It's all about controllers, control signals, CV. The obsession with the ES-8 is a deep psychological craving for "breakout." Our brains can sense massive potential for efficiency and we crave it like carbs.

1 Like

I actually had user-loadable shaders in Audulus at one point for video synthesis… and then Apple deprecated OpenGL.

Hoping to resurrect it with Metal at some point.

5 Likes

I think that the big move in graphics with A4 would be to have people say, "oh wow, I can totally see how the deep frequency modulation is affecting things now that we have good visuals," instead of, "look, the geometric object is undulating to the hi-hats."

Max Cooper is a great example.

We will know it is on point when the art becomes self-explanatory and there isn't as much need for a curator to nudge the eye.

Do you have any tips on how to sync TouchViZ with MIDI from other MIDI apps? Also, how do you use K Machine and TouchViZ in tandem? It sounds sick!