Sorry I took so long to respond to your question.
K Machine can be a good tool for creating video loops for use in TouchViZ. Knowing the bpm of your video loops lets you adjust the playback speed of the clips, to a certain degree, to match the tempo of your audio, although clips created at some multiple of the audio’s tempo are easier to work with. K Machine can definitely be used to create these sorts of loops.
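The arithmetic behind tempo-matching is simple enough to sketch. Here's a rough Python illustration (the function names and numbers are hypothetical, not tied to any particular app's API):

```python
def loop_duration(beats: float, bpm: float) -> float:
    """Length in seconds of a loop spanning `beats` beats at `bpm`."""
    return beats * 60.0 / bpm

def playback_rate(clip_seconds: float, beats: float, bpm: float) -> float:
    """Speed multiplier that makes a clip fit `beats` beats at `bpm`.

    A rate of 1.5 means the clip must play 1.5x faster to land on the bar.
    """
    return clip_seconds / loop_duration(beats, bpm)

# A 4-second clip retargeted to a 2-bar (8-beat) loop at 120 bpm needs
# no speed change: 8 beats * 60 / 120 bpm = 4.0 seconds.
```

This is why clips rendered at a multiple of your track's tempo are easier to work with: the rate multiplier stays at or near 1.0 and the stretch is barely visible.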
One technique for generating video loops for use in TouchViZ is to take a video clip, crop it to half the length of the loop you want, copy it, and reverse the copy in an app like LumaFusion. The original and reversed halves can then be saved together as a perfect video loop, which can be manipulated in ways analogous to perfect-tempo audio loops. You could, for example, create evolving visual patterns by selecting video loops whose lengths are not evenly divisible by each other (e.g. 3 and 5), and then use this same technique for audio elements, like when the kick and snare in a track are triggered. Even if you’re not using the same sort of technique in your audio, it’s still a good way to create some visual variation while maintaining some common visual elements over time.
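The evolving-pattern trick works because loops of coprime lengths only realign after their least common multiple. A small sketch of the arithmetic (hypothetical helper name):

```python
from math import lcm

def realign_after(loop_a_bars: int, loop_b_bars: int) -> int:
    """Number of bars before two loops of these lengths line up again."""
    return lcm(loop_a_bars, loop_b_bars)

# A 3-bar loop against a 5-bar loop realigns every 15 bars, so the
# combined visual pattern keeps shifting for 15 bars before repeating.
```

The longer the realignment period, the longer the combination feels "evolving" rather than obviously looped.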
Since you’re working with looped video clips, you don’t really need to worry about syncing up with the audio so much, as the MIDI control of the effects in TouchViZ will create the audio/visual synesthesia effect.
You can use a couple of approaches to synchronize MIDI from other apps with TouchViZ. Since TouchViZ doesn’t have its own virtual MIDI port, you may sometimes need another MIDI app like MidiFire to act as a bridge: you send MIDI to MidiFire, and MidiFire sends it back out through its virtual output port, which TouchViZ can listen to.
An app like FAC Envolver can use the volume level of an audio signal over time to generate MIDI CC messages to control TouchViZ. You can adjust this output to get a range of control that works well with the clips you’re using. Alternatively, you could route some of the MIDI controls used to control your sound chain to also control TouchViZ, once again scaling the output range to work most effectively with the TouchViZ controls.
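The scaling step is just a linear map from a normalized level into a CC sub-range. A minimal sketch, assuming a 0.0–1.0 envelope level and hypothetical bounds you'd tune by ear and eye:

```python
def level_to_cc(level: float, lo: int = 20, hi: int = 100) -> int:
    """Map a normalized audio level (0.0-1.0) into a MIDI CC value.

    `lo` and `hi` are assumed bounds (MIDI CC values run 0-127); you'd
    narrow or widen them so the visual effect responds well to the
    clips you're using rather than slamming between extremes.
    """
    level = max(0.0, min(1.0, level))  # clamp out-of-range input
    return round(lo + level * (hi - lo))
```

For example, keeping the floor above 0 means quiet passages still leave a visible trace of the effect instead of switching it off entirely.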
You could use an app like AUM to host multiple synth and MIDI-generating apps so that you have multiple MIDI outputs generating controls that correlate with multiple sound chains or elements of your music.
If you’re using TouchViZ as part of creating a music video, you could use an app like LumaFusion to combine multiple layers of TouchViZ output into a more nuanced video with a wide variety of visual effects that reflect your musical content. With this approach you can use loops or regions and tracks of a song along with corresponding layers of TouchViZ performances. You’d experiment with the placement of the audio in the video’s timeline to achieve the best results. These sorts of layered options and this flexibility wouldn’t really be possible in a live performance.
You can use LumaFusion to stretch or shrink clips, to a certain degree, to match up with the bpm of your music.
Using blend modes with masking techniques, where you combine more geometric shader-output clips produced with something like K Machine with more representational clips, can be a nice way to superimpose the visual musical changes onto whatever sort of imagery you like, either at the level of TouchViZ or during video editing in an app like LumaFusion. You could, for example, take a highly audio-reactive K Machine shader, apply it to your vocal track, and add the result as a blend layer in LumaFusion over more region/rhythm/pattern-based content recorded in TouchViZ.
Since TouchViZ only deals with the visual side of things, you can adjust the placement of the audio in the video until it lines up well with the musical changes going on. In addition, you might generate visuals from one piece of audio and then swap in other audio that differs in timbre, for example, but shares other characteristics like its rhythmic patterns.
It’s certainly possible to set up MIDI triggers which you control manually but which fire specific or semi-random patterns on the beat or at the next measure.
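Quantizing a manual trigger to the beat or the next measure comes down to rounding the current time up to the next grid boundary. A sketch of that scheduling arithmetic (purely illustrative; real apps do this against their own clock):

```python
import math

def next_boundary(now: float, bpm: float, beats_per_unit: float = 1.0) -> float:
    """Time in seconds of the next quantization boundary at or after `now`.

    beats_per_unit=1 quantizes to the beat; beats_per_unit=4 quantizes
    to the next 4/4 measure. `now` is seconds since the clock started.
    """
    unit = beats_per_unit * 60.0 / bpm   # length of one grid unit
    return math.ceil(now / unit) * unit
```

So a trigger pressed mid-beat is held and fired at the next boundary, which is what keeps semi-random visual patterns feeling locked to the music.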
If you have multiple iOS devices, you could use Link to tempo-sync MIDI and audio sources on one device while watching TouchViZ being affected on the other device.
Once Audulus 4 is released with AUv3 and MIDI I/O, I think it’d be an excellent tool for these sorts of projects due to the tight control you’d have over associating sound-control sequencing with visual MIDI-control sequencing.
There’s certainly room for improvement in terms of VJ apps having better sync with audio transport controls the way K Machine does.
The Takete app offers a lot of real-time audio/video control, but unfortunately it only supports Core MIDI control without any virtual MIDI inputs.
If you like TouchViZ, you might want to check out the Glitch Clip app, as it has some similar workflows but slightly different approaches and results, which would also work with the techniques described above.
Play with clips that have different sorts of characteristics, like the size of details, changes in color over time, repetition, changes in transparency over time, black-and-white designs, grayscale designs, textures, and shapes, to develop your sense of style and create clips that support it in conjunction with the blend modes and special-effects options in TouchViZ. As with audio resampling, you can do the visual equivalent by importing clips you exported as recordings from TouchViZ back into the app to layer up visual effects for the more pattern-based elements of your music.