I’m trying to create a complex music visualizer with a lot of effects that match the song, and the current audio visualization filters (such as the Dance, Spectrum, and Waveform ones) are great.
But there’s one major caveat to using these filters: they only work with audio contained within the video itself.
My idea / solution is to add the ability to route these audio-reactive filters to specific audio channels, which would remove that limitation and make the filters much more convenient to use…
Here’s a photoshopped example of what it could look like (with diagram explanation):
Not quite right. In addition to @MusicalBox’s clever solution, you can also drag the audio file alone onto a Video track. The Visualization filters will work on this track.
Okay, @MusicalBox and @SergeC, thank you. Both of your suggestions do solve this issue, but only part of it…
What about the Audio Dance Visualization filter? (P.S. The issue I’m about to explain ALSO applies to the Audio Light Visualization filter.)
What if I wanted to route a Dance filter to an audio-less image or video, but NOT apply the Dance filter to the Output (which would make it swell everything in the video), and instead have it swell only a specific video layer?
My routing suggestion would fix this limitation and ultimately make ALL Audio Visualization filters fully controllable without requiring audio attached to the video.
What do you all think? (And hopefully this can also be seen / considered by the Shotcut development team.)
I think your suggestion could be useful, especially for the Dance Visualization filter, since it is the only one that affects the actual size and position of a clip.
It would make things easier when, for example, you only want the Dance visualization to affect a text clip or a small image (like a logo or a smiley) overlaid above the main video/audio track(s).