I’ve noticed a new behavior regarding filters. It seems like multiple instances are created and then used in parallel across the timeline. For example, when a 360 stabilize filter is added to a clip, four filter instances are created and then instance 1 is applied to frames 1 and 5, instance 2 to frames 2 and 6, and so on.
I see the point here - you can run filters in parallel, which is an awesome and cheap way to turn single-threaded code into multithreaded code - but for my 360 stabilization filter it’s fatal: the analysis pass depends on getting the frames in sequence so it can compute the difference between frame N and frame N+1.
An option for me would be to have all filter instances delegate to a singleton instance (rough sketch below). I think it would work, because although Shotcut creates four instances, they are still used in a round-robin fashion with frames fed to them sequentially. Another option would be to dive into the MLT framework and make the stabilization analysis phase a CLI tool - applying the stabilization data afterwards can of course be done in parallel.
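To make the singleton idea concrete, here is a minimal sketch of what I have in mind. It is not the real filter code: `shared_analyzer_t`, `analyze_frame_in_order`, and the ordering check are hypothetical names I made up for illustration, and I'm assuming the standard MLT C filter hooks (`filter_process` pushing a `get_image` callback, `mlt_frame_get_position` for the frame position). The point is just that every instance funnels its analysis step through one mutex-protected object, so only the shared analyzer needs to see frames in order.

```c
// Rough sketch only: all filter instances delegate the analysis pass to one
// shared, mutex-protected analyzer, so the analysis sees frames in position
// order even though MLT runs several filter instances in parallel.
#include <framework/mlt.h>
#include <pthread.h>
#include <stdint.h>

// Hypothetical process-wide analyzer state (one per process, not per instance).
typedef struct {
    pthread_mutex_t lock;
    mlt_position    next_expected; // position the analysis expects next
    int             initialized;
} shared_analyzer_t;

static shared_analyzer_t g_analyzer = { PTHREAD_MUTEX_INITIALIZER, 0, 0 };

// Hypothetical sequential analysis step: compare frame N with frame N-1.
static void analyze_frame_in_order(mlt_position pos, uint8_t *image,
                                   int width, int height)
{
    pthread_mutex_lock(&g_analyzer.lock);
    if (!g_analyzer.initialized || pos == g_analyzer.next_expected) {
        // ... run motion estimation against the previously stored frame ...
        g_analyzer.next_expected = pos + 1;
        g_analyzer.initialized = 1;
    }
    // Out-of-order frames could be skipped or buffered here instead.
    pthread_mutex_unlock(&g_analyzer.lock);
    (void) image; (void) width; (void) height;
}

static int filter_get_image(mlt_frame frame, uint8_t **image,
                            mlt_image_format *format, int *width, int *height,
                            int writable)
{
    mlt_filter filter = (mlt_filter) mlt_frame_pop_service(frame);
    mlt_position pos = mlt_frame_get_position(frame);

    int error = mlt_frame_get_image(frame, image, format, width, height, writable);
    if (!error)
        analyze_frame_in_order(pos, *image, *width, *height);

    (void) filter; // the per-instance filter holds no analysis state here
    return error;
}

// Standard MLT process hook: queue our get_image callback for this frame.
static mlt_frame filter_process(mlt_filter filter, mlt_frame frame)
{
    mlt_frame_push_service(frame, filter);
    mlt_frame_push_get_image(frame, filter_get_image);
    return frame;
}
```

The mutex serializes only the analysis step, so correctness would no longer hinge on there being exactly four instances used round-robin; and if frames can in fact arrive slightly out of order, the shared analyzer could buffer them by position instead of assuming strict sequence.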
Is this behavior on purpose, and if so, what are the rules?