I searched the forum but didn’t find any hint on this, so here is my suggestion/request:
I found this RIFE lib, which calculates the optical flow needed for interpolating frames when rendering slow-motion video below the original framerate.
E.g. you have a 50 FPS video and a timeline of 25 FPS.
So 50% playback speed is easy.
But if you want to go slower, frames are missing and need to be interpolated.
This is when the video begins to stutter like a stop-motion film.
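To illustrate the frame arithmetic above, here is a small sketch (the function name and values are my own illustration, not anything from Shotcut or RIFE):

```python
# Which source frame position each timeline frame needs when a clip is
# played back at a given speed. Fractional positions fall between two
# source frames and must be interpolated (or duplicated, which causes
# the stop-motion stutter).
def source_positions(timeline_fps, source_fps, speed, n_frames):
    step = source_fps * speed / timeline_fps
    return [i * step for i in range(n_frames)]

# 50 FPS clip on a 25 FPS timeline at 50% speed: every position is a
# whole source frame, so no interpolation is needed.
print(source_positions(25, 50, 0.5, 4))   # [0.0, 1.0, 2.0, 3.0]

# At 25% speed every second position falls between two source frames.
print(source_positions(25, 50, 0.25, 4))  # [0.0, 0.5, 1.0, 1.5]
```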
This is where the AI algorithm makes a huge difference: it not only interpolates between consecutive frames but also takes the motion direction into account.
The results look great, as you can see if you search for “RIFE/Flowframes”.
So as the lib is open source (though I didn’t check the license type in detail), I want to suggest implementing it into Shotcut, taking playback speed manipulation to the next level.
I think OpenCV is already part of the stabilizer filter, and it looks like RIFE is based on that.
Maybe it’s not a huge effort to implement it as a filter, or in the pipeline before handing frames over to FFmpeg.
Currently I use it standalone to convert some clips to higher framerates.
And maybe I’m not the only one who puts clips of various framerates (24, 25, 30, 50 and 60 FPS) into the timeline and gets stuttering results.
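For what it’s worth, that mixed-framerate stutter is easy to see with a tiny sketch (simple floor-style frame picking is an assumption about interpolation-free playback, not Shotcut’s actual code):

```python
# Which source frame a 25 FPS timeline shows for each timeline frame
# when no interpolation is done (floor-style nearest-frame picking).
def picked_frames(timeline_fps, source_fps, n_frames):
    return [i * source_fps // timeline_fps for i in range(n_frames)]

# 30 FPS clip: every fifth step jumps two source frames -> judder.
print(picked_frames(25, 30, 10))  # [0, 1, 2, 3, 4, 6, 7, 8, 9, 10]

# 24 FPS clip: one source frame gets shown twice -> stutter.
print(picked_frames(25, 24, 10))  # [0, 0, 1, 2, 3, 4, 5, 6, 7, 8]
```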
See the Motion Compensation option in the Convert dialog since version 21.01. Shotcut does not currently integrate OpenCV for anything including stabilization.
Thanks for both pieces of information.
I will have a deeper look at how motion compensation works for my clips.
And now I’m curious what kind of optical flow the stabilization filter uses… but that’s off-topic.
It uses minterpolate from here with a user drop-down for mi_mode, plus the following options
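For anyone curious, minterpolate is FFmpeg’s motion-interpolation filter, and mi_mode can be dup, blend or mci. A minimal sketch of what such a filter string looks like (the fps and mi_mode values below are only examples, not Shotcut’s actual defaults):

```python
# Build an example minterpolate filter string; the parameter values
# are illustrative, not what Shotcut necessarily passes.
def minterpolate_filter(fps, mi_mode):
    return f"minterpolate=fps={fps}:mi_mode={mi_mode}"

# Would be used roughly as: ffmpeg -i in.mp4 -vf <filter> out.mp4
print(minterpolate_filter(60, "mci"))  # minterpolate=fps=60:mi_mode=mci
```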