Exported to 60 FPS but the video is still 24 FPS

My second post provided numerous examples of why a no-impact frame rate change is desirable.

Regarding the Marvel trailer, this is an example of optical flow interpolation. Adobe Premiere and DaVinci Resolve can do this, among others. Shotcut cannot.

The idea behind interpolation is to take two frames, find edges the two frames have in common, and then mathematically bend those edges to generate in-between frames. The success of this method depends on the edges in the two frames being correctly detected and correlated.
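As a rough illustration of the idea, here is a minimal sketch that synthesizes a halfway frame between two captured frames. It uses OpenCV's Farneback dense optical flow purely as a stand-in; Premiere and Resolve use their own proprietary algorithms, so treat this as the general technique, not their implementation:

```python
import cv2
import numpy as np

def interpolate_midframe(frame_a, frame_b):
    """Synthesize the frame halfway between frame_a and frame_b."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense flow: a (dx, dy) motion vector for every pixel, frame_a -> frame_b.
    flow = cv2.calcOpticalFlowFarneback(
        gray_a, gray_b, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Backward-warp frame_a halfway along the flow (t = 0.5). Where the
    # flow estimate is wrong -- no trackable edges, motion blur -- the
    # wrong pixels get dragged into place, producing the mushy, ghosted
    # artefacts described below.
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                 np.arange(h, dtype=np.float32))
    map_x = grid_x - 0.5 * flow[..., 0]
    map_y = grid_y - 0.5 * flow[..., 1]
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```

In practice a 24fps-to-60fps conversion needs frames at fractional positions between pairs, not just the midpoint, so real tools warp to arbitrary t rather than the fixed 0.5 used here.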

So let’s break down the Marvel trailer. The first thing we notice is how extremely selective the criteria were for deciding which clips made it into the trailer. The majority of clips have very little movement, which puts them in the video game recording exception I made at the end of my first post. We see a spaceship slowly floating in space; we see people slowly marching across the floor; we see a soap opera stare; they want us to believe a quinjet is flying fast across the sky, but the camera lens is so wide-angle that the overall movement is actually next to nothing. Choosing clips with slow or zero movement means there is no motion blur to retain the 24fps look, and no blurry edges to confuse the edge-detection algorithm in the interpolator. The algorithm therefore has a very high chance of correctly correlating edges between frames.

Now let’s look at where it fails. At 1:05, Falcon turns into a dust cloud. Edges don’t exist in a ball of smoke, so the interpolation algorithm has a difficult time figuring out the movement of the cloud from frame to frame; there are no hard edges from which to track motion. The result is that the interior of the cloud looks mushy compared to other parts of the frame. It still does reasonably well overall because the cloud is moving slowly, which means general mesh distortion in the absence of a clean edge still looks decent. But a fast-moving cloud would look awful.

Now let’s watch optical flow struggle with motion blur. Step through 1:50 one frame at a time and watch Black Widow’s hands as she swaps magazines. Because her hands are moving fast and were captured at 24fps, there is a lot of motion blur. Interpolation can’t find an edge in the blur, so artefacts like these happen:

[Screenshot: OpFlow1]
[Screenshot: OpFlow2]

These artefacts happen pretty much any time something moves fast on screen or has motion blur around it. Now you see why the person who made the trailer was so selective about picking clips without fast movement. It looks awful otherwise.

Another example, from the same YouTube channel, is the “Rambo: Last Blood” trailer. At 0:11, a windmill is spinning. The edges of the blades are blurred by movement, and the interpolation algorithm gets very confused about which edges match between frames, because interpolation algorithms generally don’t understand rotation; they only track straight-line movement. As a result, blades of the windmill randomly appear and disappear depending on whether the algorithm matched an edge or not. In this picture, the blades are gone in the upper left. It looks awful to watch in real time.

[Screenshot: OpFlow3]

And at 1:14, this soldier has two faces due to unmatched edges in motion blur:

[Screenshot: OpFlow4]

So let’s summarize. Shotcut changes frame rate by duplicating or dropping frames as necessary to match the target frame rate. Interpolation is a totally separate technique where the computer generates artificial frames by matching edges between existing frames. Shotcut does not provide interpolation tools. For editors that do, it’s worth noting that interpolation is extremely CPU intensive; it would not surprise me if it took an entire day to render one of those two-minute trailers. Even with all that processing time, the end result will still have artefacts anywhere edges can’t be matched or are lost in motion blur. This is why it’s best to capture at 60fps if you want the 60fps look: there are no artefacts, and it eliminates interpolation processing time. For people who constantly analyze what they’re seeing, watching interpolated video is not a great viewing experience, and definitely not an emotional one when artefacts break the illusion of reality.
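To make the distinction concrete, here is a rough sketch of what duplication-based conversion amounts to, assuming integer frame rates (illustrative only, not Shotcut’s actual code):

```python
# Frame rate conversion by duplicating/dropping frames -- no new
# frames are synthesized, so motion keeps the source's 24fps look.
def convert_by_duplication(src_frames, src_fps, dst_fps):
    """For every output frame, copy the source frame whose timestamp
    it falls on. Going 24 -> 60 this repeats source frames in a 3-2
    cadence; going 60 -> 24 it drops frames instead."""
    n_out = len(src_frames) * dst_fps // src_fps  # e.g. 60/24 = 2.5x frames
    return [src_frames[min(i * src_fps // dst_fps, len(src_frames) - 1)]
            for i in range(n_out)]
```

Every output frame here is a verbatim copy of a source frame; interpolation replaces those duplicates with synthesized in-between frames like the one in the earlier sketch, which is where both the processing cost and the artefacts come from.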

Speaking of which, a question if I may ask without sounding critical because that’s not the goal… Comparing the look of the Marvel movie at 24fps vs 60fps, is there really a strong preference for the 60fps look in cinema? To me, it looks like the humans were taken over by robots and the emotional impact feels totally different… mechanical, almost… maybe tense, too. I realize this is a totally subjective thing. I’m just wondering if other people feel the same way or prefer the 60fps look. I could totally see it working for 3D CGI movies or video game playback.

Think of it this way… If you’re in Blender and you want to make a 4-second intro at 60fps, that’s a total of 240 frames needed for the intro. If four seconds of 24fps footage is dropped onto the timeline, that video only provides 4 s × 24 fps = 96 frames, which is less than half of what’s needed to last four seconds at 60fps. Blender absorbs the video frame for frame without stretching, duplicating, or interpolating it to the length you intend. This is why Blender has the prerequisite that all sources must first be converted to the FPS of the timeline before editing can begin, and it’s a big reason I don’t use Blender for editing. This means you’ll need to convert that 24fps video to 60fps using an external process. If you use Shotcut, you’re limited to frame duplication, which will retain the look of the 24fps source. If you want the 60fps look, you can try optical flow with Premiere or Resolve, but your mileage will vary… a lot.
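Putting numbers on that hypothetical 4-second intro:

```python
# The frame math from the Blender example above.
timeline_fps, source_fps, seconds = 60, 24, 4
frames_needed   = timeline_fps * seconds    # 240 frames to fill the timeline
frames_provided = source_fps * seconds      # 96 frames in the 24fps clip
print(frames_needed - frames_provided)      # 144 frames Blender won't invent
```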
