Video 'Judder' or not?

This is a bit of a general question rather than one specific to Shotcut.

I recorded a TV show, and it has a particular fine, weird, choppy look to it, on both a computer and a TV. At first I thought it was 'judder' (the choppiness from 24 fps movie material being displayed on a 60 Hz TV screen), but it may instead be a low frame rate from the broadcast station. My recording itself has a frame rate of 29.97 fps.

I thought this would be correctable by re-exporting at a different frame rate, but it comes out virtually the same (I tried 30 fps and 60 fps settings, and both looked identical).

Now I am thinking there is no changing it, because that is how the original stream was output by the TV station. (For some reason I had the idea that any video could be decoded back to uncompressed frames and then re-encoded however you like; I have been reading up on FFmpeg.)
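
From what I've read, something like this ffprobe command should report the frame rate the broadcast stream actually carries (recording.ts is just a stand-in name for my capture file):

```
# Ask ffprobe for the video stream's declared and measured frame rates.
# "recording.ts" is a placeholder for the captured file.
ffprobe -v error -select_streams v:0 \
  -show_entries stream=r_frame_rate,avg_frame_rate \
  -of default=noprint_wrappers=1 recording.ts
```

If both values come back as 30000/1001 (29.97), then the motion I'm seeing is presumably just what the station transmitted.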

I'm completely confused about what is happening here, and any comments would be appreciated.

Sounds like the TV copy-protection technology is working as the television stations intended it to.

I have personally found, in the last two years, that pointing an old video camera on a tripod at the screen was the best way to record shows on TV - but I was just using what I had readily available in the junk room, ready to go.

It worked for me when filming a show about a nutty old man talking - which was all I cared about at the time. I watch minimal TV these days. I got no judder.

Upping the fps won't help and will possibly exacerbate the problem, since the extra frames have to be filled in with duplicates. You could instead try lowering the fps in your export settings (the 24 fps preset might work, but you could give 20 fps a go for testing purposes).
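
If you want a quick test outside of Shotcut, an ffmpeg one-liner does roughly the same thing as changing the export frame rate. This is just a sketch, assuming your capture is called recording.ts and you have the x264 encoder available:

```
# Re-encode at 24 fps. The fps filter only drops (or duplicates) frames,
# so it changes the timing of what is already there - it cannot invent
# smoother motion than the broadcast contained.
ffmpeg -i recording.ts -vf fps=24 -c:v libx264 -crf 18 -c:a copy test24.mp4
```

Swap fps=24 for fps=20 if you want to try the lower rate.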

Nope. Looked the same.
