Computing offset in clip from filter

I thought it was possible to compute the clip offset in seconds by dividing the offset frame by the frame rate:

function getFrameRate() {
    // Frame rate of the clip's media, as numerator / denominator
    // (e.g. 30000 / 1001 for NTSC 29.97 fps)
    return producer.getDouble("meta.media.frame_rate_num", getPosition()) /
           producer.getDouble("meta.media.frame_rate_den", getPosition())
}

function getClipOffset() {
    // First frame of the filter, in frames
    return filter.in
}

var clipOffsetInSeconds = getClipOffset() / getFrameRate()

However, I get slightly different results every now and then, and I suspect this has to do with 29.97 fps drop-frame. That is, every now and then a frame seems to be missing or inserted. Does anyone know how to either tell Shotcut to ignore drop frames on 29.97 fps video (just treat it as a sequence of frames, ignoring timecode), or how to reliably get the offset?
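A rough sketch of what I mean by treating it as a plain frame sequence, using the same properties as above: keep everything in whole frames and only convert to seconds at the end, with the exact rational rate rather than a rounded 29.97.

function frameToSeconds(frame) {
    // Use the exact rational rate (e.g. 30000/1001) instead of a
    // rounded 29.97 so the arithmetic stays exact. Drop-frame only
    // changes how timecode is labeled; no actual frames are skipped.
    var num = producer.getDouble("meta.media.frame_rate_num", getPosition())
    var den = producer.getDouble("meta.media.frame_rate_den", getPosition())
    return frame * den / num   // e.g. 300 * 1001 / 30000 = 10.01 s
}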

OK, now I suspect that Shotcut doesn’t store float filter parameters with more than three decimals of precision, and that this is what caused the issue.

Note to self: don’t use double values to look up entries in a map. :sob:
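To make the failure mode concrete (a toy example, not Shotcut’s actual storage): two doubles that look equal can differ in their low bits, so an exact-equality map lookup misses, while keying on the integer frame number always round-trips.

var keyframes = {}   // hypothetical map from position to value

// Doubles as keys: 0.1 + 0.2 becomes the key "0.30000000000000004",
// so looking it up with 0.3 finds nothing.
keyframes[0.1 + 0.2] = "value"
console.log(keyframes[0.3])        // undefined

// Integer frame indices as keys are exact.
var frame = Math.round(0.3 * 30000 / 1001)   // 9
keyframes[frame] = "value"
console.log(keyframes[frame])      // "value"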

I’m a little fuzzy on which offset you’re trying to get. A clip’s offset from the beginning of the timeline? Or a filter’s offset from the beginning of a clip when filter trimming is applied?

If you want the timeline offset, then querying producer could be misleading, because a clip’s frame rate may not be the same as the timeline’s frame rate (profile.num/den).
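For example, the two rates could be compared side by side like this (I’m going from memory on the profile property names here, so treat them as an assumption and check the QML profile type for the exact API):

// Sketch: the media rate vs. the project/timeline rate. A timeline
// offset in frames must be divided by the timeline rate, not the
// media rate, whenever the two differ.
function getMediaFps() {
    return producer.getDouble("meta.media.frame_rate_num", getPosition()) /
           producer.getDouble("meta.media.frame_rate_den", getPosition())
}

function getTimelineFps() {
    return profile.fps   // assumed property; the profile stores num/den
}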

I want the offset in seconds from the start of the source clip to the start of the filter.

Like, I place a clip (sample.avi) on the timeline starting at 0:00;0, then I split it at 0:10;0 (timeline at 10 seconds), giving me clip A and clip B. I want something that gives me zero seconds for clip A, and 10 seconds for clip B, so that I can figure out that two seconds into clip B is 12 seconds into sample.avi.
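To put concrete numbers on that example (a sketch assuming NTSC 30000/1001 and that the split rounds to a whole frame):

var num = 30000, den = 1001                 // exact NTSC frame rate

// Clip B starts at timeline 10 s, i.e. source frame round(10 * fps):
var clipBIn = Math.round(10 * num / den)    // 299.7 rounds to frame 300

// Converting that frame back to seconds doesn't land exactly on 10:
var offsetSeconds = clipBIn * den / num     // 300 * 1001 / 30000 = 10.01

// So "two seconds into clip B" is really about 12.01 s into sample.avi,
// the kind of small discrepancy described above.
var sourceSeconds = offsetSeconds + 2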
