Does video mode influence export speed in terms of CPU usage?

I somehow managed to set my video mode to 1080p, 25 FPS and wondered why the fade out video filter would only allow certain duration values. That was quickly cleared up with feedback, but in the process I stumbled upon something curious:

  • With video mode 1080p, 25 FPS and export settings of 1440p, 60 FPS with quality-based VBR at 68%, only ~33% of my total CPU was being used
  • With video mode 1440p, 60 FPS and the same export settings, ~98%+ of total CPU is being used

Is the far lower CPU usage on a lower profile intended?

The most likely cause seems to be the video mode settings differing from the export settings:

  • Video mode 1440p, 60fps, clips with 1080p, 60fps, proxy editing + exporting as 1080p, 60fps → ~25-30% of total CPU being used
  • Video mode 1080p, 60fps while leaving everything else the same → there are some drops down to ~50% total CPU, but the overwhelming majority of the time it’s ~95%+ of total CPU

Even if that’s the cause, it makes no sense why CPU usage should be limited just because of a mismatch between video mode settings and export settings.

As a side note, you probably do not want to do this. Shotcut will always render the video using the video mode, and then convert to match the export settings. In this case, Shotcut will render 25fps and then repeat frames to simulate 60fps. This results in lower temporal resolution and less smooth motion. The only situation where it is useful to change the FPS in the export settings is when you are setting it to a lower value than your video mode and the video mode is a multiple of it - like a 60fps video mode with 30fps export settings.
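To illustrate the frame repetition: a minimal sketch (not Shotcut’s actual code) of how a 25 fps render maps onto a 60 fps export. Each output frame just reuses the source frame covering the same point in time, so no new motion information is created:

```python
def source_frame(out_index: int, src_fps: int = 25, dst_fps: int = 60) -> int:
    """Map an output frame index to the source frame shown at that time."""
    return out_index * src_fps // dst_fps

# One second of 60 fps output reuses only 25 distinct source frames,
# so motion is no smoother than the 25 fps original.
distinct = {source_frame(i) for i in range(60)}
print(len(distinct))  # 25
```

With a clean multiple (60 fps mode exported at 30 fps) the mapping simply keeps every second frame, which is why that case loses nothing beyond the intended halving of the frame rate.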

Yes, this matches my expectation. When you change the export settings to mismatch the video mode, you are adding an additional single-threaded processing step to the export process.
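One way to see why a single serial step drags down overall CPU usage is Amdahl’s law. A toy model (not Shotcut internals): if a fraction of the total work must run on one thread, average utilization across all cores is capped accordingly.

```python
def utilization(serial_fraction: float, threads: int) -> float:
    """Average CPU utilization on `threads` cores for a job whose work is
    `serial_fraction` single-threaded and otherwise perfectly parallel."""
    wall_time = serial_fraction + (1.0 - serial_fraction) / threads  # Amdahl
    cpu_seconds = 1.0  # total work is fixed regardless of thread count
    return cpu_seconds / (threads * wall_time)

# Fully parallel export: all cores stay busy.
print(round(utilization(0.0, 8), 2))  # 1.0
# Half the work in a single-threaded conversion step: utilization collapses.
print(round(utilization(0.5, 8), 2))  # 0.22
```

The exact fraction depends on how expensive the frame-rate/resolution conversion is relative to rendering, but even a modest serial portion is enough to leave most cores idle most of the time.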

I don’t: I just stumbled upon said curious case when I somehow managed to change the video mode without noticing.

If that is how it works in the background, it still makes no sense to me why a simple mismatch between video mode and export settings would lead to only a fraction of total CPU being used: garbage in, garbage out, sure, but why should that affect total CPU usage?

The clue is in the explanation.

You don’t give your computer’s specs, but it presumably has multiple multi-threaded processors. Say it has 2 processors, each with 2 hardware threads, and Shotcut uses 90% of whatever it runs on. When it is running multi-threaded (4 threads) you will see 90% of both multi-threaded processors active. When it is running single-threaded (1 thread) you will see it using less than 90% of 1 processor, since it will be using only one of that processor’s threads - i.e. less than 45% of all the CPUs, or, as you observe, 25% to 30%.
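Putting that arithmetic into code (the 2-processor/2-thread machine is the hypothetical from the explanation, not the poster’s actual hardware):

```python
# Hypothetical machine: 2 processors x 2 hardware threads = 4 logical CPUs.
logical_cpus = 2 * 2
efficiency = 0.90  # assumed fraction Shotcut achieves of what it can use

# Multi-threaded: all 4 logical CPUs in use -> ~90% of the total.
multi_threaded = efficiency * logical_cpus / logical_cpus
# Single-threaded: only 1 of 4 logical CPUs in use.
single_threaded = efficiency * 1 / logical_cpus

print(f"{multi_threaded:.0%}")   # 90%
print(f"{single_threaded:.1%}")  # 22.5%
```

That 22.5% figure sits just below the observed 25-30%, which is consistent with the single-threaded step not being the only thing running during export.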

This topic was automatically closed after 90 days. New replies are no longer allowed.