Hello everyone, I'm new to the forum and I like Shotcut very much!
I'd like to ask about something I've already seen discussed in the forum. I know… I've read many posts, and I know I can't expect all the work to be done by my graphics card…
…but why is it that when encoding the same video, my previous software, HitFilm Express, used my graphics card at 80-90%, while Shotcut only uses around 5-10% max?
I'm using the original YouTube export preset… I also tried the H.264 High Profile preset and I have the same issue… I have enabled hardware encoder detection and parallel processing.
Is there something I could do to improve performance during the export process?
I'm using an Nvidia GTX 1650 Super with a 4th-generation Intel i7 and 16 GB of RAM.
Shotcut does not use the graphics card for export by default.
From your question, it is not clear whether you checked the “Use hardware encoder” checkbox in the export panel. Click “Configure” next to it to see whether your GPU’s encoder was detected.
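If you want to rule out a driver problem outside of Shotcut, you can also test whether NVENC encoding works at all on your system with a tiny FFmpeg test encode. This is only a sketch, and it assumes an `ffmpeg` binary is on your PATH (you could also point it at the ffmpeg that ships with Shotcut):

```python
# Quick NVENC sanity check outside Shotcut.
# Assumes "ffmpeg" is available on your PATH.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner",
    # Generate a 2-second synthetic test clip so no input file is needed.
    "-f", "lavfi", "-i", "testsrc2=duration=2:size=1280x720:rate=30",
    # Try to encode it with NVIDIA's hardware H.264 encoder.
    "-c:v", "h264_nvenc",
    # Discard the output; we only care whether encoding succeeds.
    "-f", "null", "-",
]

result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode == 0:
    print("h264_nvenc works - the GPU encoder is usable on this system.")
else:
    print("h264_nvenc failed - check your driver or the encoder name:")
    print(result.stderr[-1000:])  # last part of ffmpeg's log
```

If that test succeeds but Shotcut still does not use the GPU, the checkbox/detection in the export panel is the more likely culprit.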
Shotcut usually uses a mix of CPU and GPU: the CPU mostly for filters, and the GPU for encoding in the final step. You could try selectively disabling filters to see which ones load the CPU the most, or you could try the experimental Settings → GPU Effects mode and use only GPU-optimized filters.
So while your GPU could theoretically encode 5x faster if it were at 100%, the CPU might not provide enough “ready” frames for the GPU to be fully utilized.
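To get a feel for that bottleneck, here is a small toy simulation (my own sketch with made-up timings, not Shotcut’s actual pipeline): a slow “filter” thread feeds frames to a fast “encoder” thread through a queue, and the encoder ends up waiting most of the time, just like a GPU sitting at low utilization.

```python
# Toy producer/consumer model of an export pipeline (not Shotcut's real code).
# The "CPU filters" producer is slower than the "GPU encoder" consumer,
# so the encoder spends most of its time idle.
import queue
import threading
import time

FRAMES = 60
frames = queue.Queue(maxsize=8)

def cpu_filters():
    for i in range(FRAMES):
        time.sleep(0.05)          # pretend filtering a frame takes 50 ms
        frames.put(i)
    frames.put(None)              # sentinel: no more frames

def gpu_encoder():
    busy = 0.0
    start = time.time()
    while True:
        frame = frames.get()
        if frame is None:
            break
        t0 = time.time()
        time.sleep(0.01)          # pretend encoding a frame takes 10 ms
        busy += time.time() - t0
    total = time.time() - start
    print(f"encoder busy {busy:.2f}s of {total:.2f}s "
          f"(~{100 * busy / total:.0f}% utilization)")

producer = threading.Thread(target=cpu_filters)
consumer = threading.Thread(target=gpu_encoder)
producer.start(); consumer.start()
producer.join(); consumer.join()
```

With those invented numbers (50 ms of filtering vs. 10 ms of encoding per frame), the “encoder” reports roughly 20% utilization even though it is never the limiting factor, which is the same effect you are seeing in the GPU usage graph.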