Shotcut doesn't use the GPU

My system

  • Operating system: Arch Linux
  • Graphics card: GeForce GTX 1050 Ti
  • Shotcut version: Arch-18.11.04
  • CPU: AMD Phenom II X3

My issue
When I do a picture-in-picture (two video streams, one above the other, with the stream on top resized), playback is extremely juddery (not running smoothly).
What puzzles me: when playback reaches the point where the two streams overlap, the utilization of all my CPU cores goes up to almost 100% … but the utilization of the GPU (GeForce GTX 1050 Ti) and its video engine stays unaffected at 0%.

It’s the same when I export a file … the GPU utilization jumps to 4% at most.
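(Not part of the original posts, but for anyone who wants to reproduce the measurement: below is a minimal sketch of how the utilization numbers above can be logged while Shotcut is playing back or exporting. It assumes an NVIDIA driver with `nvidia-smi` on the PATH; the helper name `log_gpu_utilization` and the sampling defaults are made up for the example.)

```python
import subprocess
import time

def log_gpu_utilization(interval_s=1.0, samples=30):
    """Poll nvidia-smi and print GPU / memory utilization once per interval.

    Hypothetical helper for watching what the card does during playback or
    export; assumes nvidia-smi is installed (NVIDIA proprietary driver).
    """
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,utilization.memory",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        gpu_pct, mem_pct = (v.strip() for v in out.split(","))
        print(f"GPU {gpu_pct}% | memory {mem_pct}%")
        time.sleep(interval_s)

if __name__ == "__main__":
    # Start this, then play back or export the picture-in-picture project.
    log_gpu_utilization()
```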

Any ideas why Shotcut isn’t using the power of my graphics card?

You are using a beta version.

why Shotcut isn’t using the power of my graphics card?

Because you have GPU Effects disabled, but GPU Effects is experimental, unsupported, and now hidden. Why? Because it is difficult to get working in a stable manner, because we suck at software development, and because you are not submitting patches and code to make it work better.

On the one hand you wrote “you have GPU Effects disabled”,
on the other hand you wrote “GPU Effects [… is] hidden now”.
Do you see the contradiction?

You also wrote: “and because you are not submitting […] and [not] code to make it work better” --> :-0
