GPU for decoding and encoding

I am using Shotcut version 21.03.21 because I was having bugs running both of the May versions.

But the bugs are not my problem here; I just want to know how to use my GPU for decoding and encoding, so there is less load on my CPU and I can render faster in Blender.
My system is an Intel® Core™ i9-10900X X-series processor (19.25 MB cache, 3.70 GHz) with 256 GB of RAM (Shotcut was not running before with the 256 GB, but after I reinstalled Windows it started working). I have a single GeForce GTX 1660 SUPER, and the motherboard is an MSI Creator X299.


That looks like a good combination; the spec charts show the GeForce GTX 1660 SUPER uses the Turing architecture, which is important here.

Encoding should start right up as soon as you check the Use hardware encoder checkbox in the Export panel.

The operative word being “should”…
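
For context, Shotcut exports through FFmpeg, and the hardware encoder option selects an NVENC encoder such as h264_nvenc. If you want to sanity-check the GPU encode path outside of Shotcut, here is a minimal sketch of a roughly equivalent standalone encode, assuming ffmpeg is on your PATH and was built with NVENC support (the file names are placeholders):

```python
# A minimal sketch of what Shotcut's hardware encode maps to when run
# directly through FFmpeg. Assumes ffmpeg is on PATH and was built with
# NVENC support; "input.mp4" and "output.mp4" are placeholder file names.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mp4",     # placeholder source clip
        "-c:v", "h264_nvenc",  # H.264 encoding on the GPU via NVENC
        "-c:a", "copy",        # pass the audio stream through untouched
        "output.mp4",          # placeholder output file
    ],
    check=True,  # raise if ffmpeg exits with an error (e.g. no NVENC)
)
```

If a command like that fails with an NVENC error, the problem is the driver or the ffmpeg build rather than Shotcut itself.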

In my case, my GeForce GPU would not work with Shotcut until I upgraded the NVIDIA drivers for the GPU.
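
If you want to rule that out first, here is a small sketch that checks the driver and the available encoders from the command line, assuming nvidia-smi and ffmpeg are on your PATH:

```python
# Two quick checks before blaming Shotcut:
# 1) the NVIDIA driver is loaded and reports a version,
# 2) the local ffmpeg build actually lists the NVENC encoders.
import subprocess

# Prints the GPU name and driver version if the driver stack works.
subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv"],
    check=True,
)

# Filter ffmpeg's encoder list for NVENC entries (h264_nvenc, hevc_nvenc).
result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
)
print([line for line in result.stdout.splitlines() if "nvenc" in line])
```

If no nvenc entries show up, updating the driver (or the ffmpeg build) is the place to start.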


Thanks for the solution!
