NVENC VBR not working?

I selected VBR as you can see in the image, but when I saved the custom profile, I noticed there was a line “cbr=1” in the parameters, which I deleted. Even so, the file was encoded in CBR. What did I do wrong? Or is it a bug?

Here is the MediaInfo data for the encoded file:

Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4.2
Format settings : CABAC / 2 Ref Frames
Format settings, CABAC : Yes
Format settings, ReFrames : 2 frames
Format settings, GOP : M=3, N=30
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Duration : 1 h 57 min
**Bit rate mode : Constant**
Bit rate : 12.0 Mb/s
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 60.000 FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.096
Stream size : 9.83 GiB (97%)
Color range : Limited
Transfer characteristics : BT.709
Matrix coefficients : BT.709

What are you trying to do? Are you using Nvidia GPU acceleration? The default bitrate seems to be 2 Mb/s, so I suspect you may be trying to push the codec outside its normal parameters. Have you tried using FFmpeg to encode? Perhaps it would not be so limited. Guessing, really. I only use H.264 High profile for all my exports. Good luck!
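If you do try FFmpeg directly, a minimal NVENC command might be something like this (untested on my end, and it assumes your FFmpeg build was compiled with NVENC support; the file names are placeholders):

```
ffmpeg -i input.mp4 -c:v h264_nvenc -b:v 12M -c:a aac -b:a 384k output.mp4
```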

-=Ken=-


I’m trying to use my GPU to encode (I guess it would be faster than using my CPU - I’ve got an i5-4460 and a GTX 1050 Ti).

Then I configured a custom preset trying to achieve 1080p 60 FPS for YouTube videos. YouTube recommends 12 Mbps VBR for those settings.

Any tips?

You have some pretty hefty hardware. But GPU acceleration is experimental. I tried it when I first started using Shotcut but quickly became convinced that not using it was safer, more stable, and left more features enabled. You are welcome to try it, but it is generally not recommended. Some people are able to use multiple CPU cores for Export, and that would be key for faster processing. Multi-core processing is limited by the codecs Shotcut uses. You might also view the documentation for FFmpeg and its support for GPU processing. Not all of FFmpeg’s capabilities are enabled in Shotcut.

Have fun,

-=Ken=-


I did some tests and I came to the following conclusions:

  • The “GPU Acceleration” is only for rendering while editing the videos.
  • Video encoding by GPU depends on the codec you choose.
  • When using the libx264 codec, encoding is done only by the CPU (software encoding, without the QuickSync feature), regardless of whether the “GPU Acceleration” option is active or not.
  • CPU usage is very high with libx264 and the settings I chose, easily staying at 90% to 100% on all cores. This makes it impossible to use the computer for other tasks while the video is being encoded.
  • When I use the h264_nvenc codec, the GPU is used (about 15% utilization), while CPU usage drops to 50% to 60%, which allows me to use the computer for other tasks.

Also, I tried saving a new profile changing only the codec to libx264; these are the parameters that were saved:

movflags=+faststart
preset=slow
profile=high
strict_gop=1
f=mp4
acodec=aac
ar=48000
ab=384k
vcodec=libx264
vb=12M
g=30
bf=2
sc_threshold=0
width=1920
height=1080
aspect=1,77778
progressive=1
top_field_first=2
deinterlace_method=yadif
rescale=hyper
r=60
threads=0
pass=1
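As far as I can tell, those parameters map more or less onto a plain FFmpeg command like the one below. This is my own reconstruction rather than anything Shotcut prints, and it only shows the first pass (a real two-pass encode would discard this output and run again with -pass 2):

```
ffmpeg -i input.mp4 \
  -c:v libx264 -preset slow -profile:v high -b:v 12M \
  -g 30 -bf 2 -sc_threshold 0 -r 60 -pass 1 \
  -c:a aac -ar 48000 -b:a 384k \
  -movflags +faststart output.mp4
```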

When I save a new profile with the very same settings but with the h264_nvenc codec, this is what I get:

movflags=+faststart
preset=slow
profile=high
strict_gop=1
f=mp4
acodec=aac
ar=48000
ab=384k
vcodec=h264_nvenc
cbr=1
vb=12M
v2pass=1
sc_threshold=0
g=30
bf=2
width=1920
height=1080
aspect=1,77778
progressive=1
top_field_first=2
deinterlace_method=yadif
rescale=hyper
r=60
threads=3

Note that pesky cbr=1 included among the parameters. I don’t want to encode the video using CBR (aside from VBR being what YouTube recommends, I get smaller files with higher quality by using VBR). However, I’m not comfortable with the percentage-based VBR quality parameter, but if there is no other way, could someone suggest a reasonable value to reach the minimum YouTube recommends (12 Mbps VBR)?

Unfortunately, I don’t have further suggestions for you. I do find your experience with CPU load sharing interesting, however. I use an AMD CPU in my new rig and an AMD APU (CPU/GPU on a single chip) in one that is three years old. I have not had any noticeable slowdown when exporting from Shotcut or Kdenlive, even though they consume 100 percent of the CPU. I can run other apps while an export runs in the background and not notice slow response. I think this has to do with load-sharing algorithms. Most people run Intel, and many/most run MS Windows, so I wonder about that. Is it Linux or AMD that is doing the better load balancing? Just curious.

-=Ken=-


In the current Shotcut releases, export jobs run with different priorities on Linux and Windows. On Linux, the export job runs at low priority, while Windows jobs run at normal priority. In the next Shotcut release, the Windows build will also lower the priority of export jobs, and the system will feel more responsive while exports are running.
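On Linux the effect is roughly what you would get by launching the export process at a lower niceness yourself, something like the line below (the process name is just illustrative):

```
nice -n 19 qmelt project.mlt
```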


Excellent. Thanks for the info Brian.

-=Ken=-

Being very new to Shotcut, how do you save an Export profile and see the settings? I did find that right-clicking on Jobs/Log brings up some info, but not what you are showing.

Settings > Video Mode > Custom > Add


Thanks!

Over at the Nvidia Developer site you can download “Using FFmpeg with NVIDIA GPU Hardware Acceleration”. You have to join and sign in to get to it; it’s in the video section. It states that the low-latency encode mode uses constant bit-rate mode; if you want to encode with variable bit rate, you have to use the latency-tolerant high-quality encode mode. The PDF has CLI command-line examples. If I understand correctly, Shotcut uses FFmpeg for encoding, so it is likely an error in what is sent to FFmpeg.
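From memory, the examples in that PDF are along these lines. This is a paraphrase rather than a verbatim copy, and the flags assume a reasonably recent FFmpeg build with NVENC enabled (the -maxrate value is just an example):

```
ffmpeg -i input.mp4 -c:v h264_nvenc -rc:v vbr -b:v 12M -maxrate 16M output.mp4
```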


Thanks for the info. I’ve checked the file, but I don’t know how to pass those parameters to a Shotcut custom profile (is that even possible?).

Not sure that you can inside Shotcut, but they say they update fairly often, so I wanted to get the info out there. If you just want to transcode a file with your GPU, you may want to try XMedia Recode.


No, you set it to ABR. NVENC only offers CBR or VBR modes, but Shotcut has more rate control options than that. If you want VBR, then choose VBR! In the integration of NVENC with Shotcut’s rate control options, both ABR and CBR set cbr=1, but only CBR also sets the bufsize option.
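In terms of the saved parameters, the difference looks roughly like this (the bufsize value is omitted here):

```
# ABR
cbr=1
vb=12M

# CBR
cbr=1
vb=12M
bufsize=...
```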

> The “GPU Acceleration” is only for rendering while editing the videos.

No, it is only used for image processing: the things between decoding and encoding. But when it is on, it applies both to preview in the app and to export.


I’m sorry for the late reply.

ABR? Hmm… I didn’t pay attention to this (it must be because I’m using Shotcut in Portuguese, hehe). However, the problem here is that I don’t want to rely on quality (%) based VBR, you know? I’d like a way to tailor the final setup to fit YouTube’s recommendations while still maintaining good quality.

Could you tell me if the forced VBR option will give me what I want?

As for GPU acceleration: I tested both possibilities. Even when leaving it unchecked, the video card is used when I export using NVENC.

Actually, based on your feedback, I changed the ABR mode of NVENC in the latest release to not set cbr=1. I think that achieves what you wanted. However, quality % VBR is perfect for YouTube, as they transcode everything to something similar to constrained VBR for adaptive bitrate streaming anyway. YouTube actually does not have very strict requirements; they would get too many complaints, and would not have gotten as huge as they are, if they did. I think they just publish recommendations to give people some sort of target and to help them avoid poor or unexpected results.
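For reference, in raw FFmpeg terms, quality-based VBR with NVENC looks roughly like this (the cq value of 23 is only an example, not a recommendation):

```
ffmpeg -i input.mp4 -c:v h264_nvenc -rc:v vbr -cq 23 -b:v 0 output.mp4
```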

Settings > GPU Processing has nothing to do with NVENC.
