-
What is your operating system?
Win 11
-
What is your Shotcut version (see Help > About Shotcut)?
25.05.11
-
Can you repeat the problem? If so, what are the steps?
I’m using a GoPro source clip, shot in UHD Protune Wide format and transformed into BT.709 space using an input LUT. After adding the Contrast filter in non-GPU mode, the image exhibits severe banding, even at the default 50%. Increasing the contrast further makes it truly horrible. After rendering with the hevc_qsv codec, especially when downscaled to HD with all default values, things get even worse.
Switching Shotcut to GPU mode and using the GPU-enabled filter eliminates the issue.
Before contrast applied, no banding:
After, severe banding and blotches:
HD render:
I can’t speak to that filter specifically, but most CPU filters in Shotcut are 8-bit, and doing tonal adjustments at such a limited bit depth can lead to banding like you’re seeing.
GPU filters work in FP16, a floating-point format with greater bit depth, and are better suited to maintaining image quality for this sort of manipulation.
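A quick way to see the effect (an illustrative sketch only, not Shotcut’s actual filter code): quantizing a smooth gradient to 8 bits before a contrast stretch collapses many distinct tonal levels, while a float pipeline keeps them apart.

```python
import numpy as np

# A smooth 0..1 gradient, like a sky or a wall in the footage.
gradient = np.linspace(0.0, 1.0, 4096)

def contrast(x, amount=2.0):
    """Simple contrast expansion around mid-grey."""
    return np.clip((x - 0.5) * amount + 0.5, 0.0, 1.0)

# Float pipeline (analogous to FP16 GPU filters): contrast on float data.
float_result = contrast(gradient)

# 8-bit pipeline: quantize to 256 levels first, as an 8-bit CPU chain would.
eight_bit = np.round(gradient * 255) / 255
eight_bit_result = contrast(eight_bit)

# Count distinct output levels in the mid-tones;
# fewer distinct levels means coarser steps and visible banding.
mid = slice(1024, 3072)
print(len(np.unique(float_result[mid])), len(np.unique(eight_bit_result[mid])))
```

The 8-bit path comes out with roughly an order of magnitude fewer distinct mid-tone levels than the float path, which is the banding in the screenshots above.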
Well, I have probably found the reason. As @notsmoothsteve correctly noted, this definitely has to do with the 8-bit nature of the filter. However, that alone isn’t enough: I have worked with several 8-bit NLEs and never seen banding this bad after applying a single contrast filter. In my case it is the 10-bit source clip that causes the extra trouble. Unfortunately, upon importing 10-bit media, Shotcut compresses it somehow, so it doesn’t even use the full 0-255 range, let alone 0-1023. It clips at around 210, or 80 IRE. The video scopes confirm this; it is the maximum I can achieve without clipping:
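A limited-vs-full range mismatch would compress code values in roughly this way (a sketch of one possible mechanism, not a confirmed diagnosis of what Shotcut is doing):

```python
def full_to_limited_10bit(v):
    # 10-bit "video" (limited) range puts luma in 64..940;
    # full range uses the whole 0..1023 scale.
    return 64 + v * (940 - 64) / 1023

# A full-range peak squeezed into limited range once:
once = full_to_limited_10bit(1023)   # 940.0
# If the result is mistakenly treated as full range and squeezed again:
twice = full_to_limited_10bit(once)
print(once, round(twice))
```

A double conversion like this puts the ceiling in the high 800s, in the same neighborhood as the clipped ranges reported in this thread, though whether that is the actual mechanism here is only a guess.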
Then, when I try to export such a clip, passing extra parameters to FFmpeg, it produces correct HDR output, but again with the range limited to around 800. Here are the parameters I used (with the hevc_qsv hardware codec):
vtag=hvc1
preset=medium
vprofile=main10
mlt_image_format=rgb
pix_fmt=p010le
movflags=+faststart
color_primaries=bt2020
color_trc=arib-std-b67
colorspace=bt2020nc
load_plugin=hevc_hw
Would it be possible to make it use full range with 10-bit source media? Also, any chance to add a GPU-enabled LUT filter? Ideally, of course, implement a complete 10-bit HDR workflow that also utilizes HDR hardware, if present. Yes, I know none of this is easy.
Thanks for your effort anyway.
Use Export > Video > Color range = Full.
In fact, there is a bug in the latest release when using a full range source with limited range output. If you are using GPU effects, it is fixed in the latest daily build. If you are not using GPU effects, HLG HDR output is extremely limited and is primarily cuts-only.
colorspace=bt2020nc
This is wrong. colorspace is an MLT property, not an FFmpeg AVOption, and it should be set to 2020. Better yet, change the Video Mode to use Rec. 2020, and then you do not need to set it explicitly in export.
mlt_image_format=rgb
That is going to break the 10-bit processing through GPU effects as well as HLG HDR.
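Applying both corrections, the parameter list from the earlier post would become something like this (a sketch based on the replies above; drop the colorspace line entirely if the Video Mode is already set to Rec. 2020):

```
vtag=hvc1
preset=medium
vprofile=main10
pix_fmt=p010le
movflags=+faststart
color_primaries=bt2020
color_trc=arib-std-b67
colorspace=2020
load_plugin=hevc_hw
```

Note that mlt_image_format=rgb is removed rather than changed, since forcing RGB is what breaks the 10-bit path.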
utilizing also HDR hardware
This works today using a Blackmagic Design SDI/HDMI device for external monitoring. I have not been able to get desktop HDR preview to work. Possibly, the way we use Qt is blocking it.
Basically, in your latest reply, you are jumping ahead to HLG HDR, which is not yet announced, and I have not yet written about how I am doing it because I have been, and still am, on vacation. But this also goes off-topic for this thread.
I added HLG HDR editing info to Editing HDR Source Clips.
Thanks @shotcut.
I tried the suggested settings: yes, it does render a properly exposed HDR clip, producing full 0-1023 range output. Still, there is something that doesn’t really let me use it.
In fact, there is a bug in latest release to use full range source and limited range output.
Well, I don’t have access to daily builds and have always used full range (and GPU mode) anyway, so this isn’t the problem. What really happens is that adding a single non-GPU filter (a LUT, in my case) causes severe range compression of the 10-bit source. Therefore I can’t go with your suggestion:
It may be beneficial to use a SDR2HLG LUT on the BT.709 video sources.
So yes, the exposure range is fine now, but the colors are off, and I cannot imagine someone doing color space mapping manually. Having no GPU-enabled LUT filter is therefore essentially a show-stopper for an HDR workflow, given that most (if not all) HDR sources require one LUT or another. At the moment there is a choice: either limited range or wrong colors. Neither is good, of course.
One more thing:
This is wrong. colorspace is a MLT property, not FFmpeg AVOption, and it should be set to 2020.
Would it be possible to publish a full list of the rendering options applicable on the Other tab?
Thanks again for your effort.