I shoot 10-bit video, 4:2:0 or 4:2:2 depending on the camera used, and would like to complete a project in Shotcut. Colour correcting with the GPU Effects setting works well enough, but many effects are not available in GPU mode.
As I understand it, adding CPU effects after GPU effects reverts everything to 8-bit and therefore re-introduces “8-bit blues”, meaning I need to complete CC in GPU and render it in 10-bit before re-opening Shotcut in CPU and re-importing to complete, which can then be 8-bit. First question: is it correct that adding CPU effects will make my CC effectively 8-bit?
2. Is it possible to frame serve from GPU to CPU and if so, how?
3. Is it possible to render corrected clips from the timeline separately and if so, how?
To do 10-bit input and output, avoid non-GPU effects (or contribute code changes). It is OK to overlay text and other graphics as clips, as these are typically 8-bit in origin and the video compositor runs on the GPU.
meaning I need to complete CC in GPU and render it in 10-bit before re-opening Shotcut in CPU
That is a wrong assumption.
is it correct that adding CPU effects will make my CC effectively 8-bit?
What exactly is “CC”? I know you mean “colour correcting”, but the answer depends on your filters. Adding CPU filters with GPU Effects turned on does NOT make the GPU effects 8-bit. The GPU effects actually work in 16-bit floating point (the 8-bit formats are integer). Simply use the GPU filters to do the color correction, and they will preserve the 10-bit input much better than the CPU filters.
Is it possible to frame serve from GPU to CPU and if so, how?
No. As I understand the terminology, “frame serving” is a way to get different tools or components to work together, similar to another term, “piping”. I guess you want to have one instance of Shotcut transfer video to another instance of Shotcut, but Shotcut’s engine already does this internally. There is no need for you to do it crudely by hand.
Is it possible to render corrected clips from the timeline separately and if so, how?
Yes. When you Copy a timeline clip, it also goes to the Source player, and in Export > From you can choose Source. But you do not even need to use the Timeline: you can simply open a clip, trim, color correct, and export from Source.
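For what it is worth, the trim-and-export part can also be done with the melt command line tool that ships with Shotcut. This is only a rough sketch; the file names, in/out frame numbers, and encoder settings below are placeholders, and the colour correction itself would still be set up in Shotcut:

    melt clip.mov in=250 out=1250 -consumer avformat:corrected.mp4 vcodec=libx264 crf=18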
Thanks for the prompt reply, Dan. I wish I could contribute code changes, but that’s way out of my skill set!
To get this straight in my mind: if I colour correct in GPU I will get the 10-bit advantages locked in, but when I add CPU filters lower in the stack I am then editing in 8-bit, ready for 8-bit output, which it will end up being anyway? Is there a list of filters to avoid?
One further question regarding LUT3D. When I add it at the point it should be, i.e. after CC, it instantly gives me back banded sky etc. I can happily CC without it, so does that mean the GPU setting has some form of colour management?
A pipeline with GPU Effects, most CPU filters, and 8-bit export will go like this:
10-bit YUV Rec. 709 color → 16-bit float linear color → usually 8- or 10-bit RGB(A) sRGB color, could also be YUV 4:2:2 Rec. 709
Rec. 709 in the above is just an example; it could be another common color space such as Rec. 601 or Rec. 2020.
But there are new exceptions described here:
Is there a list of filters to avoid?
Nearly all except the list in the linked reply, and you need to add mlt_image_format=yuv420p10 or mlt_image_format=yuv444p10 to Export > Other because Export and most presets (except the ten_bit category) default to 8-bit. mlt_image_format determines how the images are requested from the pipeline, while pix_fmt determines how the encoder will export. A filter will try to give what was requested, but if it cannot, it will fall back to something else it chooses. You can ignore the parts about colorspace and color_trc because that gets into HDR. (What I describe in the linked reply is a form of crude HDR edit support where images are primarily passed from the decoder to the encoder.)
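As a concrete example (only a sketch, assuming an HEVC encoder with 10-bit support is selected on the Codec tab and everything else is left at its defaults), a 10-bit export could add these two lines to Export > Other:

    mlt_image_format=yuv444p10
    pix_fmt=yuv420p10le

The first line controls what the filters are asked to produce, the second what the encoder writes.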
In the next version of Shotcut there are new keywords added to the filter descriptions, and you will be able to search for #10bit.
regarding LUT3D. When I add it at the point it should be, i.e. after CC, it instantly gives me back banded
LUT3D has the #rgba and #10bit keywords because it can do >8-bit RGB pixel formats. Keep in mind that the Shotcut preview always asks the pipeline for 8-bit because that is all it knows how to handle, and it is much more efficient (less data transfer). However, with mlt_image_format=yuv444p10 I can see in the export job log that lut3d is using fmt:gbrp10le. Good!
GPU setting has some form of colour management?
Yes, it converts to and from linear color. It handles SDR color spaces and gammas but not HDR transfer functions.
I just made a test with 10-bit HLG footage that has a sky with clouds. I used GPU Effects for the Color Grading and Saturation filters, followed by LUT: 3D with an HLG2SDR LUT. After I reduced the blown-out highlights I quickly saw banding in the sky in the preview. Then I exported using the HEVC Main Profile preset and full color range, and I see the banding in the export as well.
Thanks for the comprehensive reply, which I have saved to my reference section for later careful study.
From what you have said, I get the impression that once I have done colour correction in 16-bit float linear (at which point it doesn’t need a LUT to convert from log to BT709 because it is no longer in log format), if I then add non-GPU filters I am working in 8-bit and can use the majority of the 8-bit filters that are not connected with CC and achieve a banding-free 8-bit output. However, if I want 10-bit output I need to stick to the list of filters you linked to. Do I understand correctly?
16-bit float linear (at which point it doesn’t need a LUT to convert from log
The GPU effects cannot properly convert LOG or HDR to linear because they do not know how to. They usually treat LOG as Rec. 709, and HDR is converted using Rec. 2020 color coefficients but not its EOTF/gamma. I suppose for HDR they convert to something close to linear, which is still better than not converting.
if I then add non-GPU filters I am working in 8-bit and can use the majority of the 8-bit filters that are not connected with CC and achieve a banding-free 8-bit output.
No
However, if I want 10-bit output I need to stick to the list of filters you linked to.
Not necessarily. Some people use 8-bit RGB(A) filters with 10-bit output to minimize banding. 8-bit RGB (full range, 0-255 per channel) has more information than limited range (16-235 luma, 16-240 chroma), sub-sampled chroma 8-bit YUV.
What I am suggesting is that to maximize the output from the GPU effects, limit your filter usage in the following order of priority:
1. gpu
2. #10bit
3. #rgba
When exporting as 8-bit, you can still specify a higher quality mlt_image_format than the default to take advantage of #10bit filters.
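For example (only a sketch, assuming an otherwise default 8-bit export where pix_fmt stays at yuv420p), adding just this line to Export > Other lets a #10bit filter such as LUT3D process above 8-bit even though the encoder still writes 8-bit:

    mlt_image_format=yuv444p10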