CRF = 11, set as the default lowest-quality setting, is way too high for low-resolution input files.
When importing an HD file with VBR that natively had about a 10k bitrate, Shotcut asked me to convert it and to choose from three resolutions. I chose the lowest, and the resulting file had a 50k bitrate. As the user Austin suggested in my recent report, a CRF of 11 is too close to uncompressed for low-resolution videos (and HD isn't even the best example).
Side suggestion: make the conversion window show the CRF, as someone else suggested a while back.
For the sake of clarity for any other readers on CRF values, since "too high" could be interpreted either way…
The lower the resolution, the closer to zero the CRF needs to be. As resolution decreases, every detail becomes more important because fewer pixels are available to describe the scene. Any deviation from the source gets magnified when the video is upscaled for playback on a large screen, so accuracy becomes paramount, which calls for higher-quality CRF values (closer to zero).
At high resolutions like 8K, it doesn't matter if a few pixels are not perfectly accurate, because they are too small to do much visible damage. Also, going from 4K to 8K quadruples the number of pixels, but it doesn't quadruple the amount of color variance: the extra 8K pixels will mostly be the same color as their neighboring 4K pixels, and that neighboring similarity compresses very well. At low resolutions, neighboring pixels are more likely to be radically different colors.
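The quadrupling claim is easy to verify with a quick pixel count, using the common 3840×2160 (4K UHD) and 7680×4320 (8K UHD) frame sizes:

```python
# Pixel counts for the common UHD frame sizes (width x height).
pixels_4k = 3840 * 2160   # 8,294,400 pixels
pixels_8k = 7680 * 4320   # 33,177,600 pixels

# 8K doubles both dimensions, so it has exactly 4x the pixels of 4K.
print(pixels_8k // pixels_4k)  # prints 4
```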
So, the higher the resolution, the closer to 51 (the maximum for x264/x265) the CRF can go. The lower the resolution, the closer to zero the CRF should go.
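As a rough illustration of that rule of thumb, here is a minimal sketch of a resolution-to-CRF mapping. The specific CRF numbers and the `suggested_crf` helper are my own hypothetical choices for illustration, not anything Shotcut actually ships; the only point is that the suggested CRF rises with resolution:

```python
def suggested_crf(height: int) -> int:
    """Return a hypothetical CRF for a given frame height.

    Lower resolutions get a lower (higher-quality) CRF because
    each pixel carries more of the scene; higher resolutions can
    tolerate a higher CRF. Thresholds here are illustrative only.
    """
    if height <= 480:      # SD
        return 14
    elif height <= 1080:   # HD
        return 18
    elif height <= 2160:   # 4K
        return 22
    else:                  # 8K and above
        return 26

# CRF climbs monotonically as the resolution grows.
print(suggested_crf(480), suggested_crf(1080), suggested_crf(4320))
```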