Does encoding reduce resolution, and which codec should I use?

Hi,
Does encoding reduce resolution, and which codec should I use? I have an Nvidia Quadro K1100M and am running Windows 10 Pro. Also, for ultimate quality, should I export at 100% quality? PS: What is parallel processing?

“H.264 High Profile” is very general purpose and should be fine. “HEVC Main Profile” makes a smaller file at similar quality but takes longer to export. Neither codec reduces resolution, but both may slightly reduce visual detail to achieve higher compression (smaller files). The loss is generally not noticeable.
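For the curious: Shotcut exports through FFmpeg under the hood, so those two choices roughly correspond to the libx264 and libx265 encoders. Here is a minimal sketch of equivalent command lines, driven from Python; the filenames are hypothetical and the CRF values are simply the encoders’ defaults, not Shotcut’s exact export settings:

```python
import subprocess

# Hypothetical source file; Shotcut builds similar FFmpeg
# command lines internally when you export.
SRC = "input.mp4"

# H.264 High Profile: broadly compatible, fast to encode.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "libx264", "-profile:v", "high", "-crf", "23",
    "-c:a", "aac", "h264_high.mp4",
], check=True)

# HEVC (H.265) Main Profile: smaller file at similar quality,
# but noticeably slower to encode.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "libx265", "-profile:v", "main", "-crf", "28",
    "-c:a", "aac", "hevc_main.mp4",
], check=True)
```

Lower CRF means higher quality and a larger file; x265 at CRF 28 is commonly considered comparable to x264 at CRF 23.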

“Ultimate quality” means a lossless file that can easily run to many gigabytes with no visible improvement to the human eye. You probably don’t want or need this. A 100%-quality file also has playback issues with many common media players.
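To make that concrete, a 100% export corresponds to lossless encoding, which in libx264 terms is CRF 0. A quick sketch, again with a hypothetical filename:

```python
import subprocess

# CRF 0 tells libx264 to encode losslessly. Expect a very large
# file, and note that lossless H.264 uses the High 4:4:4
# Predictive profile, which many common players cannot decode
# smoothly.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libx264", "-crf", "0",
    "lossless.mp4",
], check=True)
```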

Leave parallel processing turned off for the most reliable results. It attempts to process multiple frames at once, but it has the potential to cause problems with certain filters.


Thanks for the feedback. These videos are going to be viewed on 4K displays, so what quality percentage should I use? Also, what is H.264 NVENC?


Use the default quality percentage unless there is a specific reason to change it. The human eye generally can’t detect any difference beyond 68%.
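Shotcut’s quality percentage maps onto the encoder’s 0–51 CRF scale. The sketch below shows a linear mapping that is consistent with the data points in this thread (100% = lossless CRF 0, 68% ≈ CRF 16); treat it as an inferred approximation, not documented Shotcut behavior:

```python
def percent_to_crf(percent: int) -> int:
    """Approximate Shotcut quality % -> x264 CRF (0-51 scale).

    100% -> CRF 0 (lossless), 68% -> CRF ~16. This linear mapping
    is an assumption fitted to those two data points.
    """
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round((100 - percent) * 51 / 100)

print(percent_to_crf(100))  # 0  (lossless)
print(percent_to_crf(68))   # 16 (visually transparent for most viewers)
print(percent_to_crf(55))   # 23 (libx264's default CRF)
```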

NVENC is Nvidia’s hardware encoder (NVidia ENCoder). Personally, I would not recommend hardware encoding on a K1100M. The result is likely to be a large file that doesn’t look as good as the output of the libx264 software encoder.
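In FFmpeg terms, choosing NVENC just swaps which encoder does the work; the source resolution is untouched either way. A minimal sketch of the two choices, with hypothetical filenames and an illustrative bitrate; it assumes an FFmpeg build with NVENC support and a compatible Nvidia GPU:

```python
import subprocess

# Software encode: libx264 does the work on the CPU.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "libx264", "-crf", "16",
    "software.mp4",
], check=True)

# Hardware encode: h264_nvenc offloads the work to the GPU's
# dedicated encoder block. The 20M bitrate here is illustrative.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",
    "-c:v", "h264_nvenc", "-b:v", "20M",
    "hardware.mp4",
], check=True)
```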

I tested the encoder and it takes load off of my CPU. By using the encoder, do I lose quality? Also, I am looking to build a PC. Will an Nvidia 1650 Super be good?

On that particular GPU, yes, definitely.

With hardware prior to 7th gen NVENC, there is also generally a tradeoff: the quality can be made higher, but at the expense of a file roughly three times larger than the software encoder would produce. If the hardware and software files are constrained to the same size, the hardware encoder’s quality will be much worse. The K1100M will not match the quality of libx264 even under the best conditions it can be given.
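One way to verify that tradeoff on your own hardware is to encode the same short clip both ways at a similar quality target and compare file sizes. A rough sketch; the filenames are hypothetical, `-cq` is NVENC’s approximate counterpart to CRF, and exact option names can vary across FFmpeg versions:

```python
import os
import subprocess

SRC = "test_clip.mp4"  # hypothetical short test clip

# Software encode at CRF 16.
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx264",
                "-crf", "16", "sw.mp4"], check=True)

# Hardware encode targeting a similar quality level (-b:v 0
# lets the quality setting drive the bitrate in VBR mode).
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "h264_nvenc",
                "-rc", "vbr", "-cq", "16", "-b:v", "0",
                "hw.mp4"], check=True)

# On pre-7th-gen NVENC, expect the hardware file to be several
# times larger for comparable visual quality.
for name in ("sw.mp4", "hw.mp4"):
    print(name, os.path.getsize(name) // 1024, "KiB")
```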

All of this changed with 7th gen. Recent versions of Quick Sync are very competent too.

Yes, but it has to be the Super edition, not the GDDR6 variant. Some of the 1650s use the 6th generation NVENC encoder, which is okay but not great. It isn’t until 7th gen that NVENC becomes amazing; that generally starts with the RTX 20xx cards and the GTX 1660. A few of the 1650 cards have 7th gen, but you have to be really careful with the model number and verify it has 7th gen NVENC. Look under the Encoding section of Nvidia’s video encode/decode support matrix to see which cards have which generation of NVENC.

Is libx264 always going to be the best quality?

7th generation NVENC can match libx264. The latest Quick Sync gets really close too. GPUs from previous generations or other vendors cannot match libx264 as of this writing.

H.264 is a lossy compression scheme by nature. You keep using the words “best” and “ultimate”, neither of which H.264 is designed for. H.264 throws away all the data it can because its priority is small file size, not maximum quality. However, H.264 at CRF 16 (Shotcut’s 68% quality) is “good enough” to match human vision on a 4K screen. If you want quality beyond that, consider ProRes, DNxHR, or a lossless format.
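For completeness, here is a sketch of what those higher-quality alternatives look like as FFmpeg exports; the profile choices (ProRes 422 HQ, DNxHR HQ) are common mastering defaults I’m assuming, not anything specific to this thread:

```python
import subprocess

SRC = "input.mp4"  # hypothetical source file

# Apple ProRes 422 HQ via the prores_ks encoder (profile 3 = HQ).
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "prores_ks", "-profile:v", "3",
    "master_prores.mov",
], check=True)

# Avid DNxHR HQ via the dnxhd encoder.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "dnxhd", "-profile:v", "dnxhr_hq",
    "master_dnxhr.mov",
], check=True)
```

Both are intermediate codecs: much larger files than H.264, but they hold up far better to further editing and color work.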


So for me, I want amazing quality because it will be viewed on 4K TVs. H.264 at 75% should do the trick. Thanks!
