I use two PCs of the same generation: one equipped with an AMD graphics card and the other with an Nvidia card (each with 4 GB of video memory, same processor, same RAM).
I more often use the one with the AMD graphics card. While exporting several videos (including one that needed a correction and a second export), I was surprised by a very different file size.
For the same export settings, on the same video:
Radeon Pro Vega 20 - AMD AMF / HEVC: 850 MB
GTX 1060i - Nvidia NVENC / HEVC: 2.2 GB
No difference in quality.
I'm wondering why there is such a big difference in file size!
(both drivers are very recent)
Is it a different integration of the codecs in Shotcut?
One challenge is that quality settings are not standardized between hardware manufacturers. Quality at 60% on NVENC does not use the same algorithm as 60% on AMF. Each vendor has its own encoder, and those quality numbers mean different things on each card. There are also generational improvements: quality at 60% on 3rd-generation NVENC is not the same as 60% on 7th-generation NVENC. A direct comparison by quality percentage is not valid.
A more fair way to judge quality between hardware cards is to get VMAF scores or some other visual quality metric of the encoded files versus the original files. Then, take the NVENC and AMF files of equal VMAF scores and compare what their file sizes and quality settings were. Granted, VMAF is not a perfect system, but this methodology will be way more fair than comparing quality percentages.
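To sketch the comparison workflow described above: ffmpeg's libvmaf filter (e.g. `ffmpeg -i encoded.mp4 -i original.mp4 -lavfi libvmaf=log_fmt=json:log_path=vmaf.json -f null -`) can write a JSON log containing a pooled VMAF score. The snippet below is a minimal illustration, not a Shotcut feature; the log strings and file sizes are hypothetical stand-ins shaped like libvmaf's JSON output, using the sizes mentioned in this thread.

```python
import json

def pooled_vmaf(log_text: str) -> float:
    """Extract the pooled (mean) VMAF score from a libvmaf JSON log."""
    data = json.loads(log_text)
    return data["pooled_metrics"]["vmaf"]["mean"]

# Hypothetical logs, shaped like ffmpeg libvmaf log_fmt=json output:
amf_log = '{"pooled_metrics": {"vmaf": {"mean": 94.7}}}'
nvenc_log = '{"pooled_metrics": {"vmaf": {"mean": 94.9}}}'

# Hypothetical file sizes in MB (the thread reports 850 MB vs 2.2 GB):
sizes_mb = {"amf": 850, "nvenc": 2200}

scores = {"amf": pooled_vmaf(amf_log), "nvenc": pooled_vmaf(nvenc_log)}

# A file-size comparison is only meaningful when the VMAF scores are close.
if abs(scores["amf"] - scores["nvenc"]) < 1.0:
    ratio = sizes_mb["nvenc"] / sizes_mb["amf"]
    print(f"Comparable quality; NVENC file is {ratio:.1f}x larger")
```

In other words: first equalize on measured quality, then compare sizes, rather than trusting the quality slider to mean the same thing on both cards.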
Thank you for that answer.
I naively thought the codec implementation was standard. If the quality percentage value is not, then there is indeed no point in comparing!
For productions that use several different machines, it becomes complex to standardize the renders!
Indeed it would.