"better codec" more CPU📊

I really wanted to get your opinion on the "standard." You don't even have to look at it, but it's curious how H.264/HEVC consumes more resources than MPG; in fact it's 44% versus 28% of processor usage.
Even more laughable is that, when it comes to exporting, with the same average bitrate, the final video size and quality are identical!


If all three codecs are asked to churn out data at the same rate (12Mbps) for the same amount of time (the length of the video), then it follows that all the resulting files will be the same size. This is by design.
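
To make that arithmetic concrete, here is a minimal sketch in Python; the 10-minute clip length and the audio bitrate are assumptions for illustration, not values from the test above:

```python
# Rough file-size estimate for a constant-bitrate export.
video_bitrate_mbps = 12      # same target given to MPEG-2, H.264, and HEVC
audio_bitrate_mbps = 0.256   # hypothetical audio track
duration_s = 10 * 60         # assumed 10-minute clip

total_mbps = video_bitrate_mbps + audio_bitrate_mbps
size_mb = total_mbps * duration_s / 8   # megabits -> megabytes

print(f"Expected size: ~{size_mb:.0f} MB, regardless of which codec is used")
# Expected size: ~919 MB, regardless of which codec is used
```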

As for being the same quality, 12Mbps is plenty of bitrate for all three codecs to look equally good to the human eye at 720p. Technically, the H.264 and HEVC encodings may have higher quality than the eye can detect, which may not become apparent over MPEG-2 until a heavy Color Grading filter is applied to stretch the tones and colors. Then, we might notice the slower-to-encode files holding up better under heavy stretching.

Essentially, this test is telling you that it’s possible to lower the bitrate on the H.264 and HEVC encodings without noticing a drop in quality. The current settings are wasting disk space on undetectable quality improvements. This is why we usually encode with CRF quality targets rather than try to guess the best bitrate for every unique video.
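
For anyone new to the term, CRF is the constant-quality mode in encoders like x264 and x265: you pick a quality level and the encoder spends whatever bitrate each scene needs. A minimal sketch of the idea, driving ffmpeg from Python (the filename and CRF value are assumptions; lower CRF means higher quality and a larger file):

```python
import subprocess

SOURCE = "input_720p.mp4"   # assumed input clip

# Constant-quality (CRF) encode: no bitrate guessing required.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264",
    "-crf", "20",           # assumed quality target; lower = better/larger
    "-preset", "medium",
    "-c:a", "copy",         # pass the audio through untouched
    "crf20_output.mp4",
], check=True)
```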

To summarize, I don’t see any surprises here.


OK Austin, thanks, but that "autopilot" was something I never used to change. What you're mentioning is more related to editing; my objection is about resource consumption.
Why does the final product have to be .mp4 and not .mpg? It’s more a matter of patents or following the fashion trend!

Here's a metaphor for you: I want to carry my weekly groceries from the big supermarket on the city's outskirts. I have tried my small car, and it's fast and uses very little fuel, so why do big trucks exist that use way more fuel and are more difficult to drive?

Your test is crafted in a very specific way, so it makes little difference whether you use the more advanced codec or the simpler, older one.
Do you remember DVD-quality movies? They were 4 GB in size with terrible resolution (720x480) and low quality. Today, with the best codecs, you can fit a 4K movie on a DVD at an insane increase in quality.

You can of course use the mpeg2 codec preset with no worries, but you’re going to have either bigger files or lower quality videos.

El Bitrate es la clave de la calidad de tus vídeos ("Bitrate is the key to the quality of your videos")

I did not translate the entire article but the bitrate is NOT the only thing that matters. It really depends on… you guessed it, the codec used.
Try a test for yourself at 1 Mbps with HEVC (x265), AVC (x264), and MPEG-2, and you will see the advances in efficiency for yourself.
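
If anyone wants to try that outside of Shotcut, here is a minimal sketch that drives ffmpeg from Python; the source filename is an assumption, and libx264, libx265, and mpeg2video are ffmpeg's standard software encoders:

```python
import subprocess

SOURCE = "test_clip.mp4"   # assumed source clip

# Same 1 Mbps average bitrate for each codec, so only efficiency differs.
ENCODERS = {
    "mpeg2": ("mpeg2video", "mpg"),
    "avc":   ("libx264",    "mp4"),
    "hevc":  ("libx265",    "mp4"),
}

for name, (encoder, ext) in ENCODERS.items():
    out = f"out_{name}_1mbps.{ext}"
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", encoder,
        "-b:v", "1M",   # average bitrate target
        "-an",          # drop audio so only video quality is compared
        out,
    ], check=True)
    print("wrote", out)
```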

Daniel, I disagree with that, because the facts show me otherwise: how did they manage to remaster musicals from the '80s in 4K and make them look so good if the quality was "so bad"?
What we do know is that 4K isn't necessarily better than 640x360 in full screen!

I don’t know what this means.

The description I gave has nothing to do with editing, unless "editing" refers to editing the export settings.

H.264 has higher consumption than MPEG-2 because it produces higher quality at the same bitrate. Set the bitrate to 2Mbps or lower and the difference will be clear.

.mpg is often a .ts container, which can hold H.264 just fine if exported that way. All three codecs are in patent pools held by the same MPEG LA legal entity, so the “follow the trend” statement makes no sense.

Those musicals were shot on film. They rescanned the film at 4K. They did not upscale a DVD video to achieve 4K. Worst case scenario, they did some AI upscaling of a digital master.

I am speechless.



Austin, we keep going around in circles. The video you record in 8K with your phone must be better than a low-resolution one made by a DW documentary 25 years ago. Are you sure it will look better in full screen just because of the resolution and codec? > RAW files have much lower resolution:
https://forum.dji.com/thread-134643-1-1.html

This discussion is going nowhere until you actually do this. You are not setting the bar high enough. Both Usain Bolt and I can run 100 m if the target is set at 1 hour, but set it at 10 seconds and I'm nowhere close.


In terms of those two variables, yes.

If the 25 year old video looks better than an 8K phone, it is because other variables are at work, such as lens selection, sensor size, lighting setup, image noise, dynamic range, 3-chip CCD vs CMOS, coloring in post, or a camera operator that better understands stabilization and proper shutter speeds.

If an equivalent lens is adapted and speed boosted onto a phone, and a scene is given the same lighting control, then the phone is likely to always win against 25-year old gear. Here is an example rig:

This Remarkable Film Was Shot on iPhone Using Anamorphic Cine Lenses | PetaPixel

There are feature films like "Unsane" (shot on an iPhone) and "28 Days Later" (shot on consumer DV cameras) that played in normal movie theaters, and the image looks equal or better than anything produced 25 years ago. That's about as "full screen" as it gets.

Note that I am not suggesting phones compete favorably against modern equipment.

To your point, resolution and codec are not the only parameters of quality. But if all other variables are kept equal, then by nature, resolution and codec do become the key determinants of quality.

RAW files have equal resolution. One could argue that they have lower definition due to the color filter array, but that’s usually described as chroma noise.

The article you provided said the author was accidentally looking at preview resolution rather than full resolution.


OK Daniel, I understand. My point is that for streaming, videos uploaded to any platform or website are converted with “the appropriate criteria” to be optimal.
Why do you have to work with something that’s more demanding?
For this reason I try to force YouTube not to show me HD videos, but instead I use low resolution (480p) in full screen, because the improvement is not noticeable.

Austin, Yeah, I think similarly. That’s why I don’t like photography. Those super-powered devices are a waste, because when they end up in print media like a newspaper, they don’t look good. Otherwise, the photographer edits the photos and presents us with wax figures: no blackheads, no orange peel skin, no cellulite, no stretch marks, nothing.

IN SUMMARY, when exporting the video in .mpg, is there a noticeable loss in quality? Usually, what I hear is “I can’t watch the video on my phone.”

Maybe if the sports camera recorded in .avi format, it would heat up less, right? Not all of us want to record in 4K. If the camera’s engineering is good and the video quality is good, you could resize it to make it bigger, and no one would know you didn’t record in 4K.

Maybe you need eyeglasses. I got a prescription last year, and it dramatically improved the quality of my HD and 4K TVs when viewed from around 2 meters or more, since I am near-sighted. Just because you do not see the improvement does not mean others cannot.

Dan, I have some nearsightedness, but thank God it hasn't gotten worse, because I made the prudent decision not to wear glasses. Why? Because the eye atrophies and no longer focuses; it gets used to "seeing well" through the magnifying glass. I used to sit in the first or second row!

I don't think I see any improvement because I watch the news and series on the computer. However, although I rarely use the TV, one day I used DLNA to play something on the TV, and it didn't look horrible.
There are still many people in the world who have a 4K TV, but the cable service and the channels don’t have a good signal. I guess you get what you pay for!

I had the same discussion some years ago about 1080p vs 4K, so I guess I sort of understand your point, just one level higher. Even some of my friends see no issue with Messenger/WhatsApp-compressed videos, while others ask me to upload the original and send them the link.

I guess it's just personal preference after all whether the small video details matter or not, so in the end just use whatever codec gets you the results you want (but seriously, you should at least use the extremely popular x264 with a veryfast preset).

This is a good question, but it is incomplete. None of the codecs discussed will show a meaningful loss in quality IF they are given high enough bitrate to do their best work. The question is how much bitrate are you willing to give them? Are you willing to give 3x the bitrate or more to use MPEG-2? Or are you willing to make a one-time sacrifice during editing and exporting to use HEVC or AV1 to make a file that might be much smaller? If you make a lot of videos like some forum members do, the disk space needed to store these videos represents extra money spent on larger hard drives. With MPEG-2, they would be spending 3x the money on hard drives for quality that isn’t any better. This also means longer upload times to YouTube when publishing videos, with no gain in quality.
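
As a rough illustration of that disk-space argument, here is a small sketch; every number in it is an assumption (a channel producing one hour of 1080p footage per week, HEVC at 8 Mbps, MPEG-2 needing roughly 3x that for similar quality), not a measurement:

```python
# Back-of-the-envelope yearly storage for one hour of footage per week.
hours_per_year = 52
hevc_mbps = 8      # assumed HEVC bitrate for good-looking 1080p
mpeg2_mbps = 24    # assumed ~3x bitrate for comparable MPEG-2 quality

def terabytes(bitrate_mbps: float, hours: float) -> float:
    """Convert an average bitrate and a duration into terabytes on disk."""
    megabits = bitrate_mbps * hours * 3600
    return megabits / 8 / 1_000_000   # megabits -> megabytes -> terabytes

print(f"HEVC:   {terabytes(hevc_mbps, hours_per_year):.2f} TB/year")
print(f"MPEG-2: {terabytes(mpeg2_mbps, hours_per_year):.2f} TB/year")
# Roughly 0.19 TB/year vs 0.56 TB/year -- three times the drive space.
```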

Bear in mind, most advances in video technology are to benefit large production companies that are churning through tons of high-end footage. Small differences add up to big savings for them at their scale. Meanwhile, individual users making only a few videos can do whatever they want without consequence.

This is very situation dependent. The work I do is projected on a 16-foot wall. 1080p does not hold up well when stretched that large, regardless of how good the camera was. Meanwhile, for young people with sharp eyesight, they can tell a difference between 720p and 4K on screens as small as seven inches. If you are publishing videos on YouTube for the general population to see, it’s best to assume that the majority of them have healthy or corrected sharp vision, even if you can’t see the difference yourself. There is most definitely a difference when the production crew knows what they’re doing.

Granted, back to the topic of people detecting upscaled video… the discussion is evolving when it comes to AI upscaling. But that’s a bigger and highly context-sensitive topic. AI is good at filling in what it knows. If you create something it’s never seen before, it will do a bad job upscaling. So, that’s an evolving art.

I am near-sighted too, so I sympathize with your situation and care about your health. However, I am not sure this statement is correct. Lack of focus is due to the thickness of the “eye lens” being incorrect for the distance to the back of the eye. Constantly straining to squeeze the lens into the proper thickness leads to more problems, not less. Young people with sharp vision do not get eye muscle atrophy due to seeing well. They get plenty of muscle activity from changing focus from near to far thousands of times a day. Squinting through bad vision just leads to fatigue and increased internal pressure. Since you seem to have a strong stance on the topic, I don’t expect you to believe me, and I won’t say any more about it. But I might suggest doing a test drive with glasses for one day when watching a 4K YouTube video to see if the difference in quality becomes more apparent.

Many news stations broadcast in 720p, even on YouTube. Maybe that’s why quality always looks bad.

There is also the possibility of the screen material. If the screen is a matte 15” business laptop screen, then yeah, those things are horrible and do not properly convey the video quality.

There is a DW documentary that they removed, like many others, because it was the "politically correct" thing to do.




Let's start with vision: people with myopia do not see poorly; they just find it difficult to see objects clearly at long distances, which is not serious. You don't have to see a traffic sign half a kilometer away.
Do you know why people become dependent on glasses, why they find it harder to see than before when they take them off, and, even worse, why they never improve?
Nowadays that can be corrected with surgery, yet not even the doctors themselves get it done, and it takes less than 20 minutes.

Austin, yes, precisely because of that, people now pay for storage. Haven't you seen that users complain precisely because "their good videos" look so ugly on YouTube and other sites? Well, yes, because those sites lower the bitrate for streaming.

:red_circle: Most young people now watch content on their cell phones, so it is a lie that they would appreciate higher quality because of the resolution, much less in an image on a web page resized by a CMS.
It is paradoxical: people now watch less TV, yet this is when more people are buying smart TVs, which are not smart at all; besides, typing anything on a remote control is maddening. It is basically consumption, like when you visit a bar and there are lots of "televisions" playing like "silent cinema" with no volume.

:yellow_circle: I believe in the advantages of the new codecs, as long as the video is strictly for the web. I imagine that the .wmv format has a reason for being.

Unless there is some kind of meaning lost in translation, this is one of the weirdest takes I've heard recently. I have myopia and I can't read the bus/subway screens from as far away as my friends can; no matter how you put it, this is "seeing poorly" and not "normal".
It also depends on the specific diopters: someone with -3 myopia or worse would literally not see that stop sign from 10 meters away if it blended in with orange tree leaves.

People pay for storage because they want high-quality videos. I personally look back at my phone video recordings from 10-15 years ago and get sad at not being able to see my grandma more clearly, because I chose to record in 720x480 as I didn't have enough storage space to use a higher resolution.

But all this talk made me really want that side by side comparison, so here it is:

MPEG-2 vs x265 at 1920x1080, 5.5 Mbps average (I wanted to do 4 Mbps, but MPEG-2 simply refused to go lower, and even at 5.5 there are clear artefacts):

I upped the bitrate to 8 Mbps, and HEVC is basically good enough for everyone at this point; MPEG-2 would probably need at least 50% more space to match.


You might say, "But wait, MPEG-2 was made for DVD resolution," so I made another test at 480p. This time I halved the bitrate to 2 Mbps to start (because half the resolution = half the space used; I think that's fair, right?).

For me these are both unwatchable, but they look almost the same. I think this is where you would say there's no reason to use something modern.

BUT, let's halve this again: at 1 Mbps we need 50% less space, and HEVC is way ahead of MPEG-2.

So decide whether you still think MPEG-2 is better just because the modern codec costs roughly 30 to 50% more "CPU" work (well, this is actually only true for 1080p; at 480p my CPU was never at 100% usage during the export, so the final time was pretty much the same).

Also, I did not use AV1 for this test (even though it would have given even better quality for the same file size), as I think it's a bit early to jump into it and it really needs double the CPU "work".
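
For anyone who wants a number to go along with screenshots like these, ffmpeg can compute an objective metric such as SSIM against the original. A minimal sketch, assuming the encodes from a test like the one above already exist and have the same resolution as the source (the filenames are placeholders):

```python
import subprocess

SOURCE = "source_1080p.mp4"                          # assumed original clip
ENCODES = ["mpeg2_5.5mbps.mpg", "hevc_5.5mbps.mp4"]  # assumed 1080p encodes

for encode in ENCODES:
    # The ssim filter compares the first input (the encode) against the
    # second (the source) and prints an average SSIM score in ffmpeg's log.
    # Closer to 1.0 means closer to the original.
    subprocess.run([
        "ffmpeg", "-i", encode, "-i", SOURCE,
        "-lavfi", "ssim", "-f", "null", "-",
    ], check=True)
```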


After many tests with video codecs, I came to the following conclusions:

  1. The best ratio of image quality to compression is the AV1 codec (of all the codecs Shotcut has at the moment).
  2. For my tasks, I will always choose a codec that can be decoded in hardware on my equipment (if my Intel video card can play AV1 without problems, I will encode in it, but if I am encoding for a device whose best available codec is H.264, I will encode with that codec); see the sketch after this list for one way to check.
  3. There is no universal codec; they are constantly evolving.
  4. I do not see the point in encoding video with a constant bitrate, unless it is a let's-play game recording, where encoding speed is important.
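
On point 2, here is a minimal sketch of one way to probe what an ffmpeg build exposes for hardware-accelerated decoding; the decoder suffixes checked (Quick Sync, CUVID/NVDEC, VA-API) are assumptions about the relevant hardware families, and seeing a decoder listed does not guarantee smooth playback on every device:

```python
import subprocess

# List the decoders this ffmpeg build knows about and pick out the
# hardware-backed ones.
result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-decoders"],
    capture_output=True, text=True, check=True,
)

hw_suffixes = ("_qsv", "_cuvid", "_vaapi")
for line in result.stdout.splitlines():
    if any(suffix in line for suffix in hw_suffixes):
        print(line.strip())
# A line containing e.g. "hevc_qsv" or "av1_cuvid" suggests that codec
# can be decoded in hardware on this machine.
```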

…but most people aren’t storing their videos as MPEG-2. They would pay 3x in storage costs if they did, and the quality wouldn’t be any better than an HEVC or AV1 encoding at a third of the MPEG-2 bitrate. @daniel47 did a great job demonstrating the quality/bitrate relationship. This is why MPEG-2 is dead outside of legacy TV, satellite, and DVD systems.

It isn’t just hard drive space being wasted, it’s also network bandwidth, which is precious in third-world countries and over satellite broadcasts. Network bandwidth is also expensive to distribution companies like Netflix and Prime and YouTube who serve videos to millions of customers. Any format that gets the bitrate down without hurting image quality is money saved on bandwidth that goes straight to the bottom line.

That’s only half the story. Lower bitrate is not a problem if codec efficiency improves at the same rate or better, which it has.

Consider this YouTube video of underwater animals:
https://www.youtube.com/watch?v=YFmV_MRSD7M

Here is a screenshot in both 4K and 360p modes, taken at the 1:41:59 mark:

Can you argue in good faith that the 4K version doesn’t look any better than the 360p version? I chose this frame because it has many disadvantages for the 4K image. It’s underwater (refraction issues), it’s a moving animal (some blur), it’s a screenshot on a 1440p monitor rather than full native 2160p, and it’s in VP9 format rather than AV1 (which would look even better). Even with all those problems reducing sharpness, the 4K image still looks significantly better than the 360p image.

Played back in 4K, does this video look ugly? No, it looks spectacular even on a 75" TV. Meanwhile, 360p is unwatchable on a 75" TV. When you hear users complain that their videos look ugly on YouTube, it is generally user error. They had bad production quality, they uploaded a file that was overly compressed, and they probably had less resolution than 1440p. YouTube provides a higher bitrate and/or better codec for videos uploaded at 1440p or higher, which is why the underwater video looks so good. This quality level is available to anyone that actually knows what they’re doing.

If “content” means TikTok brainrot, then I agree with you. But step onto any college campus, and you’ll see that desktops, laptops, and TVs still dominate for video gaming, movies, sports, and academic work like presentations. Quality differences are very noticeable and appreciated there.

I don’t understand this qualifier. ATSC 3.0 is the American standard for broadcast television. It uses HEVC for 4K and 8K broadcast. HEVC is also used for Blu-ray discs. All of these platforms look fantastic. HEVC is not just a “web thing”. I don’t even understand the point you’re trying to make anymore. The only reason I’m replying is because other people new to video encoding may find value in this discussion.

Yes, a bad reason. It was Microsoft’s attempt at gaining dominance in the video encoding industry. They almost got there when VC-1 was ratified. Fortunately, better options came along, and WMV became a dead format.
