Color/luma adjustments without RGB conversion

I’m coming from AVISynth which provides adjustments for levels, contrast, brightness, and gamma without RGB conversion. So I am looking for the same in MLT/Shotcut.

I see brightness and gamma filters in MLT that work without RGB conversion. However, the levels filter is from frei0r which requires RGB conversion. For contrast, I can use either the lift/gamma/gain function (as provided by Shotcut) or I can use the levels filter, but either way, there’s an RGB conversion.

Also, the MLT gamma filter is not exposed in Shotcut. So I’d have to use the levels filter for gamma adjustment in Shotcut, with the RGB conversion.

Perhaps I am making a bigger deal of the RGB conversion than it deserves. But seeing as how MLT has a gamma filter already written, it would be nice to have it exposed in Shotcut. It would also be nice to adjust levels and contrast without an RGB conversion.

Are there reasons why these adjustments should be done in RGB perhaps?

I’m working with AVCHD source files (which are YUV of course). I put them through the adjustments above + denoising with AVISynth and export to DNxHD for archiving, followed by x264/MP4 for viewing. I can do all that without an RGB conversion in AVISynth. Granted, going from YUV to RGB and back to YUV once around probably is no big deal, but why do it at all if it is avoidable?

Anyhow, I have a long list of reasons for wanting to move away from AVISynth that I won’t bore you with. I am really liking MLT. I’ve also looked at Kdenlive which provides the MLT gamma filter. But I like other aspects of Shotcut over Kdenlive and would prefer to use Shotcut. Editing and transitions are easier with Shotcut. And the control over limited vs full range + the histogram all work better in Shotcut. And I love the detachable windows in Shotcut. Anyway…I really like Shotcut! I’m just going for “perfect” with the above.

Would love to know your thoughts. Thanks.

What kinds of color adjustments are you looking to make? Without leaving YUV space, you could modify the Y, U and V channels. But if you want to adjust the Red channel, there will be a conversion - whether the filter converts to RGB and back to YUV without you knowing doesn’t really change the math.

It would help to know exactly what mathematical operations you want to perform to determine if they are possible in YUV space.

I generally just make minor adjustments to brightness, contrast, and gamma to make my home videos look better. Nothing fancy. I just have grown accustomed to doing it without an RGB conversion. And in AVISynth, I just look at the Y channel to avoid clipping. On the histogram, you can see Y values that go outside the broadcast-safe range. And you can see that pretty well in Shotcut also, which I appreciate. Kdenlive, in contrast, converts to RGB and then plots the histogram, so it clips the parts that fall outside the broadcast-safe range. And it is plain as day when you compare histograms in different tools.

I usually scroll through a clip and identify the high/low points, shift the Y channel a bit (with the Levels filter in AVISynth) to get it in the broadcast safe range as much as possible, then apply a limiter. You get less clipping that way. And for really dark clips, I usually adjust gamma because I get a nicer result that doesn’t lighten the really dark areas as much. But I’m just eyeballing that.
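For what it’s worth, the operation itself is tiny. Here is a minimal numpy sketch of that shift-then-limit idea on a Y' plane; the frame data, offset and limits are placeholders for illustration, not anything AVISynth or Shotcut actually does internally:

```python
import numpy as np

# Hypothetical 8-bit luma (Y') plane standing in for a decoded frame.
rng = np.random.default_rng(0)
y = rng.integers(0, 256, (1080, 1920), dtype=np.uint8)

def shift_and_limit_luma(y_plane, offset=-4, low=16, high=235):
    """Shift the Y' channel by a small offset, then clamp to broadcast range.

    Working in float avoids uint8 wrap-around before the final clip.
    """
    shifted = y_plane.astype(np.float32) + offset
    limited = np.clip(shifted, low, high)   # the "limiter" step
    return np.rint(limited).astype(np.uint8)

y_safe = shift_and_limit_luma(y, offset=-4)
```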

I never do any serious color grading with RGB. That would be too much work for home videos. The only adjustment I’ve ever made to the UV channels is white balancing, where I shift them a bit. AVISynth has a histogram for the U and V channels also, and there’s a function that auto-centers them. But it has to be way off for me to even do that. On that note, I noticed the White Balance filter in Shotcut converts to RGB.

FFmpeg/libav has an eq filter that looks like it can adjust brightness/contrast/gamma without converting to RGB. But it is difficult to use without a histogram and without seeing the video clip. It’s much easier having it in an NLE! Granted, I could keep using AVISynth; it’s just a pain. It only runs on Windows and I’m trying to move to Linux. VapourSynth seems to be taking its place and has many of the same functions, but it doesn’t handle audio, which is very limiting. And neither AVISynth nor VapourSynth offer proper editors with timelines. And I’ve decided I need that in my life. Even if it means an RGB conversion! :). I’ve just read over and over how lossy it is, so I’ve always avoided it.
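For reference, here is roughly how that eq filter could be driven from a script. The file names, codec settings and adjustment values are made up for illustration, and whether eq really stays in YUV is just my reading of its documentation:

```python
import subprocess

# Hypothetical adjustment values; eq takes brightness, contrast and gamma options.
eq = "eq=brightness=0.02:contrast=1.05:gamma=0.95"

subprocess.run([
    "ffmpeg", "-i", "input.mts",   # hypothetical AVCHD source file
    "-vf", eq,                     # apply the luma/chroma adjustments
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "output.mp4",
], check=True)
```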

The “Luminance” channel in the histogram in Shotcut is actually the “Y” channel from the rendered YUV image. The Red, Green and Blue channels are converted from the rendered YUV image. I had considered making a YUV histogram also. But I don’t think most people would know what it is or how to use it. Most (all?) NLE color grading workflows operate in the RGB space.

Conversion from YUV to RGB is not mathematically lossless because not all YUV values can be mapped to the RGB color space. But video images that come from a camera originated in RGB because camera sensors are RGB. So unless your source material originated from a non-RGB source and used YUV values that don’t map to the RGB colorspace, you shouldn’t have any pixel values that get grossly modified. Any loss in the conversion should be limited to rounding errors of a single bit.
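A quick way to convince yourself of that single-bit claim is to round-trip camera-style RGB pixels through 8-bit YCbCr and back. This little numpy sketch uses full-range BT.601 equations; Shotcut/MLT may use different matrices and limited range, but the rounding behaviour is analogous:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    y, cb, cr = ycc[..., 0], ycc[..., 1] - 128.0, ycc[..., 2] - 128.0
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.stack([r, g, b], axis=-1)

rng = np.random.default_rng(0)
rgb = rng.integers(0, 256, (100000, 3)).astype(np.float64)  # camera-originated RGB

ycc = np.rint(rgb_to_ycbcr(rgb))                          # quantize to 8-bit YCbCr
rgb_back = np.clip(np.rint(ycbcr_to_rgb(ycc)), 0, 255)    # back to 8-bit RGB

# For in-gamut pixels, the round-trip error stays within one code value.
print("max RGB round-trip error:", np.abs(rgb - rgb_back).max())
```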

For Shotcut, we aspire to limit the pixel errors to one bit. There is a relevant thread here:

We also aspire to keep the user interface clean and intuitive. So I think it is unlikely that we will offer multiple versions of the same filter just because they operate in different colorspaces. So we would need a good reason to switch to using a different filter on the backend. I’m not sure if a pixel error of “1” is enough reason to change the filter. Maybe.

While it is mathematically lossy, it is not visually lossy. So I guess you would need to decide where your priorities lie. For Shotcut, this has not been a compelling issue, so we have been fine with colorspace conversions.

While we’re on the subject, how about a full-range lift control in the Color Grading filter as I have suggested several times in the past?

Thank you Brian, this is exactly what I was looking for, and it tells me that I should not be so worried about an RGB conversion, particularly if I limit it to one round trip. And yes, all my source material originates from a camera. My priority is simplifying my overall workflow and no longer standing on my head so much to run everything through AVISynth.

Even so, the gamma filter already exists in MLT and seems like it would be easy enough to add to Shotcut. :wink:

In any case, I appreciate Shotcut because, from a workflow perspective, it does exactly what I want/need in a simple, uncomplicated way. And it runs on both Windows and Linux. It’s good stuff!

If you’re only trying to make home videos look better, why are you worried about broadcast specs?

I’m a perfectionist. Pragmatism is kicking in more and more as I go along, though! At the same time, I’ve just taken for granted the start-to-finish YUV workflow that AVISynth has always had. It’s only been in trying to move away from it that I have discovered nothing else operates that way.

I’m not familiar with Avisynth at all.

I’m working on an ffmpeg script to bring a camcorder output reasonably close to broadcast specs. Will share later.

If you’re going to go to those lengths you need to pay attention to gamma. I recently posted on this and will try to find that post for you.

As promised, there is some discussion of setting gamma toward the end of this thread:

Unfortunately, Shotcut’s scope lacks the graticule lines needed to do this really right. You need graticule lines at digital 235, digital 111 and digital 16. I cheated by writing my own scope code. Contact me if you are interested in it.

Or, you could make gray patches at 235, 111 and 16 and get Shotcut to make you a split screen between your camera video and the gray patches and try to match them visually. Obviously you would make adjustments to your camera video and not the gray patches.
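If it helps, here is a hedged sketch of how such a patch image could be generated. It assumes Shotcut reads the PNG as full-range RGB and converts a neutral gray of value v to limited-range luma as Y' = 16 + 219·v/255, which may not match your pipeline exactly:

```python
import numpy as np
from PIL import Image

# Target limited-range luma codes for the three patches.
target_luma = np.array([16, 111, 235])

# Full-range RGB gray values that should land on those luma codes after a
# standard full-to-limited conversion (assumption noted above): [0, 111, 255].
rgb_values = np.rint((target_luma - 16) / 219 * 255).astype(np.uint8)

height, patch_w = 360, 320
strip = np.zeros((height, patch_w * len(rgb_values), 3), dtype=np.uint8)
for i, v in enumerate(rgb_values):
    strip[:, i * patch_w:(i + 1) * patch_w, :] = v

Image.fromarray(strip).save("gray_patches_16_111_235.png")
```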

The Y'PbPr signal in AVCHD is not constant luminance - some of the luminance information is carried in the chroma and vice versa.

If you adjust Luma, you get a hue shift.

You need to convert to something like 1976 CIE Yu’v’ and linearise the Luma to luminance, then you can make luminance adjustments without shifting hue.
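For anyone curious, this is my own rough reading of that suggestion in code, not anything Shotcut or MLT does: linearise R'G'B' with the BT.709 inverse OETF, then take luminance Y and CIE 1976 u'v' chromaticity, so Y can be adjusted while u'v' (and therefore hue) stays put:

```python
import numpy as np

def bt709_inverse_oetf(ep):
    """Linearise normalized non-linear R'G'B' (0..1) per the BT.709 curve."""
    ep = np.asarray(ep, dtype=np.float64)
    return np.where(ep < 0.081, ep / 4.5, ((ep + 0.099) / 1.099) ** (1 / 0.45))

def rgb_prime_to_Yuv_prime(rgb_prime):
    rgb = bt709_inverse_oetf(rgb_prime)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Linear RGB -> XYZ using BT.709 primaries and D65 white point.
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    denom = x + 15 * y + 3 * z + 1e-12
    return y, 4 * x / denom, 9 * y / denom   # Y, u', v'

Y, u, v = rgb_prime_to_Yuv_prime(np.array([[0.5, 0.4, 0.3]]))
```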

Yes! Showing broadcast range on the histogram is a great idea. (AVISynth does this, btw ;)). The issue with AVISynth is that it’s a scripting language, which has great value in many respects – you have absolute control over the filters, the filter settings, and the order of the filters (which requires that you know what you’re doing). It does Yadif double-framerate deinterlacing and reverse telecine, both of which I have used over the years. It does it all. It just sucks for editing. That’s the big gotcha. There is no traditional timeline view. You can cut, apply transitions and effects, and so on, but it’s all in a scripting language, not visual.

For the YUV histogram piece, there is a script editor that shows a preview screen where you can overlay the YUV histogram. And it shows the full range with markers for broadcast safe. And it has a Levels function along with gamma, brightness, contrast, and white balance functions that all operate in YUV and you can see the effect on the histogram as you apply the filters. It’s all very easy. For most of the filters, you have the option of YUV or RGB – there is a parameter where you set that. So it doesn’t have to be YUV; it just doesn’t have to be RGB either.

AVISynth is much like what the MLT framework would be without Shotcut or Kdenlive. Which is what brought me to MLT and to Shotcut. Plus MLT runs natively on Linux! Big plus. There is a successor to AVISynth called VapourSynth that runs on Linux that I am toying around with. I’m not sure if it will work yet though – it has some serious limitations with audio. But it has the same YUV histogram and filters as AVISynth. And it has the same difficulties with editing – no timeline, very simple. I can’t have it all, I guess!


st599, that’s interesting. I didn’t know that. I’m just getting started with AVCHD actually, so this is not something I’ve observed yet. But I will watch out for it. This could definitely change things. I’ve been working with DV, HDV, and MJPEG (from old 8mm film scans) up to now.

The Waveform Scope in Shotcut has lines at 0 and 100 IRE which map to digital 16 and 235 respectively. Also, if you run your mouse over the scope, a popup appears that displays the digital value and corresponding IRE value. Why do you need 111?

In Shotcut, if you run your mouse over the luminance histogram, a popup will appear showing the digital value and corresponding IRE for that point in the histogram. I toyed with adding lines at 100 and 0 IRE. But it is difficult to see a histogram bar for a solid white or black image because the IRE line would cover it. I could try playing with transparency.
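For reference, the code-to-IRE mapping those lines and the popup imply (presumably linear between 16 and 235) is simple to compute, and it is also where 43 IRE for digital 111 comes from:

```python
# Assumed mapping: 8-bit code 16 = 0 IRE, code 235 = 100 IRE, linear in between.
def code_to_ire(code):
    return (code - 16) / (235 - 16) * 100

for code in (16, 111, 235):
    print(code, "->", round(code_to_ire(code), 1), "IRE")
# 16 -> 0.0, 111 -> 43.4, 235 -> 100.0
```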

Where? I have the scope open now and am looking and don’t see anything like that.

That’s why I wrote my own scope program. The traces and graticule on mine are nicely visible and readable.

Here is why I added a graticule line at 111. Let me know if I need to explain this.

https://docs.google.com/spreadsheets/d/1wGZd5_OEPSE08CC_RpP5X_hTxOgapWw7pihrKeFOWls/edit#gid=0

View > Scopes: [screenshot of the menu]
Maybe now you can see it. You actually have to move the mouse over the Video Waveform.
[screenshot of the mouse-over popup on the Video Waveform]


You might be using an old version. I added it in November:

I’m using version 18.12.23.

I’m talking about the “0” and “100” graticule lines, not the mouse-over popup. I see nothing like that on my copy.

I understand the spreadsheet. But it doesn’t explain why “Shotcut’s scope lacks the graticule lines needed to do this really right.” I haven’t seen any other video waveform display that has a graticule line at 43 IRE. 0 and 100 are useful for obvious reasons. And some include 7.5 IRE for NTSC pedestal. But I’ve not seen 43. The whole rest of the world must be getting by somehow without it.

Here is the scope I wrote and use. Note the “IRE” designation at the top left of the scope.