Melt renders fade in/out (alpha) differently

Hello,

I made a Shotcut project in which I overlay a PNG with transparency (semi-transparent black and white) over a static, flat GIF.

If I render it in Shotcut, it renders fine: https://youtu.be/QSEcQqpazvE

If I render it with melt, the fade only seems to affect the black part; the white part appears immediately on fade-in and disappears suddenly at the end of the fade-out: https://youtu.be/2TqNta9pvcQ

melt fadetest.mlt -progress -consumer avformat:test2.mp4 vcodec=libx264 crf=14 rc_lookahead=10 keyint_min=15 keyint=15 weightp=1 bframes=2 subme=6 ref=2 preset=faster

I don’t know whether rendering with melt directly is a supported feature, so I’m not sure whether this is a bug.

The fade is 5 frames long.

BG image: commons.wikimedia.org/wiki/File:PM5644-1920x1080.gif
FG image: i.imgur.com/BhLp3Ot.png
MLT file: codepad.org/7CojUKhO
(sorry, new user; can only use 2 links)

melt 6.2.0
ffmpeg version 3.1.4
Shotcut 16.10.01
Linux 64bit

Obviously, there is something different about the melt build you are using from a Linux distro, but I do not know what; I do not manage or support those builds (there are too many of them). Shotcut uses the latest unreleased versions of some dependencies such as MLT. You can use the melt that comes with Shotcut by running its launcher script at Shotcut/Shotcut.app/melt (not Shotcut.app/bin/melt). Of course, melt rendering is supported, because it is what Shotcut actually runs; however, Shotcut runs its own melt and not that of a distro package.
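
For example (assuming the portable Linux layout, with the Shotcut archive extracted into a Shotcut directory next to the project file), re-running the original render through the bundled launcher script would look something like:

Shotcut/Shotcut.app/melt fadetest.mlt -progress -consumer avformat:test2.mp4 vcodec=libx264 crf=14 rc_lookahead=10 keyint_min=15 keyint=15 weightp=1 bframes=2 subme=6 ref=2 preset=faster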

Thanks! I checked the files of shotcut-bin from the AUR on Arch Linux and found two melt files. The binary one uses the system libraries and causes the same issues as the system-installed melt, so you have to use the launcher-script one (a quick way to confirm which melt you are running is shown after the list). The locations across systems are:

Windows:

C:\Program Files\Shotcut\melt.exe

Mac OS X and Linux (portable):

Shotcut/Shotcut.app/melt

Linux (shotcut-bin from Arch Linux AUR at least):

/opt/shotcut/melt
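
As a quick sanity check (assuming your melt builds support the -version flag), you can compare what each one reports to make sure you are not accidentally picking up the distro melt:

/opt/shotcut/melt -version
melt -version

They should normally differ, since Shotcut bundles a newer, unreleased MLT than the distro package.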