Hi,
I’ve been working with Shotcut for a while now and mostly enjoy the program. Yesterday, I started a new project in which I want to add some text and make it fade in and out. I did the usual steps: I created a transparent clip on an extra track, added the text, and then added the fade in and fade out filters. The fade out filter worked just fine, but the fade in filter is for some reason not working.
With the standard settings, the text should fade in from black, meaning it is black at first and then becomes white. In my case, though, the text just stays black and then suddenly appears instead of fading. If I have the filter adjust the opacity instead of fading from black, the text disappears completely. As I said, both variants work correctly for the fade out filter.
What is even worse: after I gave up on the text, I tried the fade in filter on a regular video, and there it got even more obscure. Whenever I added the fade in filter by dragging the top corner of the clip in the timeline, it actually worked. But as soon as I clicked on the filter in the filter menu, or tried adding it via the filter menu, it stopped working, and the entire video just turned black.
To my knowledge I have the newest version, Shotcut 21.06.29, and I am working on a Windows 10 laptop.
If anyone has had similar issues or has an idea how to fix this, please let me know. I’ve tried everything I could think of and am considering switching to another video editor if the bug persists.
FYI: This does not happen for me with the recommended version of Shotcut (21.03.21).
It would help if you posted a screenshot of the whole Shotcut window, with the playhead on the clip near the end of the fade-in filter and the filter’s parameters showing.
I downloaded your recommended version, but it still doesn’t work. I have attached a screenshot of my Shotcut window. I can also send a screen recording showing the filters in action if that helps.
It looks like you are using the GPU versions of the fade-in/fade-out filters, which so far as I know do not work properly in the current versions of Shotcut.
Are you using an old project file (.mlt) which has these set, or did you start Shotcut with the “--gpu” parameter? If so, try again without it.
EDIT: I just tried with the --gpu parameter and the fade-in filter does NOT work correctly, so I am fairly sure that this is your problem.
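For reference, this is roughly what the two launches look like from a command prompt (I am assuming the default install path here; adjust it to yours):

```
:: With the GPU (Movit) filters -- this reproduces the broken fade-in:
"C:\Program Files\Shotcut\shotcut.exe" --gpu

:: Without the flag -- the default CPU filters:
"C:\Program Files\Shotcut\shotcut.exe"
```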
Yes, I’m sure the problem is due to using the old GPU filters, which are no longer supported. If I use the non-GPU filters it works; if I use the GPU ones it doesn’t.
Some of the video filters exist in two forms: a “CPU” form and a “GPU” form. By default, the CPU versions are the ones you get when you click on “Fade In Video”. However, if you look closely at your image, you will see that most filters have an icon next to them that looks like a monitor; these are CPU filters. Your “Fade In Video” has an icon that looks like a computer chip, which means it is a GPU filter.
In previous versions of Shotcut there was a setting you could select to use GPU filters. Later versions no longer expose that setting, as the GPU filters often caused problems (such as the one you are experiencing). The only way I know of enabling it in later versions is to start the application with the “--gpu” parameter. If you are starting Shotcut via a Windows shortcut, you probably have this set. Right-click the shortcut, select “Properties”, and edit the Target field to remove this option.
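For example, the shortcut’s Target field might look something like this (default install path assumed):

```
Before: "C:\Program Files\Shotcut\shotcut.exe" --gpu
After:  "C:\Program Files\Shotcut\shotcut.exe"
```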
The project where this problem occurred is a new project which I created a few days ago. It does have the .mlt extension, though.
The weird thing about the whole problem is that the fade out filter has been working just fine the entire time. I assumed that the fade in and fade out filters are fairly similar, which is why I am even more confused that one works fine and the other doesn’t.
OK, the strangest thing just happened. As an experiment, I added the option “--gpu” to my “shotcut.exe” call. Sure enough, I got the GPU versions of the filters. I exited Shotcut and started it again without the “--gpu” parameter, and I still got the GPU versions of the filters!
I surmise that a flag was set in the Windows registry, so I re-downloaded Shotcut and re-installed it, ticking the box that asks for the registry settings to be cleared. I now have a clean installation of Shotcut that uses the CPU (not GPU) filters, as it should.
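If you want to check before re-installing: Shotcut is a Qt application, and I believe it keeps its settings under HKEY_CURRENT_USER\Software\Meltytech\Shotcut. Something like this from a command prompt should show whether a GPU flag is set there (the path and value name are from memory, so treat this as a sketch and verify before deleting anything):

```
reg query "HKCU\Software\Meltytech\Shotcut" /s | findstr /i gpu
```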
So my suggestion to you is to re-install Shotcut 21.03.21 (the recommended version) and make sure you tick the box to clear the registry entries. Then get rid of the project file (.mlt) that references the GPU filters (e.g. fadeInMovit) and start a new project from scratch. That should work.
If you turn the setting off, Shotcut asks to restart, and if “--gpu” is set you are back where you started. If “--gpu” is not set, Shotcut restarts and then crashes, presumably because of the GPU filter in the MLT file. So the only option is to start the project again, as editing the MLT file to get rid of the movit (GPU) filters is not trivial.
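For anyone who wants to try editing it anyway: the .mlt file is plain XML, and the GPU filters can be recognized by the “movit.” prefix in their mlt_service property. A GPU fade-in entry looks roughly like this (the property names and keyframe values here are from memory, so take it as a sketch rather than an exact dump; I believe the CPU equivalent uses the “brightness” service instead):

```xml
<filter id="filter0">
  <!-- the "movit." prefix marks a GPU filter -->
  <property name="mlt_service">movit.opacity</property>
  <property name="shotcut:filter">fadeInMovit</property>
  <property name="opacity">0=0; 48=1</property>
</filter>
```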
Well, apparently that was indeed the solution to my problem.
Thank you very much, dude. And thanks to everybody else who took their precious time to try to help me. <3
Edit: I want to mention that everything works in a new project. If I try opening an old project, however, the program crashes. But I don’t think this will be a big problem, as I can just start the project from the beginning again.