Export failing on 2.5 hour project with 8 video tracks

From your Faillog.txt:
I:/Giant Hogweed/Giant Hogweed Videos/Giant_Hogweed_Rex2023/°Videos Proxy/Set1_Gesang_Proxy.mp4

Is your I drive a USB, HDD (hard disk drive), or an internal SSD?
Are you exporting to the same drive? If not, what type of drive are you exporting to?
Is this drive getting hot?
Have you tried a different drive?

I can’t seem to find any information on this chip.
I found an i5-1240P, and that is a laptop chip.
How hot is your CPU getting during exporting?
What are you doing to cool the CPU?

See, this is why I read the Shotcut forum summaries! Never knew that.

You could try exporting just the top two tracks, then add the third track to the exported video and export again… and so on, until the seventh export has all eight tracks.

@hudson555x

I have tried exporting both to the hard disk on which the data is stored and to a USB stick.
The hard disk gets warm, but not hot.
I can’t say whether the CPU gets hot; at least the fan is no louder than usual.
I have just installed 64 GB of RAM, and it worked right away. 32 GB would probably have been enough, because utilization peaked at 29% (just under 20 GB).
I’ll keep testing, as so far I’ve only exported “well” with interpolation. But I think more RAM is the solution.


The log is crystal clear on that. Get more RAM or use fewer tracks.

That’s what I did :grin:

That is impossible. It appears that you do not understand that memory is allocated in thousands of places across many libraries we do not control and have no hope of getting a change accepted into.

According to this article this should be simple and possible!

I did not try this out, but in my career as an embedded firmware developer it was always possible to override the startup procedure with any gcc. Overriding the startup script is usually uncommon, but overriding the linker script is a must when you need a bootloader, and using a bootloader in embedded systems is very common and state of the art.

We do not use gcc on macOS, we do not build every library that is included, and it would require patching every library build that we do build. I do not volunteer to do it considering the effort (which you underestimate), and it is not much better than pausing the child process.

This is a linker option; it should override the dynamically linked stdlib function (start address) for this application.
So I guess it should also work with dynamically linked precompiled libraries.
This might also be interesting for hunting memory leaks when using the logging malloc from glibc in a debug build.

macOS:
I also guess you are using clang on macOS. Theoretically this could surprisingly work there too, because both compilers seem to use the same GNU toolchain for linking.

Do you reboot the computer before using Shotcut, and again before rendering?

A normal restart of Shotcut can reduce its RAM consumption, and the OS normally reclaims any memory an application leaked once it exits.
If Shotcut did not close normally, or another important application crashed because of low memory, a reboot can help.

This might be a dumb question, but have you set your project to proxy files? If not, @bentacular created an excellent video tutorial on how to set up your project for multiple HD files.


This is implemented for the next version, 24.01: pause a running job when the dialog appears, automatically close the dialog when free memory becomes high enough, and automatically resume the paused job on auto-close or when Ignore is clicked. Also, manual Pause and Resume actions were added to the job context menu.


On Windows a reboot isn’t necessary, because you can use RAMMap, a small tool from Sysinternals, to free up RAM. Link to download: RAMMap - Sysinternals | Microsoft Learn. Run rammap.exe after you close all the apps, click Empty, and choose “Empty Standby List”.


This topic was automatically closed after 90 days. New replies are no longer allowed.