The same issue happens to me under certain circumstances. If the video clip and the timeline/project resolution are the same, I get no stutter or speed change, and CPU usage from the LUT effect rises only 2%. But if the clip and the timeline are different resolutions, the video slows to 75% speed, the audio stutters, and CPU usage rises 13%, yet overall CPU sits at only 25% with no single core at 100%.
This even happens if the timeline resolution is 1080p 29.97fps and my video clips are 360p 29.97fps proxies. Playback without the LUT is flawless and uses only 11% CPU. If I change the timeline to 360p, playback with the LUT is flawless too. On a 1080p timeline… no go.
I echo that Shotcut is awesome and 19.02.28 has been very stable for me.
This is normal and not a bug. All filters require processing power. Playback is not able to consume all of your CPU because the filter processing is not fully optimized. Again, not a bug; it's just the way preview works.
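To make the cost concrete: a 3D LUT is a per-pixel table lookup, so its cost scales with the frame area being processed. Here's a rough sketch of that idea (this is not Shotcut's actual implementation; the `apply_lut_nearest` helper, the 33-point LUT size, and the use of NumPy are all illustrative assumptions):

```python
# Rough illustration only (NOT Shotcut's code): a 3D LUT is applied as
# a per-pixel lookup, so the work per frame grows with frame area.
# A 1080p frame has 9x the pixels of a 360p proxy frame, which is one
# reason the same LUT that is cheap on a 360p timeline gets heavy when
# frames are processed at 1080p.
import numpy as np

LUT_SIZE = 33  # a common .cube LUT lattice resolution (assumption)

def apply_lut_nearest(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Nearest-neighbor 3D LUT lookup for an 8-bit RGB frame."""
    # Map 0..255 channel values onto 0..LUT_SIZE-1 lattice indices.
    idx = (frame.astype(np.uint32) * (LUT_SIZE - 1) + 127) // 255
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity LUT: lut[r, g, b] maps back to roughly (r, g, b).
grid = np.linspace(0, 255, LUT_SIZE).astype(np.uint8)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

proxy = np.zeros((360, 640, 3), dtype=np.uint8)   # 360p proxy frame
full = np.zeros((1080, 1920, 3), dtype=np.uint8)  # 1080p timeline frame

out = apply_lut_nearest(full, lut)
ratio = (full.shape[0] * full.shape[1]) / (proxy.shape[0] * proxy.shape[1])
print(ratio)  # 9.0 -- nine times the lookups per frame at 1080p
```

So when a 360p proxy plays on a 1080p timeline, each frame is scaled up and then filtered at the larger size, which fits the jump in CPU usage reported above; and if that filtering path is not fully parallelized, it can stutter without any one core hitting 100%.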