Not sure if you'll like what I propose, but a different approach is possible with the 1.37 firmware:
It already comes with an extended omxplayer build and three services: omxplayer, omxplayer1, and omxplayer2, which run omxplayer three times on different video layers. The starter scripts for those services are in /boot/data/video/layer[0-2].sh.
Normally those services won't start, as the referenced video files are not present on the SD card, but you can easily change this by either editing the layerX.sh scripts or by providing the videos. The layering can be used for smooth alpha blending between videos.
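To make the idea concrete, here is a minimal sketch of what one of those starter scripts could look like. This is a hypothetical reconstruction, not the actual contents of /boot/data/video/layer1.sh: the video filename and the exact omxplayer options used by the firmware are assumptions, though --layer, --alpha, --loop, and --dbus_name are standard options of the stock omxplayer.

```shell
#!/bin/sh
# Hypothetical layer1.sh sketch -- the real script shipped with the firmware may differ.
# Starts omxplayer looping on video layer 1, fully transparent (alpha 0),
# with its own D-Bus name so this instance can be addressed individually later.
VIDEO=/boot/data/video/layer1.mp4   # assumed filename

# If no video is provided on the SD card, exit cleanly so the service stops.
[ -f "$VIDEO" ] || exit 0

exec omxplayer --no-osd --loop \
    --layer 1 --alpha 0 \
    --dbus_name org.mpris.MediaPlayer2.omxplayer1 \
    "$VIDEO"
```

The per-instance --dbus_name is the important part: it is what lets three players run side by side without their control channels colliding.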
Everything can also be controlled by TA sound; that's what the various omxplayer commands are for (see the Lua reference).
The omxplayer instances are controlled internally via D-Bus, where each instance has its own D-Bus address. It is unfortunately an unfinished feature: some video-stopped triggers are still missing, or can only be achieved with a Lua callback. And I never documented it or put together a complete demo.
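As an illustration of that D-Bus control path, here is a sketch of how one instance could be faded in from the command line. The D-Bus name is an assumption carried over from the hypothetical layer script above; the SetAlpha method and its argument shape (a dummy object path plus an int64 alpha of 0-255) come from the stock omxplayer's own dbuscontrol.sh helper.

```shell
#!/bin/sh
# Hypothetical fade-in via the stock omxplayer D-Bus interface.
# Assumes the instance was started with --dbus_name org.mpris.MediaPlayer2.omxplayer1.
DEST=org.mpris.MediaPlayer2.omxplayer1

# Ramp alpha from 0 (invisible) to 255 (opaque) in a few steps.
for a in 0 64 128 192 255; do
    dbus-send --print-reply --session --dest="$DEST" \
        /org/mpris/MediaPlayer2 \
        org.mpris.MediaPlayer2.Player.SetAlpha \
        objpath:/not/used int64:"$a"
    sleep 0.1
done
```

Cross-fading two layers is then just ramping one instance's alpha down while the other's goes up; since each instance has its own destination name, the two ramps are independent.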
I was working on a video topper based on some videos from a pup pack, and made a short demo of the alpha blending:
As you can see, it is very smooth. The videos also start instantly, since the player software is already running in the background. There is no playback of images; instead there are "still videos" that are played in endless loops (like the attrack.mp4 from the pup pack). It shows the sparkling Enterprise.
PS: and just to be clear, that's not a pre-rendered video with a blending transition; those are two separate video files that are alpha-blended at runtime on a Pi 4.