Hmmmm.... Just noticed this thread. Let me tell you a little about this (a blast from 1987).
I used a MIDI-based DX7 as a musical typewriter into a PC-based DAW (Cakewalk). Once I got the composition into the ballpark, I transcoded the MIDI into a proprietary token-based interactive language of my own design. Each 'track' of notes had 'opcodes' that could change things about the execution of the notes: sync to a note clock at some chosen resolution (32nd, 16th, 8th), note transposition based on a global tone center, starting other tracks, jumping to sequence subroutines (each track had its own 'stack'), randomized playing, and on and on.
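To give a feel for what a token stream like that might look like, here is a minimal sketch in C of one track's interpreter. The opcode values, struct layout, and names are all invented for illustration; the original 1987 format is not public and certainly differed. It shows two of the features mentioned: transposition against a global tone center, and sequence subroutines via a small per-track stack.

```c
/* Hypothetical sketch of a token-based track interpreter.
   NOT the original format -- opcodes and names are made up. */

enum {
    OP_END  = 0xF0,  /* stop this track                           */
    OP_XPOS = 0xF1,  /* next byte: transpose offset in semitones  */
    OP_CALL = 0xF2,  /* next byte: index of a sequence subroutine */
    OP_RET  = 0xF3,  /* return from a subroutine                  */
};
/* Any token below 0xF0 is a plain note: an offset from tone center. */

#define STACK_DEPTH 4

typedef struct {
    const unsigned char *pc;                  /* current token pointer  */
    const unsigned char *stack[STACK_DEPTH];  /* per-track return stack */
    int sp;
    int transpose;                            /* current offset         */
    int running;
} Track;

/* Advance one note-clock step: execute opcodes until a note token is
   consumed, and return the transposed note, or -1 when the track ends. */
int track_tick(Track *t, const unsigned char *const *subs, int tone_center)
{
    while (t->running) {
        unsigned char tok = *t->pc++;
        switch (tok) {
        case OP_END:
            t->running = 0;
            return -1;
        case OP_XPOS:
            t->transpose = (signed char)*t->pc++;
            break;
        case OP_CALL: {
            unsigned char idx = *t->pc++;
            t->stack[t->sp++] = t->pc;   /* push return address */
            t->pc = subs[idx];
            break;
        }
        case OP_RET:
            t->pc = t->stack[--t->sp];
            break;
        default:                          /* plain note token */
            return tone_center + t->transpose + tok;
        }
    }
    return -1;
}

/* Demo data: a two-note motif reused at two transpositions.
   With tone center 60 this plays 60, 62, 72, 74, then ends. */
static const unsigned char demo_sub0[] = { 0, 2, OP_RET };
static const unsigned char *const demo_subs[] = { demo_sub0 };
static const unsigned char demo_main[] = {
    OP_CALL, 0, OP_XPOS, 12, OP_CALL, 0, OP_END
};
```

The point of the subroutine stack is exactly what you'd guess: a motif is stored once and replayed under different transpositions, which keeps the token streams tiny.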
MIDI is a very linear format, not suited to an interactive environment like a pinball machine. A composition will not play back the same way twice when it is reacting to pinball rule state. So even if you knew how to parse the tokens of the track streams, you would not get the same results that happen when the game is pushing sound requests and the sound system is reacting to them.
A simple example: some background music is playing. The player qualifies a 2X scoring timer for some number of seconds, and the game requests the 2X scoring modifier. A sound string fires up, playing a cowbell on each quarter note of the music (no matter which background is currently playing). This lets the player know they are in this state of grace, and it is still musical (more cowbell). When the double-scoring state ends, another sound code is serviced by the sound system and the cowbell track is shut down.
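The cowbell overlay can be sketched like this. Everything here is invented for illustration (the sound codes, the struct, the tick resolution); the real system ran many such overlay tracks, but the shape of the idea is just: the game pushes a code, the sound system flips a flag, and the overlay fires in sync with the shared note clock regardless of which background tune owns that clock.

```c
#include <stdbool.h>

/* Hypothetical sketch of the 2X-scoring cowbell overlay.
   Sound codes and tick resolution are made-up values. */

enum { SND_2X_START = 0x40, SND_2X_END = 0x41 };

#define TICKS_PER_QUARTER 8   /* note clock running at 32nd-note resolution */

typedef struct {
    bool cowbell_on;      /* overlay armed by the game?            */
    int  clock;           /* running note-clock tick count         */
    int  cowbells_played; /* for demonstration/inspection          */
} SoundSys;

/* Game side: pushes a sound code; the sound system reacts to it. */
void sound_request(SoundSys *s, int code)
{
    if (code == SND_2X_START)
        s->cowbell_on = true;
    else if (code == SND_2X_END)
        s->cowbell_on = false;
}

/* Called on every note-clock tick, independent of which background
   tune is playing. Returns true when a cowbell is triggered. */
bool note_clock_tick(SoundSys *s)
{
    bool hit = s->cowbell_on && (s->clock % TICKS_PER_QUARTER == 0);
    s->clock++;
    if (hit)
        s->cowbells_played++;
    return hit;
}
```

Because the overlay keys off the shared note clock rather than any particular track, it lands on the quarter notes of whatever music happens to be playing, which is why it always sounds musical.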
I took full advantage of the fact that we were synthesizing the sound in 8 independent channels at run-time. The sounds were very adaptive and quite agile.