Can you tell me the purpose for this?
In this use case, I'm using Reason to build audio backing tracks for live performance, so the main end product is just an audio file (exported from Reason to .wav, then converted to AAC for performance playback). However, I also 'play' MIDI events synchronized with the backing track to automate things like vocal fx patches, guitar fx patches, and (my next automation step) stage lighting (driving a MIDI-compatible DMX controller).
Live playback uses the app 'BandHelper' on iPads to play the audio, display synchronized lyrics / cheat sheets, manage setlists, etc. It also supports issuing MIDI events during playback, but those have to be added separately with a rudimentary UI (nothing like a sequencer) within the app, so it's really only practical for a few simple program changes during a song. However, a future release of the app may support playing an SMF along with the backing audio track, lyrics, etc. That would be much better for my workflow, letting me maintain the events for the external MIDI devices (fx & lights) within Reason tracks so they can easily be synced to the audio within the DAW, rather than as a separate downstream step. (This would be particularly beneficial when editing the audio arrangement, which today is a PIA because the MIDI event timing also has to be re-done in a separate app.)
However, I've been surprised to find out how restrictive Reason's MIDI export function is, and I'm realizing it will mean extra steps to scrub the exported files so they're usable for live playback.
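For what it's worth, this is roughly the kind of scrub step I'm anticipating. A minimal sketch (assuming Python with the mido library; the file names are hypothetical) that drops everything from an exported SMF except the channel messages the fx/lighting gear actually listens for, while preserving tempo and timing:

```python
import mido

# Channel messages the external gear responds to
KEEP = {'program_change', 'control_change', 'note_on', 'note_off'}
# Meta events needed so playback timing and track labels survive
KEEP_META = {'set_tempo', 'time_signature', 'track_name', 'end_of_track'}

src = mido.MidiFile('song.mid')  # hypothetical SMF exported from Reason
dst = mido.MidiFile(ticks_per_beat=src.ticks_per_beat)

for track in src.tracks:
    cleaned = mido.MidiTrack()
    carry = 0  # delta-time accumulated from dropped messages
    for msg in track:
        keep = (msg.is_meta and msg.type in KEEP_META) or \
               (not msg.is_meta and msg.type in KEEP)
        if keep:
            # Fold the dropped messages' delta time into this one
            # so nothing shifts relative to the backing audio.
            cleaned.append(msg.copy(time=msg.time + carry))
            carry = 0
        else:
            carry += msg.time
    dst.tracks.append(cleaned)

dst.save('song_cleaned.mid')
```

The one detail that matters is carrying the delta time of every discarded message forward into the next kept one; otherwise the remaining events drift out of sync with the audio.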
So, all that said, my situation differs from your objectives and workflow in that I really need to retain some of the MIDI events to support the song's audio performance, all of which happens outside of Reason. Sorry, I don't think I have any useful tips for you!
:|