antic604 wrote: ↑04 Oct 2018
selig wrote: ↑04 Oct 2018
A proper phrase player would open up a sequencer view for editing notes and audio, also allowing drag/drop and copy/paste. This is something any similar solution will be limited by, and another of the "deal breakers" for me personally, as I'm always needing to edit a few notes or drum beats up to the end of production. Anyone got a solution to that issue?
Again, perhaps this is not your own workflow, but most people I know "pencil in" the MIDI data in the piano roll and draw automation, so, really, using something like Panda's Kompulsion to build a pattern of MIDI & CVs is far from "tedious". And maybe it would be possible to somehow import MIDI and automation data from a clip? True, using a sampler might be more restrictive than working with audio directly, but you'd still be able to record and edit the audio on an audio track, then bounce it to a sample and import it into NNXT (or any other sampler) once you're happy.
I was also assuming that Reason would be able to "record" the performance (launching of clips & scenes, tweaking parameters with MIDI controllers, etc.) to the sequencer as a stream of clips or patterns - like it works now for Matrix, ReDrum, etc. Many people using Live or Bitwig work exactly this way: they create a rough sketch of the song, keeping that "live performance" feel, and then fine-tune and add details, breaks and more elaborate transitions on the timeline, in a more traditional fashion.
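None of this exists in Reason today, but the mechanics of "recording a performance to the sequencer" are simple enough to sketch. Here's a rough, hypothetical illustration in Python (every name is made up for the example, this isn't any real SDK): launch events get timestamped as they happen, then flattened into clip spans on a timeline afterwards.

```python
from dataclasses import dataclass, field


@dataclass
class LaunchEvent:
    """A timestamped performance action (clip launch, parameter tweak, ...)."""
    beat: float      # position in beats when the action happened
    kind: str        # e.g. "launch_clip" or "set_param"
    payload: dict


@dataclass
class PerformanceRecorder:
    """Captures live actions so they can later be flattened onto a timeline."""
    events: list = field(default_factory=list)

    def record(self, beat, kind, **payload):
        self.events.append(LaunchEvent(beat, kind, payload))

    def to_timeline(self, clip_length=4.0):
        """Turn launch events into (start, end, clip_id) spans on the timeline.

        Each clip plays until the next launch; the last one runs for
        clip_length beats (an arbitrary assumption for the sketch).
        """
        launches = [e for e in self.events if e.kind == "launch_clip"]
        spans = []
        for i, e in enumerate(launches):
            end = launches[i + 1].beat if i + 1 < len(launches) else e.beat + clip_length
            spans.append((e.beat, end, e.payload["clip_id"]))
        return spans
```

So after jamming, `to_timeline()` would give you the rough arrangement to refine in the main sequencer.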
So yeah, we can either wait for the Props until something complete and ideal can be implemented, or have a solution that's 80% there but ready for the market.
Hope someone really picks this idea up. And I wasn't joking about offering to participate in funding the development.
When I've had to draw in sustain pedal data for a long piano or pad performance, it takes time to get it placed correctly and is no fun. Drawing in notes is different: it's like programming an analog sequencer, and it's typically only a bar or two anyway. Drawing in natural-sounding pitch bend is also a little tedious and time-consuming, if only because it could be "played" in one quick "take" if that were an option. In other words, we are talking about two different things: drawing in performance data because it can't be recorded, versus drawing in patterns.
And no, REs and Players cannot record all performance data - that's my point. Only the main sequencer can, which is why it's the best candidate for a clip system IMO (and it's over 50% in place with Blocks).
That is to say, I cannot record multiple piano "clips" with sustain pedal, or a bass or lead line with pitch bend/mod wheel, and then launch them from a master UI.
Ideally you would either record directly into clips/blocks, copy from the main timeline to create clips/blocks, or drag existing MIDI or audio files directly into a clip/block - all things that are impossible to do with either REs or Players using the current SDK. Then you would double-click a clip to edit it, or make copies to create variations, etc. - which is also not possible unless you skip the recording part I mentioned above and simply draw in a series of notes or CV values on a grid.
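For what it's worth, the data side of a clip isn't exotic either. Here's a hypothetical sketch (nothing to do with the actual RE/Player SDK - all names are invented) of a clip that stores both notes and recorded performance data like sustain pedal, plus the copy-to-variation step:

```python
import copy
from dataclasses import dataclass, field


@dataclass
class Note:
    start: float    # in beats
    pitch: int      # MIDI note number
    velocity: int
    length: float   # in beats


@dataclass
class Clip:
    """A pattern holding notes plus free-form performance data (CCs etc.)."""
    name: str
    notes: list = field(default_factory=list)
    cc_events: list = field(default_factory=list)  # (beat, controller, value)

    def record_note(self, start, pitch, velocity=100, length=0.25):
        self.notes.append(Note(start, pitch, velocity, length))

    def record_cc(self, beat, controller, value):
        # e.g. CC 64 = sustain pedal, captured as played rather than drawn in
        self.cc_events.append((beat, controller, value))

    def variation(self, name, transpose=0):
        """Deep-copy the clip so edits to the variation leave the original intact."""
        var = copy.deepcopy(self)
        var.name = name
        for n in var.notes:
            n.pitch += transpose
        return var
```

The hard part in Reason isn't storing this, it's that the SDK gives an RE or Player no way to capture the note and CC stream or to exchange clips with the main sequencer.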
To be clear, the clip "launcher" part is easy and supported by the current SDK. It's the "clip" itself that is the tricky part, unless all you want is a matrix style clip device.
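And to illustrate why the launcher part is the easy bit: a grid of clip slots with one active clip per track is just a small amount of bookkeeping. Again a hypothetical sketch, not SDK code:

```python
class ClipMatrix:
    """Launcher grid: rows are scenes, columns are tracks; one active clip per track."""

    def __init__(self, rows, cols):
        self.grid = [[None] * cols for _ in range(rows)]
        self.active = [None] * cols  # currently playing clip per track

    def assign(self, row, col, clip_name):
        self.grid[row][col] = clip_name

    def launch_clip(self, row, col):
        # launching a slot replaces whatever was playing on that track
        self.active[col] = self.grid[row][col]

    def launch_scene(self, row):
        # a scene launches every non-empty slot in its row at once
        for col, clip in enumerate(self.grid[row]):
            if clip is not None:
                self.active[col] = clip
```

Everything above is doable with the current SDK's UI facilities; it's filling those slots with real recorded clips that isn't.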