Reason SSL Mixer Hardware Controller

This forum is for discussing Propellerhead's music software. Questions, answers, ideas, and opinions... all apply.
Chris_ocon
Posts: 5
Joined: 15 Apr 2016

Post 04 Oct 2017

Hi Allison - just curious if you've worked on anything else for this?

Hope all is well!

Karim
Posts: 741
Joined: 16 Jan 2015
Location: Italy

Post 04 Oct 2017

Image

But with the color scheme below... I'd pay $1000 to have that!! :cool: :puf_bigsmile: :PUF_balance:

Image
Karim Le Mec : Dj / Producer / Video Editor / Label Owner /Reason User since 2000 ( R10)
△ FOLLOW Karim Le Mec
https://soundcloud.com/karimlemec
https://www.facebook.com/karimlemec
https://www.mixcloud.com/lemecdj/

Karim
Posts: 741
Joined: 16 Jan 2015
Location: Italy

Post 04 Oct 2017

geremix wrote:
24 Aug 2017
I'm about to cry... I WANT THAT


Sent from my iPhone using Tapatalk
DITTO!!!! :roll:

Karim
Posts: 741
Joined: 16 Jan 2015
Location: Italy

Post 04 Oct 2017

At least 16 channels, please!!! :puf_smile:

With the 8 channels of my BCF2000, I've always felt so cramped that I prefer to work with mouse clicks in the sequencer instead.
But a surface with 16 (or more) channels would be fantastic for me and would give me more freedom of "movement". :thumbs_up: :thumbs_up:

geremix wrote:
25 Aug 2017
I'm with Selig on the center channel approach. If it's built to last, I would pay in the range of 3,000 for 12 channels (why does it have to be 8?) and the main section.


Sent from my iPhone using Tapatalk

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 06 Oct 2017

Chris_ocon wrote:Hi Allison - just curious if you've worked on anything else for this?

Hope all is well!
Hey Chris!

I have not worked on this lately. I thought I could quickly move my home studio from one room to another, and instead I've gone down the rabbit hole of rewiring my patchbay. 12 DB25 plugs needing resoldering. Uuuugh.

But I hope to get back into it shortly here. I need to consider some layout questions—this thread has been considerably eye-opening.

Will have more news shortly!

Allie

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 13 Oct 2017

Hi all!

I wanted to give you a quick update on the progress of this project.

First, I’ve received a lot of feedback both in this thread and via PMs that this seems to be something people really want. So much so that I want to spend the necessary effort to do it right. I say this because it all started with my internal, half-baked idea of wanting a full knob-per-function hardware controller for Reason, where cost and size were no object.

But having considered everyone's input, and chewed on it for a while over the last month, I'm going to shift gears and focus on making the best hardware controller for the Reason mixer *for the greatest number of users*, instead of just for me. I can live with some compromises to the original vision if it means it's generally more useful (and usable!) for more people. Also, many of the suggested ideas were much better than mine, so it's better all around.

As mentioned, I have a good amount of prototype components on hand. I just purchased a couple more that I needed and once they arrive, I should have everything necessary to begin prototyping.

I am approaching this product design a little differently than is typical in my experience: I'm going to work very incrementally, getting small bits working at a time, keeping everything as malleable and modular as possible, and trying layouts and use cases with other Reason users as it progresses. Call it "lean startup" or "customer development" or "user-centered design" or whatever concept fits, but the process will feel very much like a dialogue, a performance between multiple parties, rather than a finished thing handed down to users.

Another way I like to think about how this product's development is different: other products are often designed “at” a user base. Some more mindful products are designed “for” a user base. I want to design this product “with” the eventual users. It's messier, more frustrating, and less clear at times, but if we all accept that as part of what it takes to get a better product designed for all of us, then I think we could get something special built here.

This is in contrast to the Richard Feynman approach to solving problems, which is something along the lines of (paraphrased), “step 1, define the problem, step 2, think really hard about the problem, step 3 solve the problem”. :)

So the first task for me is to get a solid Remote connection over USB-MIDI, working bidirectionally between a single physical push button and a single Reason mixer button. Then I'll add the touchscreen support for some on-controller info and UI. From there I'll expand to pots, then groups of pots for each section of the center channel strip (EQ, comp/gate, etc), then to the fader, and so on. I intend to keep all sections physically separate so they can be moved around and used in different configuration layouts. My hope is that this will allow the ideal physical layout to emerge through significant use by several Reason users.
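For what it's worth, that button handshake can be sketched in miniature. This is a toy host-side mock, not the Remote protocol itself: the CC number, status byte, and toggle logic here are all placeholder assumptions for illustration.

```python
from typing import Optional

# Sketch of the bidirectional idea: the controller sends a MIDI control change
# when the physical button is pressed, and the host echoes the new state back
# so the button's LED always mirrors the Reason mixer button.

BUTTON_CC = 0x40   # hypothetical CC assigned to the mixer's mute button
CC_STATUS = 0xB0   # control change status byte, MIDI channel 1

def press_to_midi(pressed: bool) -> bytes:
    """Encode a physical button press as a 3-byte control change."""
    return bytes([CC_STATUS, BUTTON_CC, 0x7F if pressed else 0x00])

def host_apply(state: dict, msg: bytes) -> Optional[bytes]:
    """Host side: toggle the mixer button on press, return LED feedback."""
    status, cc, value = msg
    if status == CC_STATUS and cc == BUTTON_CC and value > 0:
        state["mute"] = not state["mute"]
        # the returned message drives the controller's LED
        return bytes([CC_STATUS, BUTTON_CC, 0x7F if state["mute"] else 0x00])
    return None

mixer = {"mute": False}
feedback = host_apply(mixer, press_to_midi(True))
```

The point of the round trip is that the controller never assumes its press succeeded; its LED only changes when the host's feedback message arrives, so hardware and mixer state can't drift apart.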

I won’t lie, it’s kinda scary to let the proper layout evolve on its own, instead of just dictating what I think it should be—and while there is a point where decisions will need to be made, I feel this will give the best chance at seeing what really works for real-world, everyday mixing duties. I’m not looking to build a toy, I want to have an indispensable hardware controller to drive the act of mixing within the Reason environment.

I'm also considering documenting this project a little more formally on my personal web site instead of here. I'll be bringing that site up over the next week or two, and then updating project progress there. However, I'll still link updates here so you all can follow along and share comments and ideas.

Excited for all of you who expressed interest in this project! Time to make a HW controller that makes Reason mixing feel like an extension of your brain rather than your mouse. :)

- Allie

friday
Posts: 117
Joined: 17 Jan 2015

Post 13 Oct 2017

Thanks for your effort!

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 14 Oct 2017

Ok, had a bit of a second wind tonight after dinner, and made some good progress on a physical button controlling a Reason parameter. Looking pretty good!

https://jura.io/project-tonic/first-midi-handshake/

(I moved some of the pertinent documentation from here to my personal site, b/c I've been burned before with forums going away, and all the great content going with it. I wish I could copy over the entire thread here b/c there is so much good info from everyone! But I tried to capture most relevant info, for those folks who may stumble across it later.)

brand
Posts: 67
Joined: 11 May 2017
Location: US

Post 14 Oct 2017

Wow. VERY cool!

hollyn
Posts: 28
Joined: 15 Jan 2015
Location: Portland, OR

Post 14 Oct 2017

Looks great, Allie! 😊

miscend
Posts: 1074
Joined: 09 Feb 2015

Post 15 Oct 2017

Those prototypes look fantastic so far. Although it looks like a lot of work for one person to undertake as a personal DIY project. Are you going to commercialise it? Wish you all the best tho.

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 15 Oct 2017

miscend wrote:
15 Oct 2017
Those prototypes look fantastic so far. Although it looks like a lot of work for one person to undertake as a personal DIY project. Are you going to commercialise it? Wish you all the best tho.
Oh, it most definitely is a lot of work. :) I've crowdfunded products before that hit their goals and were then commercialized and sold directly. So hopefully the only issues that remain right now are mostly "known unknowns" and not too many "unknown unknowns", which are the scariest ones!

Undecided yet if I'd commercialize it. It's possible, under the following circumstances:

- enough people want it
- it's useful to enough people
- the cost/price of the thing makes it worth manufacturing at more than a handful
- it's not incredibly esoteric in its manufacturing/QA process, where it'd require me to be onsite at manufacture to ensure QC
- it's easily field-upgradeable, and much of the support can be done via a community forum/mailing list

Here's how I'd address each of these issues, with what I know today:

- enough people want it: crowdfunding is an amazing way to ensure there is sufficient interest
- it's useful to enough people: lots of beta usage with real Reason users in real mixing environments during the design process
- the price of the thing makes it worth manufacturing at more than a handful: unknown until a bill of materials (BOM), uhh, materializes
- it's not incredibly esoteric in its manufacturing/QA process, where it'd require me to be onsite at manufacture to ensure QC: same as above--some of the ideas are amazing (touch screens, etc), but that can be a QA hell--so tried and true and reliable will win the day here
- it's easily field-upgradeable, and much of the support can be done via a community forum/mailing list: it really will have to just work, before it's released, so support isn't overwhelming.

My day job is running a company building industrial, rugged internet of things products, so for better or worse, this is my life these days. :) However, if this thing takes off, it'd need to be run day-to-day by others--I wouldn't be able to do it myself. We'll cross that bridge if/when it comes.

And if it turns out there are a small number of passionate people who want this, but for whatever reason (price, demand, cost, etc) there isn't that sweet spot to kick it off into full production, I'll likely just make it into a kit so others can build it themselves. And the kitting process could be anywhere from "plug these six ribbon cables in and screw it together" Korg MS-20 kit style, to as complex as the Drip Opto.

No matter what, if this thing is worth making, and more than just myself finds it useful, I'll find a way to get it to those who want it. Right now for me this is not a business endeavor--it's instead an effort to make Reason more amazing than it already is. 👌

selig
Moderator
Posts: 7323
Joined: 15 Jan 2015

Post 15 Oct 2017

amcjen wrote:
13 Oct 2017
So first task for me is to get a solid Remote connection over USB-MIDI, working bidirectionally between a single physical push button and a single Reason mixer button. Then I’ll add the touchscreen support for some on-controller info and UI. From there I’ll expand to pots, and then groups of pots for each section of the center channelstrip (EQ, comp/gate, etc), then to fader, and so on. I intend to keep all sections separate physically so they can be moved around and used in different configuration layouts. My hope is that this will allow the ideal physical layout to emerge through significant use by several Reason users.
Something you said above caught my attention. This could totally be the wrong way to go, but allow me to run with it for a bit to see how it "feels".

You mentioned "keeping all the sections separate so they can be moved around and used in different configuration layouts" - THAT sounds VERY interesting to me…

Imagine a VERY modular system, with modules for Faders (with or without mute/solo/pan), EQ/Filter, Dynamics, Input, Master Compressor, Sends, etc.

EACH section of the SSL mixer is separate. IDEALLY, each module could be placed on a grid in vertical or horizontal orientation. More realistically, they would orient vertically in a fixed grid allowing 8 modules horizontally. Connections would be made on the bottom, and the sides of each module would be enclosed. This would allow very clean design - cover plates would be used to hide parts of the grid that weren't populated by modules. You could also do the same thing with the master section, building the transport separate (probably a good starter kit), separate jog wheel, master compressor, insert management, numeric keypad, etc. - all fitting into the main 'grid' system!

Your frame could have a fader bank grid (tall enough for one fader module with mute/solo/pan) and a channel bank grid (taller to accommodate more modules), both able to hold 8 (wide) modules.
(crude drawing attached)
Screen Shot 2017-10-15 at 11.55.00 AM.png
This approach could allow you to scale according to time/investments, and users could likewise scale their setups as well, adding a touch screen when able, starting with one full channel and adding more over time, or starting with 8 faders and just EQs, or whatever - you get the idea!

OK, that should be enough to see if this approach has 'wings' or is TOO segmented and modular.
Selig Audio, LLC

selig
Moderator
Posts: 7323
Joined: 15 Jan 2015

Post 15 Oct 2017

Taking the modular idea further, what I’m suggesting is definitely not what you intended from the start, but could be more useful to the community at large.

What we need is a foundation to build upon. A protocol for connecting hardware to Reason, with all the tough bits worked out for us! I’d love to build a few modules myself, but not if I ALSO had to deal with the power supply, the chassis, the communications etc. It will be enough work just to build a module!

Sure, I’d also love to have all the modules built for me, but with the option to choose which ones to add and when to add them. Sort of like how modular synth guys like MOTM (module of the month) and many others did it. You can buy the kits, or you can buy the modules - we also provide the power supply! Or you can design your own modules and integrate them into our system.

It’s a much easier system to scale up if demand increases. It also focuses your timeline: first the protocol/communication, then the mainframe, then a fader, then a compressor or EQ or touch screen.

Is it making any sense to do things this way instead of creating a more ‘finished product’ that can never exactly appeal to everyone’s needs? I’m thinking the modular approach answers more of your above questions positively than your original approach, no?


Sent from some crappy device using Tapatalk

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 15 Oct 2017

Selig!
selig wrote:
15 Oct 2017
Taking the modular idea further, what I’m suggesting is definitely not what you intended from the start, but could be more useful to the community at large.
That's what I'm after at this point. I've given up on my original vision because, quite frankly, it wasn't well thought out, and this thread revealed a lot more consideration than my "hey all you know what would be cool?" ramblings early on. So let's dig into this!
selig wrote:
15 Oct 2017
What we need is a foundation to build upon. A protocol for connecting hardware to Reason, with all the tough bits worked out for us! I’d love to build a few modules myself, but not if I ALSO had to deal with the power supply, the chassis, the communications etc. It will be enough work just to build a module!
Okay, now you're talking. I think this is a really cool approach for a couple of reasons. One, the Reason mixer in the program itself will continue to change (delay compensation being one of the most recent additions; a mono button will certainly show up soon enough too). So this lets single modules be swapped out or upgraded as needed.

Two, like you said, you can ease into it a little at a time. Very much with a standard form factor, comms, power, etc.
selig wrote:
15 Oct 2017
Sure, I’d also love to have all the modules built for me, but with the option to choose which ones to add and when to add them. Sort of like how modular synth guys like MOTM (module of the month) and many others did it. You can buy the kits, or you can buy the modules - we also provide the power supply! Or you can design your own modules and integrate them into our system.
Yep, exactly like this. I'm not into modular synths, but I'm familiar enough with the idea that it could work really well to standardize on whichever existing format most closely matches the mixer channel widths. I know there's the 500 series, then all the modular stuff (MOTM, Eurorack, Moog, etc). From a quick look, I think MOTM is a preferred choice b/c it's already 1.75" wide, and 8 of them gets you just over the 13" for an 8-fader bank.

I like too that if this is designed correctly, you'll be able to mix and match exactly what you want, where you want it.
selig wrote:
15 Oct 2017
It’s a much easier system to scale up if demand increases. It also focuses your timeline: first the protocol/communication, then the mainframe, then a fader, then a compressor or EQ or touch screen.
Yep, exactly. I feel comfortable about protocols/comms. It needs a little coordination and research to determine what the appropriate bandwidth should be between modules, and across the whole system. Reason supports 64 channels IIRC; what do Remote bandwidth requirements look like when 64 meters are bouncing along? I've no idea, but I really want to know, as this will dictate what protocol to use.

If I had to make a choice this moment, I'd say CAN bus/CANopen at 1 Mbit/sec throughput, with support for up to 254 devices total, would be a pretty solid choice (especially since CAN is built for the tight timing requirements of its usual home, automotive control systems).
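As a sanity check on that choice, here's a back-of-envelope estimate of what 64 bouncing meters might cost on a 1 Mbit/sec CAN bus. The 30 Hz refresh rate and the ~135-bit worst-case frame size are assumptions for illustration, not measured Remote traffic:

```python
# Back-of-envelope bus load estimate (assumed numbers, not measured traffic):
# can a 1 Mbit/sec CAN bus carry 64 channel meters updating smoothly?
# A standard CAN 2.0A data frame with an 8-byte payload is ~111 bits before
# bit stuffing; ~135 bits is a conservative worst case on the wire.

BITS_PER_FRAME = 135       # worst-case frame size incl. bit stuffing
METERS = 64                # Reason mixer channels (per the thread, IIRC)
UPDATES_PER_SEC = 30       # assumed refresh rate for fluid meter movement
BUS_BITRATE = 1_000_000    # 1 Mbit/sec CAN

meter_bits = METERS * UPDATES_PER_SEC * BITS_PER_FRAME
utilization = meter_bits / BUS_BITRATE
print(f"{meter_bits} bits/sec, {utilization:.1%} of the bus")
# 64 * 30 * 135 = 259,200 bits/sec, roughly a quarter of the bus,
# leaving headroom for faders, buttons, and display traffic
```

Even with pessimistic framing overhead, the meters fit comfortably, which suggests the 1 Mbit/sec bitrate itself isn't the bottleneck; the open question remains what Remote actually emits.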
selig wrote:
15 Oct 2017
Is it making any sense to do things this way instead of creating a more ‘finished product’ that can never exactly appeal to everyone’s needs? I’m thinking the modular approach answers more of your above questions positively than your original approach, no?
It totally does.

And your thought just made me realize something, and this could be freakin' amazing:

To my understanding, the Remote protocol, though designed to be fairly open, only has one host AFAICT, which is Reason. Of course, there are lots of controllers available, and having a good solid machine/software protocol like Remote makes this a lot more doable. It would also let other DAWs (someday) support Remote, so this controller framework could work with them too.

But what if--

What if this foundation supported not only power rails (I dunno, +/- 15V and a +5V like MOTM?) and module communication (CAN Bus), but also supported at least a balanced audio input/output feed, per something like the 500 Series "specs". Why, you ask?

Because then people like you, Selig, can start making Rack Extensions that actually control real analog hardware such as preamps. It's a perfect bidirectional automation capability that would merge the hardware and software worlds even more. I don't know many products that can do this today (a couple of synths, of course, and then you have the UAD stuff; Focusrite's Liquid Channel used to do this back in the day).

But damn, that would be an amazing system. You could have your Reason SSL compressor right next to an actual 500-series module that you have audio patched through, and could automate it from an RE.

This could work really well with Peterson's DIYRE projects too--could use that as a good DIY path for experimenting with this.

(Mind you, I recognize that feature creep is the enemy of any project, but now is the time to spec this stuff out so you have it ready for later, even if you don't use it yet.)

I'm down, let's figure this out!

-Allie

Catblack
Posts: 746
Joined: 15 Apr 2016

Post 16 Oct 2017

Wow Allie, I finally understand the scope of this! I'm totally down with an extensible euro-rack-esque modular midi controller. I never thought about using those size/power standards. Amazing. I'll help however I can with the Remote side of things.

On a side note, speaking as someone who delves into Remote and can't help buying random MIDI controllers (seriously, I've got at least 5 that need codecs sitting right here, and more in storage), I think part of the general problem is that one doesn't know just what is mapped to what. I believe there's a great solution (that a couple of other Remote devs, not me, have gotten working) in using WebMidi on a localhost-served web page to display data values and remotable names out of Remote. The potential for this would be a web app that uses CSS to arrange the data and show you what your controllers are controlling as you cruise around the rack. It's going to be sometime next year before I get around to working on this; I'll make a thread asking for help at some point.
Do you sing like Olive Oyl on purpose? You guys must be into the Eurythmics.

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 16 Oct 2017

Catblack wrote:Wow Allie, I finally understand the scope of this! I'm totally down with an extensible euro-rack-esque modular midi controller. I never thought about using those size/power standards. Amazing. I'll help however I can with the Remote side of things.
Awesome! Will definitely need the Remote help—so glad you know it well!
“Catblack” wrote:On a side note, speaking as someone who delves into Remote and can't help buying random MIDI controllers (seriously, I've got at least 5 that need codecs sitting right here, and more in storage), I think part of the general problem is that one doesn't know just what is mapped to what. I believe there's a great solution (that a couple of other Remote devs, not me, have gotten working) in using WebMidi on a localhost-served web page to display data values and remotable names out of Remote. The potential for this would be a web app that uses CSS to arrange the data and show you what your controllers are controlling as you cruise around the rack. It's going to be sometime next year before I get around to working on this; I'll make a thread asking for help at some point.
This sounds really cool, and I was following what you described up until you mentioned showing what your controllers are controlling while cruising around the rack. Would you explain a little more? My curiosity is piqued!

-Allie

Catblack
Posts: 746
Joined: 15 Apr 2016

Post 16 Oct 2017

amcjen wrote:
16 Oct 2017

This sounds really cool, and I was following what you described up until you mentioned showing what your controllers are controlling while cruising around the rack. Would you explain a little more? My curiosity is piqued!

-Allie
There's a pair of utility functions, remote.is_item_enabled() and remote.get_item_name(), that can be paired with the "Constant Value" trick (as the Remote dev manual refers to it) to detect when your device changes. Then you can tell what has been assigned to which controller in the .remotemap file, and pass the name of the remotable out to your display as ASCII sysex. This is what the Novation Automap codec is very concisely doing. You can take a look yourself: see remote_set_state(). That codec, though a bit hard to read, defines 256 knobs and 256 buttons and then passes everything back to the Automap program for display. (Generating a more complete Automap .remotemap, or Automap XML files, is one of my backburner projects.)

The big plus is that you only have to define what is mapped inside the .remotemap and then parse that once per device each time the script runs as the device gets selected. With a little extra coding for a second constant value for the group variation, you could use the .remotemap groups feature too.
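To illustrate the "remotable name as ASCII sysex" part, here's a minimal sketch of packing an item name into a sysex frame a display could parse. The manufacturer ID (0x7D, the MIDI non-commercial/educational ID) and the one-byte item-index framing are made up for illustration; the real Automap codec defines its own format.

```python
# Wrap a remotable's name in a sysex message for an external display.
# Framing (after F0): assumed manufacturer ID, then a 7-bit item index,
# then the name as plain ASCII, terminated by F7.

SYSEX_START, SYSEX_END = 0xF0, 0xF7
NONCOMMERCIAL_ID = 0x7D   # MIDI non-commercial manufacturer ID

def name_to_sysex(item_index: int, name: str) -> bytes:
    """Pack a remotable's name into a sysex frame (7-bit-safe data only)."""
    payload = name.encode("ascii", errors="replace")
    assert all(b < 0x80 for b in payload), "sysex data bytes must be 7-bit"
    return (bytes([SYSEX_START, NONCOMMERCIAL_ID, item_index & 0x7F])
            + payload + bytes([SYSEX_END]))

msg = name_to_sysex(3, "Channel 1 Level")
```

The receiving side (a screen module, or the Raspberry Pi idea below in this post) just strips the three header bytes and the trailing F7 to recover the index and name.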

It was just an aside, but I imagine being able to use an old monitor and a Raspberry Pi accepting MIDI to give me a view of what my controllers are mapped to. I would need help with the WebMidi/JavaScript side of things, but I do believe it's a project that would benefit a large number of Reason users. I was planning on starting it next year, but if any JavaScript programmers want to PM me, maybe we can get it rolling a little early. I do think this idea could pair well with your project.

There are some gotchas with Remote to watch out for. Mainly that Remote gets a low priority and is suspended while song files or samples load, so you can't really get an accurate display of the transport bar position. There's also what I refer to internally as "the wall": Remote won't fire off messages back into itself independently (à la remote.handle_input()) and needs an inputted MIDI signal to come into remote_process_midi(). I call it "the wall" because sooner or later you'll come up with a cool Remote idea and hit it.

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 16 Oct 2017

Catblack wrote:
16 Oct 2017
There's a pair of Utility functions, remote.is_item_enabled() and remote.get_item_name() that can be paired together with the "Constant Value" (as the Remote dev manual refers to it) trick to detect when your device changes. Then you could tell what has been assigned to what controller in the .remotemap file, and pass the name of the remotable out to your display as ascii sysex. This is what the Novation Automap codec is very concisely doing. You can take a look yourself, look at remote_set_state(). That codec, bit hard to read, defines 256 knobs, 256 buttons and then passes it back to the Automap program for display. (Generating a more complete Automap .remotemap or Automap xml files is one of my backburner projects.) The big plus is that you only have to define what is mapped inside the .remotemap and then parse that once per device each time the script runs as the device gets selected. A little extra coding for a second constant value for the group variation and you could use the .remotemap groups feature too.
I think I'm getting it, but it's become clear that I have some work to do to get caught up on the Remote protocol (I didn't know there was even a groups feature, for instance).
Catblack wrote:
16 Oct 2017
It was just an aside, but I imagine being able to use an old monitor and a raspberry pi accepting midi to give me a view of what my controllers are mapped to. I would need help with the Webmidi/javascript side of things, but I do believe it's a project that would benefit a large amount of Reason users. I was planning on starting on it next year, but if any javascript programmers want to PM me, maybe we can get it rolling a little early. I do think this idea could pair well with your project.
This sounds amazing. I'm sure there will be Javascript devs who will reach out. If you come up short, I could help with a quick and dirty interface for this (though I have my work cut out for me with this comms/power/audio protocol now!)

Catblack wrote:
16 Oct 2017
There are some gotchas with Remote to watch out for. Mainly that Remote gets a low priority and is suspended when song files or samples load. You can't really get an accurate display of the transport bar position. And also what I refer to internally as "the wall", which is that Remote won't fire off messages back into itself independently, (ala remote.handle_input(),) and needs an inputted midi signal to come into remote_process_midi(). I call it "the wall" because sooner or later you'll come up with a cool Remote idea and hit it.
Woah, that last one sounds horrible. How do you work around it? Do you just feed it an Active Sense (status byte 0xFE) every 300 ms or something so it keeps processing?

Catblack
Posts: 746
Joined: 15 Apr 2016

Post 17 Oct 2017

amcjen wrote:
16 Oct 2017

Catblack wrote:
16 Oct 2017
There are some gotchas with Remote to watch out for. Mainly that Remote gets a low priority and is suspended when song files or samples load. You can't really get an accurate display of the transport bar position. And also what I refer to internally as "the wall", which is that Remote won't fire off messages back into itself independently, (ala remote.handle_input(),) and needs an inputted midi signal to come into remote_process_midi(). I call it "the wall" because sooner or later you'll come up with a cool Remote idea and hit it.
Woah, that last one sounds horrible. How do you work around it? Do you just feed it an Active Sense (status byte 0xFE) every 300 ms or something so it keeps processing?

I didn't mean to misstate anything. Remote runs all the time. You get remote_set_state(), which fires off when something in Reason changes, and remote_deliver_midi(), which runs frequently when you want to send something to your device. But you can only run the utility function remote.handle_input() inside of remote_process_midi(). remote.handle_input() is the one that allows you to tie directly into the remotables in the .remotemap.

The last time I hit 'the wall' was when I had 6 buttons that I wanted to turn into a pitch wheel. Sure, I could just send the proper signal for each button as it was pressed or released, but I wanted a little snapback, say 250 ms, while the pitch returned to zero. But I couldn't do that, because I don't have incoming MIDI during those 250 ms.

So then you think: well, the Remote script is in there and running, so I'll just tie something into the song position remotable, and when it changes, remote_set_state() is going to fire off... and I'll fire off that pitch wheel remotable then. But you can only use remote.handle_input() inside of remote_process_midi(), and there's no way to get around that. (You could try dedicated MIDI loopback ports, but then you start adding a lot of lag into the system. Also, remote_process_midi() doesn't fire off for MIDI clock signals!)

About a year ago I suggested adding "remote.generate_input()" to get around this. If they added something like that, you'd be able to do my 6 button pitch wheel, or auto generating drum patterns from inside Remote or a host of other things. I doubt they are going to change much about the Remote system other than make it more efficient, like they did with the last 9.5 point upgrade.

Chris_ocon
Posts: 5
Joined: 15 Apr 2016

Post 17 Oct 2017

I am really digging the modular approach!! It allows people to fine-tune to their needs and budget, and you can hone in on how YOU want to mix: if you never use inserts, don't spend the money or space on that module.

It also makes me think (along the lines of what Allison said about the RE controlling hardware modules installed alongside the mixer modules) that you could make some modules control other things in Reason too. For example, I have an audio splitter after my main out giving me two stereo pairs. Each goes into its own Selig Gain, then to hardware out pairs 1/2 and 3/4: the first for my monitors and the second for headphones. I then have the faders of the Selig Gains mapped to faders on my Akai controller, giving me quick hands-on monitor level control. Seems like I could just work a couple of extra faders into the mixer for this, possibly even motorized so they would reset accordingly when opening a new project.

It further makes me think about (and this might be pushing it) making almost-dedicated mini effect controllers. For example, if you always keep a TSAR-1 in your first send, you could put together a module mapped to its parameters, giving you the feel of having outboard effects.

Everything about this project is awesome!! I'm happy to help any way I can

amcjen
Posts: 169
Joined: 14 Apr 2017

Post 17 Oct 2017

Catblack wrote: About a year ago I suggested adding "remote.generate_input()" to get around this. If they added something like that, you'd be able to do my 6 button pitch wheel, or auto generating drum patterns from inside Remote or a host of other things. I doubt they are going to change much about the Remote system other than make it more efficient, like they did with the last 9.5 point upgrade.
Ah wow, yeah, that sounds like a mess. I don’t yet see how this could bite us on this project, but from the sound of it, it very well could. Hmmm.

User avatar
amcjen
Posts: 169
Joined: 14 Apr 2017

Post 17 Oct 2017

Chris_ocon wrote:I am really digging the modular approach!! It lets people fine-tune to their needs and budget, and home in on how YOU want to mix -- if you never use inserts, don't spend the money or space on that module.
Yep! This is how I’m thinking of it now too. :)
Chris_ocon wrote:It also makes me think (along the lines of what Allison said about the RE controlling hardware modules installed alongside the mixer modules) that you could maybe make some modules control other things in Reason too. For example, I have an audio splitter after my main out giving me two stereo pairs. Each goes into its own Selig Gain and then to a hardware out pair, 1/2 and 3/4. The first is for my monitors and the second for headphones. I have the faders of the Selig Gains mapped to faders on my Akai controller, giving me quick, hands-on monitor level control. Seems like I could just work a couple of extra faders into the mixer for this -- possibly even motorized, so they would reset accordingly when opening a new project.
Oh totally. The way you describe it, I could see specific modules for sections of the mixer (compressor, EQ, etc), but then probably also general knob/button modules. Might have to consult with Catblack on how to make those knobs re-mappable to devices dynamically without editing Remote map files, but def could be awesome.

This approach also handles the ever-important center channel. Clearly all of us have very specific ways of mixing -- our hardware controllers should reflect that. For instance, it would let Selig have all of his buttons mapped to the 10-key keys, and others their custom-mapped knobs. Niiiice.
Chris_ocon wrote: It further makes me think about (and this might be pushing it) making almost-dedicated mini effects controllers. For example, if you always keep a Tsar-1 in your first send, you could maybe put together a module that maps to its parameters, giving you the feel of having outboard effects.
I’m so glad you mentioned this! When I was thinking about this project on Sunday, I was like “I would love to have a single Pulveriser control—I use it all the time”. Let all the hardware effect modules bloom. :). A Tsar-1 would be amazing too.
Chris_ocon wrote: Everything about this project is awesome!! I'm happy to help any way I can
Awesome! So glad to hear. I’m sure there will be lots of work ahead.

One of the most helpful things right now might be to start thinking about what layouts you would prefer. Hacking around in Photoshop isn’t the easiest, but something like that may help show which modules are most important to address first.

User avatar
Catblack
Posts: 746
Joined: 15 Apr 2016

Post 18 Oct 2017

I hope my Remote digressions haven't been too confusing.

From the Remote side, I'd like to see the main module respond to the standard MIDI Identity Request sysex (F0 7E 7F 06 01 F7), and have the returned Identity Reply carry a couple of extra bytes that let us determine how many submodules are connected to the main module. Then we could tailor things to have multiple secondary sysex requests to the main module -- one per connected module -- so that the sysex returned from the device isn't as large. I imagine it'll be 2 or 3 bytes for each control, which could add up. Keeping sysex strings small is always better, I think.
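As a quick host-side sketch of that handshake: the Identity Request below is standard MIDI, but the "submodule count" byte at the end of the reply is this thread's proposed extension (and the single-byte manufacturer ID read here is an assumption for illustration), not part of any existing spec.

```python
# Universal Non-Realtime Identity Request: F0 7E 7F 06 01 F7 (standard MIDI).
IDENTITY_REQUEST = bytes([0xF0, 0x7E, 0x7F, 0x06, 0x01, 0xF7])

def parse_identity_reply(msg: bytes):
    """Return (manufacturer_id, submodule_count) from an Identity Reply.

    The submodule count in the byte just before the trailing F7 is the
    proposed extension from this thread, not part of the MIDI spec.
    """
    if not (msg and msg[0] == 0xF0 and msg[-1] == 0xF7):
        raise ValueError("not a complete sysex message")
    # Universal Non-Realtime (7E), sub-IDs 06 02 = Identity Reply
    if msg[1] != 0x7E or msg[3:5] != bytes([0x06, 0x02]):
        raise ValueError("not an Identity Reply")
    manufacturer_id = msg[5]   # single-byte manufacturer ID assumed here
    submodule_count = msg[-2]  # proposed extra byte: number of submodules
    return manufacturer_id, submodule_count
```

A reply like `F0 7E 7F 06 02 <mfr> ... <count> F7` would then tell the codec how many module-config requests to issue next.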

As long as we have a way to query and build up an internal picture of what we have to work with, we should be able to map controls to Remotables. So if we had a request:

Code: Select all

F0
<device id bytes>
<byte for module config request>
<byte for module number, 0 is main module>
F7
and the return:

Code: Select all

F0 <device id bytes> 
then for each control
<byte for control's channel>
<byte for control's CC or note #>
<byte consisting of bits telling us about control: 
	is_input/is_output/bits for type of control, button, delta, value, ...?>
I'm pretty sure that if we set it up this way, we'll be able to define Items dynamically in the Lua. If not (possibly because of the loading order of remote_init), we'll define a slew of Items up front and then map them to the MIDI that the sysex describes. (Another side note: you can define as many Items in the Lua as you want; you only get an error if an Item appears in the .remotemap that isn't in the Lua.)
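The module-config reply sketched above could be parsed on the host side something like this. Everything here is illustrative: the 3-bytes-per-control layout and the flag-bit assignments follow the proposal above, and DEVICE_ID is a made-up placeholder, not a real assigned ID.

```python
DEVICE_ID = bytes([0x7D, 0x01])  # hypothetical two-byte device ID

# Proposed flag byte: low bits = direction, remaining bits = control type.
FLAG_IS_INPUT  = 0x01
FLAG_IS_OUTPUT = 0x02
TYPE_SHIFT     = 2  # bits above the direction flags encode the type
TYPE_BUTTON, TYPE_DELTA, TYPE_VALUE = 0, 1, 2

def parse_module_config(msg: bytes):
    """Return (channel, cc_or_note, is_input, is_output, type) tuples
    from a module-config sysex reply using the proposed layout."""
    assert msg[0] == 0xF0 and msg[-1] == 0xF7, "incomplete sysex"
    assert msg[1:1 + len(DEVICE_ID)] == DEVICE_ID, "wrong device"
    data = msg[1 + len(DEVICE_ID):-1]
    assert len(data) % 3 == 0, "expected 3 bytes per control"
    controls = []
    for i in range(0, len(data), 3):
        channel, cc, flags = data[i:i + 3]
        controls.append((channel, cc,
                         bool(flags & FLAG_IS_INPUT),
                         bool(flags & FLAG_IS_OUTPUT),
                         flags >> TYPE_SHIFT))
    return controls
```

With a list like this in hand, the codec knows exactly which channel/CC pairs to map onto Items, which keeps each per-module reply small, as suggested above.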

Oh and don't forget when thinking up modules that having external Undo and Redo buttons is really quite awesome!
Do you sing like Olive Oyl on purpose? You guys must be into the Eurythmics.

User avatar
amcjen
Posts: 169
Joined: 14 Apr 2017

Post 18 Oct 2017

Catblack wrote:I hope my Remote digressions haven't been too confusing.

From the Remote side, I'd like to see the main module respond to the standard MIDI Identity Request sysex (F0 7E 7F 06 01 F7), and have the returned Identity Reply carry a couple of extra bytes that let us determine how many submodules are connected to the main module. Then we could tailor things to have multiple secondary sysex requests to the main module -- one per connected module -- so that the sysex returned from the device isn't as large. I imagine it'll be 2 or 3 bytes for each control, which could add up. Keeping sysex strings small is always better, I think.

As long as we have a way to query and build up an internal picture of what we have to work with, we should be able to map controls to Remotables. So if we had a request:

Code: Select all

F0
<device id bytes>
<byte for module config request>
<byte for module number, 0 is main module>
F7
and the return:

Code: Select all

F0 <device id bytes> 
then for each control
<byte for control's channel>
<byte for control's CC or note #>
<byte consisting of bits telling us about control: 
	is_input/is_output/bits for type of control, button, delta, value, ...?>
I'm pretty sure that if we set it up this way, we'll be able to define Items dynamically in the Lua. If not (possibly because of the loading order of remote_init), we'll define a slew of Items up front and then map them to the MIDI that the sysex describes. (Another side note: you can define as many Items in the Lua as you want; you only get an error if an Item appears in the .remotemap that isn't in the Lua.)
This sounds amazing!

I have a very lightweight 1-wire protocol, developed for a previous hardware platform, that supported pluggable, user-created hardware modules that could be uniquely identified and could report their capabilities. It could work perfectly here. You could even embed Remote-ready data into the response, which then gets sent all the way back to the Lua script on the host.
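As a rough sketch of what "embed Remote-ready data into the response" might look like: each module could hand the main module a small capability blob that gets forwarded verbatim inside the sysex reply. The layout here (module ID, control count, then channel/CC/flags triples) is invented for illustration and isn't any existing protocol.

```python
def pack_capabilities(module_id: int, controls):
    """Pack a hypothetical capability blob for one module.

    controls: list of (channel, cc, flags) triples; all values are
    assumed to be 7-bit safe so the blob can travel inside sysex.
    """
    out = bytes([module_id, len(controls)])
    for channel, cc, flags in controls:
        out += bytes([channel, cc, flags])
    return out

def unpack_capabilities(blob: bytes):
    """Inverse of pack_capabilities: recover the module ID and triples."""
    module_id, count = blob[0], blob[1]
    controls = [tuple(blob[2 + 3 * i: 5 + 3 * i]) for i in range(count)]
    return module_id, controls
```

The main module never needs to understand the blob; it just relays it, and the Lua codec on the host unpacks it to build its Items.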

This is sounding good.
Oh and don't forget when thinking up modules that having external Undo and Redo buttons is really quite awesome!
Indeed! If you had per-module undo/redo, it would let you A/B any change on any module -- and A/B a different change on another module at the same time.

Holy crap. This is taking some of the things I like most about SW mixing and bringing them into hardware land.
