Reason Rack Plugin Effects 64 sample latency
- chimp_spanner
- Posts: 3017
- Joined: 06 Mar 2015
So I had a quick play around after work. I loaded 8 channels up in Logic with as many empty RRPs as I could and then played a separate software instrument track with nothing on it. As far as I could tell, it was still basically instant. There were enough instances there that my equivalent buffer size would've been 4096 had they all been added together. It certainly wasn't that. This was in Logic, and I didn't have low latency mode engaged either.
You wouldn't add them all together though, because compensated latency is calculated using the "worst path" through the system. Independent channels don't stack. You only add instances that are routed in sequence (e.g. inserts on the same channel, aux channels, etc.).
You can mouse over the effect plugin in Logic to check its actual latency:
https://support.apple.com/guide/logicpr ... 1997ba/mac
It's also entirely possible the AU plugin behaves differently from the VST.
From the linked guide: "In a project, different channel strips often have different plug-ins, resulting in a different overall latency for each channel strip. Some channel strips might also be routed to aux channel strips via sends containing plug-ins that could add even more latency. However, you can compensate for those latencies using plug-in latency compensation to ensure that the audio output is perfectly synchronized. Logic Pro detects the channel strip with the maximum latency, and delays the other channel strips in real time by an individual amount so that they all play back in sync."
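If it helps, here's a minimal sketch of that "worst path" calculation with made-up per-instance numbers: the host takes the maximum latency over all parallel paths and delays the others to match, rather than summing every instance in the project.

```python
# Minimal sketch (made-up numbers) of how plugin delay compensation treats
# parallel vs. serial latency: the host only pads up to the slowest path,
# it never sums every instance in the project.

def path_latency(chain):
    """Total latency of one path = sum of the plugins chained in series on it."""
    return sum(chain)

# Each list is one independent signal path (e.g. a track's insert chain),
# with every empty RRP FX instance reporting 64 samples.
paths = {
    "track_1": [64] * 6,   # six instances in series
    "track_2": [64] * 3,
    "track_3": [64],
}

worst = max(path_latency(chain) for chain in paths.values())
print(f"project-wide compensated latency: {worst} samples")

for name, chain in paths.items():
    pad = worst - path_latency(chain)
    print(f"{name}: chain latency {path_latency(chain)}, host delays it by another {pad} samples")
```

So only the longest chain (384 samples here) sets the project latency; the parallel tracks are just delayed to match.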
- chimp_spanner
- Posts: 3017
- Joined: 06 Mar 2015
Maybe I'm understanding the original post wrong! I got the impression that every instance of the RRP effect, regardless of where it is in the project, was stacking/adding latency per instance. If that's the case and I misunderstood it I'm very glad haha. I'll be honest, I find the whole thing a bit confusing sometimes. I just chuck stuff in the project and hope for the best.
Heheh, I mean they all add latency even with an empty rack, which is my main complaint. But they will only increase your overall latency beyond +64 samples if you chain them in series.
Most of my VST effects add 0 latency unless they are doing some lookahead.
- chimp_spanner
- Posts: 3017
- Joined: 06 Mar 2015
Oh weird, that's what I was trying! It was like:
Track 1
RRP
RRP
RRP
RRP
RRP
RRP
...
Track 2
RRP
RRP
RRP
...
Like I said, I did that like 8 times. So if it was stacking, I'd definitely know about it. But then yeah, as Pepin said, it might be how AUs work vs VSTs, or how Logic handles it. I can test in Cubase 13 Pro later.
As for there being a 64 sample base level of latency, I miiiight be pulling this out of my butt, or maybe someone else has already addressed it, but I believe this is something to do with CV? I seem to remember that being the case back when VST performance was *awful* and no amount of raising the buffer was helping. That was why we ended up getting the option to run at audio interface buffer size. The tradeoff being that CV modulation would be delayed by your buffer, vs always in time, regardless of what it was set to.
Someone correct me if I'm wrong on this. I'm just recalling from memory!
I think Logic has some hybrid buffering going on, so you'll only see the actual latency in the track you're monitoring through. And testing with multiple tracks won't give you any info, because every track in the "background" already runs with a much larger buffer. I'm pretty sure AUs add latency in the same way VSTs do.
I think if this problem was caused by the way CVs are handled you'd see the same latency in the regular plugin as well, and that is not the case.
I think you can hear the latency if you try something like this:
- Disable latency compensation
- Add two tracks with synths outputting the same simple square wave and play the same midi clip in both of them.
- Add one or more RRPFX to one of the tracks
- Hit play
You should hear some phasing, and it should change as you add/remove RRPFX instances in that track.
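If you'd rather check it offline than by ear, here's a rough sketch of why you'd hear phasing: summing a signal with a copy of itself delayed by 64 samples forms a comb filter with its first null at sr / (2 × 64). The 64 is just the reported figure, not a measurement of what RRP actually does.

```python
# Rough sketch: summing a signal with a copy of itself delayed by N samples
# acts as a comb filter with nulls at odd multiples of sr / (2 * N).
# The 64-sample figure is the value RRP FX reports, assumed here, not measured.
import numpy as np

sr = 48000
delay = 64   # reported (uncompensated) latency in samples

def comb_gain(freq_hz):
    """Magnitude of 1 + z^-delay at freq_hz: what summing dry + delayed does."""
    return abs(1 + np.exp(-2j * np.pi * freq_hz * delay / sr))

first_null = sr / (2 * delay)   # 375 Hz at 48 kHz
print(f"gain at {first_null:.0f} Hz (null): {comb_gain(first_null):.3f}")
print(f"gain at {2 * first_null:.0f} Hz (peak): {comb_gain(2 * first_null):.3f}")
```

With a square wave rich in harmonics, those nulls and peaks are what you hear as the phasing changing when you add or remove instances.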
- Tiefflieger Rüdiger
- Posts: 18
- Joined: 05 Jun 2024
I wonder why only the effects version reports the latency of 64 samples. The normal version reports a latency of zero samples (Reaper 7/VST3).
Sources, such as synths or drum machines, don’t have latency. The MIDI signal from outside the DAW can have latency, and the output of the plugin can have latency added. But a source doesn’t have latency itself.
Take an RRP with ReDrum playing a pattern - where is the latency going to come from?
With an audio plugin you have to take an input, process it, and deliver an output. This takes time. If it’s done fast enough you can have the data ready to go before the next sample is expected, so effectively no delay. But if it takes more time than the current sample rate allows, you introduce latency. Make sense?
With regards to RRP as an effect, I can stack multiples in LUNA and not cause any latency. The only time latency is introduced is if the RE I’m using intentionally introduces it, since this is not (always?) reported to the host.
Selig Audio, LLC
- Tiefflieger Rüdiger
- Posts: 18
- Joined: 05 Jun 2024
Your arguments make sense from a DSP POV. But since the Reason Rack plugin has audio ins for both versions, the normal version and the effect version, I think from a design point of view this is a weird choice.
Also, I don't think that simple effects like the DDL actually require any latency. It seems the devs took a shortcut here to avoid calculating the total latency of the device graph, but I'm only guessing as to why they report a static latency of 64 samples for the effect version.
That's surprising. With Reaper 7 and the VST3 and AU versions of Reason 13, even an empty rack of the Reason Rack effect plugin reports a latency of 64 samples. I have no idea how LUNA works, but maybe it hides the latency of plugins until it goes beyond the current buffer size?
I might be wrong but I think the reported latency refers to the latency of the rendered audio. So you can process the audio stream in blocks of any size (buffer size) but the audio going through effects that add latency will always be delayed regardless of the buffer size.
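Pure speculation on my part, but one way to end up with a constant 64 samples regardless of buffer size is if the plugin processes audio internally in fixed 64-sample blocks and therefore has to hold back one block's worth of input. A toy model of that behaviour, not Reason's actual code:

```python
# Toy model (not Reason's actual implementation): a processor that works in
# fixed 64-sample internal blocks must buffer input, so its output is always
# 64 samples behind the input no matter what host buffer size feeds it.
from collections import deque

INTERNAL_BLOCK = 64

class FixedBlockProcessor:
    def __init__(self):
        # Pre-fill with one internal block of silence = the reported latency.
        self.fifo = deque([0.0] * INTERNAL_BLOCK)

    def process(self, host_buffer):
        out = []
        for sample in host_buffer:
            self.fifo.append(sample)       # identity processing, just modelling the delay
            out.append(self.fifo.popleft())
        return out

    @property
    def reported_latency(self):
        return INTERNAL_BLOCK

proc = FixedBlockProcessor()
impulse = [1.0] + [0.0] * 255
out = []
for start in range(0, len(impulse), 32):   # host buffer of 32 samples
    out.extend(proc.process(impulse[start:start + 32]))
print(out.index(1.0))                      # -> 64, independent of the host buffer size
```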
In my testing every instance of RRPFX reports and also adds a minimum of 64 samples of latency. And it goes up from there depending on what you put in the rack.
If you are just using multiple Racks on different tracks in parallel, I wouldn't think it would be a big issue. I think 64 samples adds about a millisecond.
chimp_spanner wrote: ↑05 Aug 2024 I'll have to do some tests with this to see if Logic's audio monitoring/low latency mode is enough to compensate, and how much it affects MIDI input when recording drums etc.
The biggest issue I would see is if you need to create an effect chain with a Reason effect, then a native effect, and then another Reason effect in series. This would quickly create issues.
It would be nice if they could eliminate the 64 sample latency in the effect, as I could see people reaching for other tools in live audio situations where they want to minimize latency.
For casual home studio work though this is probably "ok" but yes it could be better.
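For reference (just the conversion, nothing Reason-specific), 64 samples works out to a bit over a millisecond at common sample rates:

```python
# 64 samples of latency expressed in milliseconds at common sample rates.
for sr in (44100, 48000, 96000):
    print(f"{sr} Hz: {64 / sr * 1000:.2f} ms")   # ~1.45 ms, ~1.33 ms, ~0.67 ms
```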
Yeah, my desired use case would be at least one per track for the SSL channel strip, probably some bus compression on drums and other groups of tracks, and then a couple of effect sends (reverb/delay). So the baseline there is the buffer + 192 additional samples, even before I need to do what you said in a single track. This is very, very noticeable when compared to just the 32 sample buffer latency that I get normally, especially when tracking guitars through virtual amps, and it's the reason I don't use the FX plugin at all.
But every DAW tested here seems to easily compensate for this latency, so I’m not sure I’m understanding the problem here?
Selig Audio, LLC
That is a different issue - I always remove stuff with latency when recording (or don’t add it in the first place until I’m done recording).
I’ve played drums professionally my entire life, I can’t “feel” latency until it gets many milliseconds long (MANY!).
I guess I’m also used to playing synths with slow attack settings, or punching in on older analog machines where you have to ‘anticipate’ the timing a tiny bit. The only time latency really prevents me from playing well is when I accidentally leave mastering on a song I’m re-visiting to add something new, which typically means a huge amount of latency!
When exactly do you feel it is NECESSARY to add latency, and why do you feel Reason’s latency is unnecessary?
Selig Audio, LLC
This definitely depends on the person. I'm very sensitive to latency and I feel it the most when playing guitar. I sure can adjust my playing and adjust the timing, but it's less fun/satisfying. With synths it's less problematic because you're skipping the ADC latency and the actual overall latency is always lower than when monitoring audio sources. I also disable all the mastering effects while tracking, otherwise it's really bad!
I consider latency from compressors with lookahead and similar *necessary* latency. But an empty Reason Rack FX shouldn't add any latency, nor should the basic processing I'm actually interested in using (SSL channel strips, simple compressors, delays, reverbs...). That's why I'm saying it's unnecessary, and why in practice I use other plugins that do the same without adding any latency. But it's obvious most people don't care about it at all, so I doubt it will be fixed any time soon.
Someone pointed out a potential point of miscommunication.
There are two types of latency to be concerned with:
1. The number of samples after the input at which an impulse appears in the output. So if an impulse is fed to the input at sample 0, a latency of 64 frames means that the first 64 samples of the output are 0, and the impulse appears in the output at frame 64.
2. The amount of time it takes to process the signal (but I would much rather call this turnaround time).
As I understand it, effects report latency (not turnaround time), since it's the latency that is compensated for. The turnaround time only affects how much CPU time is left for further processing within the DSP processing window.
I suspect Reason is misreporting its latency (perhaps with a pending task to report latency changes that they just haven't gotten around to), as I can't see any good reason for its latency to be more than 0 with an empty rack (especially if they don't do it for instruments). Does it change if you stack rack devices with latency? I'll have a look when I get a chance.
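If anyone wants to check type 1 directly, here's a rough sketch of the idea in plain Python: render an impulse through the effect, then find how far it shifted. A fixed 64-sample delay stands in for RRP here since the plugin can't be driven from a script; with a real test you'd load the rendered file instead.

```python
# Sketch of verifying type 1 latency from a rendered file: feed an impulse
# through the effect, then measure how many samples later it comes out.
# A plain 64-sample delay stands in for the plugin (hypothetical stand-in).
import numpy as np

def measure_latency(dry, wet):
    """Lag (in samples) at which `wet` best lines up with `dry`."""
    corr = np.correlate(wet, dry, mode="full")
    return int(np.argmax(corr) - (len(dry) - 1))

dry = np.zeros(1024)
dry[0] = 1.0                                      # impulse at sample 0
wet = np.concatenate([np.zeros(64), dry[:-64]])   # stand-in: 64-sample delay

print(measure_latency(dry, wet))                  # -> 64
```

Comparing that measured number against what the plugin reports would settle whether the 64 samples are real or just a reporting quirk.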
Are you hearing latency with stacked empty racks, or is it more a case that you are concerned about what's being reported? Because they might have been a little lazy on correctly reporting latency, so if you've not heard any latency with empty racks, it will be worth testing it in case RRP is misreporting.
Each rack device with latency (e.g. Softube Amp) does add to the base 64 samples of latency reported. It's not just reporting a constant 64.
That said, I don't know whether those 64 samples of base latency reported reflect the actual behavior of the plugin. I'm not sensitive enough to latency given my typical setup and workflow to notice it.
It's worth noting that even if RRP is misreporting the 64 samples with an empty rack, it could still lead to perceived latency on other tracks since they will have latency compensation applied accordingly.
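To make that concrete with made-up numbers: if an instance reports 64 samples but actually delays the audio by 0, the host still delays every other track by 64 samples to "compensate", so the track with the plugin ends up early relative to the rest.

```python
# Toy numbers (hypothetical, not measurements of RRP): a misreported latency
# still shifts the other tracks, because PDC acts on the reported value.
reported = 64   # latency the plugin tells the host
actual = 0      # real delay through the plugin (hypothetical)

other_tracks_delay = reported          # PDC pads the other tracks by the reported value
skew = other_tracks_delay - actual     # how far ahead the effect track ends up
print(f"the track with the plugin plays {skew} samples ahead of the rest")
```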
I would argue that the correct way of reporting latency is what they do in the instrument plugin.
This could be an unreported bug.
The only "good" reason I could see for this is a logical conflict with VST and Jukebox messages that force effects plugins to be a 64-frame batch behind to feed some value/message to Jukebox or back to the VST, otherwise it sounds like a bug.
Worth reporting to them.
The only "good" reason I could see for this is a logical conflict with VST and Jukebox messages that force effects plugins to be a 64-frame batch behind to feed some value/message to Jukebox or back to the VST, otherwise it sounds like a bug.
Worth reporting to them.