Latency: is there something I am missing?

Want to talk about music hardware or software that doesn't include Reason?
mimidancer
Posts: 669
Joined: 30 Sep 2021

02 Jan 2024

If you know me from here, you know I record a lot. Much of what I record comes from hardware synths, bass, guitar, and drums. In Reason, my tracks have very low latency; that said, I can still feel it. To work around it, I count everything in with the metronome and nudge the takes into place by hand after recording. Is there a way I can calculate the latency and have Reason just put the audio in the right spot? It is not a huge deal, but I can't help thinking I am doing something wrong. Thanks in advance, smart people.
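
(One way I could presumably calculate the raw offset myself is a loopback test: play a short click out of the interface, record it straight back in through a cable, and measure how far the recorded transient lands from the original. A rough sketch of that measurement, with placeholder file names and numpy/soundfile assumed; it is not a Reason feature.)

# Loopback latency measurement sketch.
# Assumes two short exported clips: the click as played, and the same
# click as re-recorded through a physical loopback cable.
import numpy as np
import soundfile as sf

source, sr = sf.read("click_source.wav")          # placeholder file name
recorded, sr_rec = sf.read("click_recorded.wav")  # placeholder file name
assert sr == sr_rec, "both clips must share one sample rate"

# Collapse stereo to mono so the cross-correlation stays one-dimensional.
if source.ndim > 1:
    source = source.mean(axis=1)
if recorded.ndim > 1:
    recorded = recorded.mean(axis=1)

# Cross-correlate to find how many samples late the recording lands
# (keep the clips short, a second or two, so this stays fast).
corr = np.correlate(recorded - recorded.mean(), source - source.mean(), mode="full")
lag = int(corr.argmax()) - (len(source) - 1)
print(f"offset: {lag} samples = {1000.0 * lag / sr:.2f} ms")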

Jackjackdaw
Posts: 1400
Joined: 12 Jan 2019

03 Jan 2024

I am keen to understand this better as well. I am recording sequences from my hardware sequencers into Reason. With Reason as the master clock and the MIDI coming in, I just hit record and it records in sync but 9 ms off the grid. I'm wondering if I can set something to compensate for that 9 ms without having to post-quantise it.

PhillipOrdonez
Posts: 3760
Joined: 20 Oct 2017
Location: Norway

03 Jan 2024

Can't help you with that, but latency is the reason the hybrid approach sucks! My solution is to record to an external mixer that's MIDI-synced; that way I don't worry about any of that latency nonsense 🤷‍♂️

selig
RE Developer
Posts: 11747
Joined: 15 Jan 2015
Location: The NorthWoods, CT, USA

03 Jan 2024

@Jackjackdaw: look into the Sync tab under Preferences and use the "Output Offset" to bring things closer; the amount of offset will vary with the size of the current audio buffer.

@mimidancer: much of my work is recording audio instruments, including drums/percussion, bass, guitar, and my hardware keyboards/synths. Not sure I have a solution, but you can try adjusting the Recording Latency Compensation by the amount you are moving your tracks after recording.
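
If it helps with dialing that in, the millisecond/sample conversion is simple (a quick sketch; whether the compensation field expects milliseconds or samples is something to confirm in your own version rather than an assertion on my part):

# Convert between a measured nudge and a compensation value.
# Which unit the compensation field expects (ms vs. samples) is an
# assumption to verify against your own Reason version and driver.

def ms_to_samples(ms, sample_rate_hz):
    return round(ms * sample_rate_hz / 1000.0)

def samples_to_ms(samples, sample_rate_hz):
    return 1000.0 * samples / sample_rate_hz

print(ms_to_samples(9, 48000))    # the 9 ms offset mentioned above is ~432 samples
print(samples_to_ms(256, 44100))  # a 256-sample nudge is ~5.8 ms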

As a side note, my personal solution to this issue has me giddy - I moved up to Apollo interfaces and their latency is <2-3 ms (pretty equal to the first digital machine I worked on, the 3M 32 track). I stopped monitoring in Reason and monitor in the Console app (or just record in LUNA!). I can use the API and Neve preamps (currently loving the Neve on my drum kit) etc. on the way in with the Unison preamps in the Apollos, or record through my hardware LA610 or 6176, and the important thing is I don't have to adjust latency in Reason – I can use ANY buffer size in Reason and even add EQ/compression when tracking as I'm used to doing, and all the while my live input latency never changes. Everything lines up nicely in Reason no matter if I'm at 64 samples or 2048 samples! Even MIDI feels tighter because of the reduced audio latency, a win all around. :)
Selig Audio, LLC

avasopht
Competition Winner
Posts: 3948
Joined: 16 Jan 2015

03 Jan 2024

selig wrote:
03 Jan 2024
As a side note, my personal solution to this issue has me giddy - I moved up to Apollo interfaces and their latency is <2-3 ms (pretty equal to the first digital machine I worked on, the 3M 32 track). I stopped monitoring in Reason and monitor in the Console app (or just record in LUNA!). I can use the API and Neve preamps (currently loving the Neve on my drum kit) etc on the way in with the Unison pre amps in the Apollos, or record through my hardware LA610 or 6176, and the important thing is I don't have to adjust latency in Reason – I can use ANY buffer size in Reason and even add EQ/compression when tracking as I'm used to doing, and all the while my live input latency never changes. Everything lines up nicely in Reason no matter is I'm at 64 samples or 2048 samples! Even MIDI feels tighter because of the reduced audio latency, a win all around. :)
Not sure if you've heard the latest, but apparently with the Apple M series of CPUs, lower buffer sizes can offer better performance and stability.

mimidancer
Posts: 669
Joined: 30 Sep 2021

03 Jan 2024

selig wrote:
03 Jan 2024


As a side note, my personal solution to this issue has me giddy - I moved up to Apollo interfaces and their latency is <2-3 ms (pretty equal to the first digital machine I worked on, the 3M 32 track). I stopped monitoring in Reason and monitor in the Console app (or just record in LUNA!). I can use the API and Neve preamps (currently loving the Neve on my drum kit) etc on the way in with the Unison pre amps in the Apollos, or record through my hardware LA610 or 6176, and the important thing is I don't have to adjust latency in Reason – I can use ANY buffer size in Reason and even add EQ/compression when tracking as I'm used to doing, and all the while my live input latency never changes. Everything lines up nicely in Reason no matter is I'm at 64 samples or 2048 samples! Even MIDI feels tighter because of the reduced audio latency, a win all around. :)
I use a PreSonus Studio 1824c with a Boredbrain Optx ADAT interface. I am pretty sure I am monitoring through Reason, but I can monitor directly through the 1824. I do use a fair amount of FX in Reason, so direct monitoring sounds different. The minor latency during tracking is not an issue, but my notes being off the grid drives me insane. Can I not correct it so that it shows on the grid? Should I just keep nudging?
Last edited by mimidancer on 03 Jan 2024, edited 1 time in total.

selig
RE Developer
Posts: 11747
Joined: 15 Jan 2015
Location: The NorthWoods, CT, USA

04 Jan 2024

mimidancer wrote:
03 Jan 2024
selig wrote:
03 Jan 2024


As a side note, my personal solution to this issue has me giddy - I moved up to Apollo interfaces and their latency is <2-3 ms (pretty equal to the first digital machine I worked on, the 3M 32 track). I stopped monitoring in Reason and monitor in the Console app (or just record in LUNA!). I can use the API and Neve preamps (currently loving the Neve on my drum kit) etc on the way in with the Unison pre amps in the Apollos, or record through my hardware LA610 or 6176, and the important thing is I don't have to adjust latency in Reason – I can use ANY buffer size in Reason and even add EQ/compression when tracking as I'm used to doing, and all the while my live input latency never changes. Everything lines up nicely in Reason no matter is I'm at 64 samples or 2048 samples! Even MIDI feels tighter because of the reduced audio latency, a win all around. :)
I use PreSonus Studio 1824c with a bordbrain optx adat interface. I am pretty sure I am monitoring through reason. but can monitor directly through the 1824. I do use a fair amount of efx in reason so direct monitoring sounds different. The minor latency during tracking is not an issue. but my notes being off on the grid drives me insane. can I not correct it so that it will show on grid? Should I just keep nudging?
Did you try adjusting Recording Latency Compensation?
Selig Audio, LLC

sublunar
Posts: 507
Joined: 27 Apr 2017

06 Jan 2024

You say you have "low latency" but also say you can "still feel it"... so it must not be all that low.

You also haven't said what your settings are, so it's difficult to say what the specific issue is, but the first thing you should check is, obviously, the buffer size in your Audio settings.

Incidentally, you have the same audio interface (1824c) that I'm using. Most of what I record is live instruments, but I also use Arturia (and Jiggery Pokery) virtual organs/synths. And I've never had a latency issue that wasn't fixed by keeping the buffer size as low as possible.

For reference, all of my projects using the 1824c are:

48 kHz @ 64 samples, resulting in 4 ms input and 2 ms output latency, which is, for all intents and purposes, "zero" latency.
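
To sanity-check those numbers: the raw per-buffer maths is just buffer size divided by sample rate (a quick sketch; the extra milliseconds the driver reports on top of that are converter/safety-buffer overhead, which varies by interface):

# Per-buffer delay = buffer size / sample rate.
# Anything the driver reports beyond this, per direction, is converter
# and safety-buffer overhead on top of the audio buffer itself.
for buffer_size, rate in [(64, 48000), (128, 48000), (1024, 44100)]:
    ms = 1000.0 * buffer_size / rate
    print(f"{buffer_size} samples @ {rate} Hz ~= {ms:.2f} ms per buffer")
# 64 @ 48000 ~= 1.33 ms, suggesting the reported 4 ms in / 2 ms out figures
# are mostly converter/driver overhead stacked on a very small buffer.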

I'll record a guitar track over some MIDI drums one minute and the next I'll record new MIDI drums via my E-drum kit which is controlling Superior Drummer inside of Reason and it all works seamlessly without any noticeable latency.

-

While we're on this topic... (and I've made this argument before in the relevant "Reason Benchmark" threads):

If you use the settings from those Reason Benchmark threads as your project settings, you're going to have a bad time with latency. I have no idea why anyone would use 44.1 kHz @ 1024 samples, with 25 ms of latency, as any sort of "Benchmark".

I've gone through numerous audio interfaces over the years, and what I've found is that basically ANY crappy sound card/interface can record/play back reasonably well at 44.1 kHz with 1024 samples. Pretty sure I was achieving that back in the Turtle Beach sound card days. Those settings are in no way taxing to modern systems. As a result, the person who made those benchmark files had to make them CRAZY HUGE (and those files are themselves taxing on a system just to open). Those settings are a very poor measuring stick to use, IMHO.

In my opinion, zero latency is "the Benchmark" to strive for. Specifically: <10 ms of latency at 48 kHz.
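
Put another way, a 10 ms budget roughly dictates the buffer sizes you can get away with (a sketch that only counts one buffer's worth of delay; real round-trip latency adds the other direction plus converter overhead, so treat the result as an upper bound):

# Largest power-of-two buffer whose single-buffer delay fits the budget.
def max_buffer_under_budget(budget_ms, sample_rate_hz):
    budget_samples = budget_ms * sample_rate_hz / 1000.0
    buffer_size = 32
    while buffer_size * 2 <= budget_samples:
        buffer_size *= 2
    return buffer_size

print(max_buffer_under_budget(10, 48000))  # 256
print(max_buffer_under_budget(10, 44100))  # 256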

A side effect of using a higher sample rate with lower latency is that it wouldn't require anywhere near the size of those previous benchmark files to get results in the form of actionable data.

I realize most Reason users seem to fall under the umbrella of "electronic" music involving primarily virtual instruments/synths etc., so maybe latency doesn't affect those types of projects nearly as much? Maybe 44.1 kHz @ 1024 samples was chosen because most systems can be configured that way, so it's more guaranteed that users can at least share a common starting point? IDK. But for those of us who record live instruments primarily, my systems live or die by their latency, and 1024 samples / 25 ms of latency is completely useless and tells me nothing about how a system will perform at the settings necessary for recording and jamming along with live instruments.

Yes, I realize I could make my own Benchmark thread and specify my own settings as above, instead of just b!tching about it from afar, but like... that sounds like a lot of work, man. And I also choose to discuss it this way because I don't know everything and maybe I'm missing something obvious, so... why go through that much work if I'm just an idiot?

Either way: zero latency or bust!

I know your interface can do it because it's what I use. The only caveat is that maybe you have some weird chipset/driver issue going on. That happened to me on my last system: no matter what I did, the audio performance was not great, and that was with a current CPU/motherboard. Once I had tried everything, including pulling my hair out and curling up in the fetal position crying on the floor, I got a different CPU/mobo and voilà! Same interface, but significantly better results.

mimidancer
Posts: 669
Joined: 30 Sep 2021

16 Jan 2024

selig wrote:
04 Jan 2024
Did you try adjusting Recording Latency Compensation?
I did, but it is still off.

mimidancer
Posts: 669
Joined: 30 Sep 2021

16 Jan 2024

sublunar wrote:
06 Jan 2024
You say you have "low latency" but also say you can "still feel it".. so it must not be all that low.

You also haven't said what your settings are so it's difficult to say what the specific issue is, but the first thing you should check is, obviously, your Audio settings Buffer Size.

Incidentially, you have the same audio interface (1824c) that I'm using. Most of what I record is live instruments but I also use Arturia (and Jiggery Pokery) virtual organs/synths. And I've never had a latency issue that wasn't fixed by keeping the buffer size as low as possible.

For reference, all of my projects using the 1824c are:

48K @ 64 samples resulting in 4ms input and 2ms output latency. Which is for all intents and purposes, "zero" latency.

I'll record a guitar track over some MIDI drums one minute and the next I'll record new MIDI drums via my E-drum kit which is controlling Superior Drummer inside of Reason and it all works seamlessly without any noticeable latency.

-

While we're on this topic.. (and I've made this argument before in the relevant "Reason Benchmark" threads):

If you use those Reason Benchmark threads as your project settings, you're going to have a bad time with latency. I have no idea why anyone would use 44.1 @ 1024 samples with 25ms latency as any sort of "Benchmark".

I've gone through numerous audio interfaces over the years and what I've found is that basically ANY crappy soundcard/interface can record/playback reasonably well at 44.1 with 1024 samples. Pretty sure I was achieving that back in the turtle beach sound cards days. Those settings are in no way taxing to modern systems. As a result, the person who made those benchmark files had to make them CRAZY HUGE (which are themselves taxing on a system just to even open them). Those settings are a very poor measuring stick to use. IMHO.

In my opinion: zero latency is "the Benchmark" to strive for. Specifically: <10ms latency at 48K.

The side effect of using a higher sampling rate with lower latency means that it wouldn't require anywhere near the size of those previous benchmark files to get results in the form of actionable data.

I realize most Reason users seem to fall under the umbrella of "electronic" music involving primarily virtual instruments/synths etc so maybe latency doesn't affect those types of projects nearly as much? Maybe 44.1 @ 1024 samples was chosen because most systems can be thus configured and it's more guaranteed that more users can at least use that as a common starting point? IDK. But for those of us who record live instruments primarily, my systems live or die based on their latency and 1024 samples/25ms of latency is completely useless and tells me nothing about how that system will perform at the levels necessary to facilitate recording/jamming along with live instruments.

Yes, I realize I could make my own Benchmark thread and specify my own settings as above, instead of just b!tching about it from afar, but like.. that sounds like a lot of work, man. And I also choose to discuss it this way because I don't know everything and maybe I'm missing something obvious so like... why go through that much work if I'm just an idiot?

Either way: zero latency or bust!

I know your interface can do it because it's what I use. The only caveat is maybe you have some weird chipset/driver issue going on. That happened to me on my last system; no matter what I did, the audio performance was not great and that was with a current CPU/Motherboard. Once I tried everything including pulling my hair out and curling up in the fetal position crying on the floor, I got a different CPU/MOBO and voila! Same interface but significantly better results.
it is between 7 and 11 seconds. And I can feel it. I will just keep nudging my tracks.

sublunar
Posts: 507
Joined: 27 Apr 2017

28 Mar 2024

mimidancer wrote:
16 Jan 2024

it is between 7 and 11 seconds. And I can feel it. I will just keep nudging my tracks.
What are your sample rate and buffer size?

You never mentioned your settings, and those settings are what directly affect your latency. I would never settle for nudging my tracks to fix something that shouldn't be happening in the first place with the correct settings applied.

I have the same interface as you, and running at 48 kHz @ 64 samples, I have zero noticeable latency.

mimidancer
Posts: 669
Joined: 30 Sep 2021

29 Mar 2024

sublunar wrote:
28 Mar 2024
mimidancer wrote:
16 Jan 2024

it is between 7 and 11 seconds. And I can feel it. I will just keep nudging my tracks.
What is your sample rate and size?

You never mentioned your settings and those settings are what directly affect your latency. I would never settle with nudging my tracks to fix something that should not be happening to begin with, with the correct settings applied.

I have the same interface as you and running at 48K @ 64 samples, I have zero noticeable latency.
I meant milliseconds, not seconds. I can feel it, so I nudge the tracks. I just wish it would auto-nudge them for me, but I can't figure out how. Thanks for trying to help.

miyaru
Posts: 626
Joined: 28 Oct 2019
Location: Zaanstad, The Netherlands

29 Mar 2024

To avoid "feeling" latency, I use my old Lexicon MPX 1 in the monitor path. I use a Focusrite Scarlett 18i20 gen 3.

This way I can provide some FX for myself or guest musicians or vocalists. I patched in the MPX 1 digitally, so monitor quality is as good as it gets.

I use a buffer setting of 64 samples @ 44.1 kHz. I never noticed the benefits of 48 kHz, but I'm slowly starting to lose my hearing at some frequencies…

Anyway, this method avoids latency while tracking. The FX might be a bit different from the ones used in the final mix, but hassle-free tracking gives me so much peace of mind that I will continue to use it like this.

A unit like a Lexicon MPX 1 can be found second-hand for less than €300. Its cheaper siblings can be found for even less money, but it depends on what one needs.
Greetings from Miyaru.
Prodaw i7-7700, 16Gb Ram, Focusrite Scarlett 18i20 3rd gen, ESI M4U eX, Reason12, Live Suit 10, Push2, Presonus Eris E8 and Monitor Station V2, Lexicon MPX1,
Korg N1, Yamaha RM1x :thumbup:
