Hi folks,
I've been using an ancient 24" Dell 1080p monitor for the past several years and decided it was finally time to upgrade to something larger and sharper. I bought a well-reviewed Asus 4K monitor, figuring I might as well make use of the hi-def graphics in R12 and just have everything on the screen look... nicer? I don't use the computer for gaming or anything, so refresh rates didn't seem super important. Honestly, anything would be an improvement over what I have.
So, fast-forward to today: I hook the monitor up and projects that were playing fine yesterday are stuttering, glitching, and generally not working. DSP meter completely redlined, "disk overload" light on, the works. Adjusting the display scaling in System Preferences didn't help much, if at all.
My computer is a few years old but I'm genuinely surprised at the full system meltdown that happened.
I've got a 2018 Mac mini running Mojave (10.14.6), 3.2 GHz 6-core i7, 32 GB RAM, lots of space available on the drive. I feel like it should be able to handle whatever I throw at it within reason - for example, a project with about 26 tracks of audio (drum kit with 10 mics, some guitars, bass, vocals, etc.) and a reasonable number of REs/VSTs (2-3 per channel) plays with no issues on the 15-year-old Dell monitor.
What was truly surprising was seeing the difference in CPU load for individual devices with the different monitors plugged in. Ekssperimental's 295 EQ was one of the worst offenders - it went from about 7% CPU load (surprising for an EQ!!) to around 15%.
I'm pretty far out of my depth on these things, but my understanding was that Reason uses the CPU rather than the GPU to process graphics, which I imagine means that things like the animated reel of tape on the Tapefunk M10 affect overall performance and drag the system down as a whole. It also seems like there's no way to turn down the graphics inside Reason so they eat up less CPU.
A couple of quick Qs -
Have others experienced such radical downgrades in performance when switching monitors?
Should I expect R12 to run ok on a newer 1080p or 1440p monitor? I'd really like to move to something bigger than a 24"...
Does R12 run more efficiently on a newer Apple OS?
I don't mind returning the Asus and getting something less fancy, but I'm loath to go through this again.
It'll be at least another year or two before I buy a new desktop computer, and I really love using Reason. Any insight you kind folks have on optimizing what I have (and getting a monitor that doesn't make R12 poop its pants) would be truly appreciated!!
Reason actually uses GPU rendering as of version 12.
In the macOS display settings, if you option-click "Scaled", are you able to choose any low-resolution options? (as shown here). That might help narrow down the cause. Not sure exactly how it looks in Ventura's new settings app.
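If you want to see exactly which modes the display is advertising (including the low-resolution ones the option-click trick exposes), here's a rough sketch using the CoreGraphics display-mode API - purely illustrative, and the file name is just an example:

```swift
import Foundation
import CoreGraphics

// List every mode the main display advertises, including the low-resolution
// (non-Retina) ones that option-clicking "Scaled" reveals in the display settings.
let displayID = CGMainDisplayID()
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary

if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
    for mode in modes {
        // "points" is the logical size the UI reports; "pixels" is what the GPU actually renders.
        print("\(mode.width) x \(mode.height) points -> \(mode.pixelWidth) x \(mode.pixelHeight) pixels @ \(Int(mode.refreshRate)) Hz")
    }
}
```

Run it with `swift modes.swift`. The scaled "Retina" modes are the ones where the pixel size is much bigger than the point size - those are the ones that make the GPU work hardest.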
The 4K display may just be demanding too much of the GPU, so instead of thrashing the CPU you’re thrashing the GPU? I’m surprised though, but then shared GPU utilization has always been a black art, and now it’s being used for number crunching too…?
Software: Reason 13 + Objekt, Vintage Vault 4, V-Collection 9 + Pigments, Vintage Verb + Supermassive
Hardware: M1 Mac mini + dual monitors, Launchkey 61, Scarlett 18i20, Rokit 6 monitors, AT4040 mic, DT-990 Pro phones
I experienced something like that too. Try reducing the resolution to something lower than 4K and reducing the refresh rate.
Get more Combinators, Patches and Resources at the deeplink website
This all makes sense - I was thinking about the problem from the wrong direction yesterday. The display is maxing out the anemic Mac mini GPU, which in turn taxes the CPU, which means there's less CPU available for R12, which means the load per device goes up as *a percentage* of what's available. I was hung up on the idea that the monitor was *increasing* R12's CPU demands, rather than *decreasing* the available CPU while R12's demands stay the same.
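Just to put rough numbers on it (purely back-of-the-envelope, and assuming the DSP meter shows load relative to whatever CPU headroom Reason can actually get): if that EQ needs a fixed ~7 units of work and Reason has 100 units available, the meter reads 7%. If the display pipeline eats roughly half of that headroom, the same 7 units out of ~50 available reads as ~14-15% - which is pretty much the jump I saw.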
Upgrading to a higher resolution monitor, such as 4K, can put additional strain on your computer's resources, including the CPU and graphics processing. The increased pixel count requires more processing power, potentially affecting overall system performance. It's not uncommon to experience a performance decrease when moving to a higher resolution display.
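For a rough sense of scale: 1080p is 1920 x 1080, about 2.1 million pixels, while 4K (UHD) is 3840 x 2160, about 8.3 million - four times as many pixels to render every frame.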