Props to the author for putting in what looks like a ton of work trying to navigate this issue; it's a shame they have to go to these lengths to even have their case considered.
I went to hell and back trying to get PIP/PBP monitors on my 57" G9 ultrawide to work with my M2 Pro. I ended up having to use a powered HDMI dongle, a DisplayLink cable, and DisplayPort, with 3 virtual monitors via BetterDisplay. The "allow resolutions outside of Mac's limitations" setting in BD is what did the trick. I don't envy OP. Having 5120x1440 @ LoDPI was the worst: just ever so slightly too fuzzy but the perfect UI size. Eventually I got a steady 10240x2880 @ 120Hz with HDR. I literally laughed out loud when I read the title of the thread. Poor guy.
Thanks, it was a good portion of my weekend bashing my head against the keyboard trying to figure out what was going on and if there was a workaround I could use (there isn't that I've found).
The post reminded me of how I investigated a similar issue with no prior knowledge. Using Claude or GPT to investigate this kind of hardware issue is fast and easy: it gives you the next command to try, then the next one, and you end up with a similar summary. I wouldn't be surprised if the author didn't know anything about displays before this.
I thought I was going crazy when my new m4 seemed "fuzzier" on my external 4ks. I tried replicating settings from my old MacBook to no avail.
I wonder if Apple is doing this on purpose except for their own displays.
It's a bit nit-picky on my part, but this bizarre world of macOS resolution/scaling handling vs. other operating systems (including Windows 11, for crying out loud) is one of my biggest gripes with using Apple hardware.
I remember having to work hard to make my non-Apple display look 'right' years ago on an Intel-based Mac due to weirdness with scaling and resolutions that a Windows laptop handled without flinching. It was a mix of hardware limitations and the lack of options I had to address the resolution and refresh rates available over a Thunderbolt dock that I shouldn't have to think about.
I honestly hope they finally fix this. I would love it if they allowed sub-pixel text rendering options again too.
This reminds me of this comment, which I feel is a somewhat unsatisfying explanation, given that despite these difficulties, Windows somehow makes it work.
I don't understand what is fuzzy about "lodpi". I've been using it for 8 years on a 4k 43" screen with 1x scaling. Can't say I noticed any difference when switching several times per day between Linux and an M1 MBP, nor any difference when upgrading to an M4 MBP.
Yes, I would actually be surprised to learn that mode is available on any system. I’ve never seen that anywhere, though I only have a M1 Pro and an M4 Pro (and various Intel Macs).
You’re rendering to a framebuffer exactly 2x the size of your display and then scaling it down by exactly half to the physical display? Why not just use a 1x mode then!? The 1.75x limit of framebuffer to physical screen size makes perfect sense. Any more than that and you should just use the 1x mode, it will look better and perform way better!
Then complain about that. That would make a much more sensible blog post and discussion. Asking for a crazy workaround to a sane problem isn't a great way to get good results, especially with Apple. Beyond the obvious performance pitfall, this scale up to scale down approach will also destroy the appearance of some controls. There is some UI that aims for 1px lines on hidpi modes that will get lost if you do this. It's hardly a perfect mode.
The crazy workaround only needs to be done because of what Apple did probably around a decade ago, and they probably already heard a bunch of crying about it and didn't care. No one removed subpixel antialiasing on their own; we do this bullshit because Apple forced us to, just to make text look halfway decent.
I can tell you that inside Apple, they have something called the standard question, and it goes something like this: “What are you really trying to do?”
If you haven’t personally filed a bug report at feedbackassistant.apple.com, I recommend that you do so. Title it something like “Poor text quality on LoDPI display”, file it in the Displays component, and in the description explain what you’re seeing. Here’s the critical part: you want to attach images showing what looks bad and what looks better, and why the current behavior is a regression and since when (earlier macOS versions for subpixel AA, earlier GPUs for 2x 1x mode). If possible, use the same display, but get an image of historical macOS when it had subpixel AA, macOS with this 2x 1x mode, Windows 11, and then current macOS at the standard 1x mode. I’m not sure screenshots will capture it, you’ll probably need to use a camera.
I know how they think at Apple. If you come at them with a bug written like OP’s blog, they are going to say it behaves as designed. To get them to fix something, you have to be descriptive about what the real problem actually is: the text rendering looks bad. Then you have to explain what used to work and what you’ve tried and bring receipts (the images). Don’t write a novel; write the shortest bug that fully describes the real problem, includes all of the relevant information including macOS versions, hardware info, and display model, and the evidence of the problem, but don’t include a bunch of emotional text or extraneous information (like SkyLight framework reverse engineering stuff).
Now you might say, “I’m not Apple’s free QA”, and you’ll be right. But, consider that you’re spending this time complaining about a problem online and you’ve spent good money on a display you’d like to use and it’s not working the way you want. Fair or not, you care about the outcome, and at this point you might as well take my advice and file a strong bug to make your case. Dupes help, OP should file one too, but be descriptive about the real problem, not proscriptive about bringing back the crazy workaround that they likely intentionally disabled because on the face of it, it makes no sense.
I do know that they read user bugs in the Displays component, because I have filed a few in there recently and they got fixed and they followed up with me about where they were fixed.
Yeah, I'm not sure what the point of this article is, or maybe I'm misunderstanding something? There's no such thing as 4K HiDPI on a 4K monitor. That would be 2160p @ 2x on an 8K monitor. 4K at 100% scaling looks terrible in general across every OS.
Yeah. I don't get it. If you've got a 3840x2160 display, intended use on macOS as a 1920x1080@2x display, what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?
(I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)
> what is the advantage of using a 7680x4320 buffer? Everything is drawn at twice the width and height - and then gets scaled down to half the width and height. Is there actually a good reason to do this?
Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.
Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.
Yes but Apple got to drop subpixel anti-aliasing support because this workaround is "good enough" for all of their built-in displays and overpriced mediocre external ones, so we all get to suffer having to render 4x the pixels than we need.
This is not a normal retina configuration. This is a highly unusual configuration where the framebuffer is much larger than the screen resolution and gets scaled down. Obviously it sucks if it used to work and now it doesn't but almost no one wants this which probably explains why Apple doesn't care.
I don’t know why this was downvoted; I agree that this is a highly unusual configuration. Why render to a frame buffer with 2x the pixels in each direction vs the actual display, only to then just scale the whole thing down by 2x in each direction?
Supersampling the entire framebuffer is a bad way to anti-alias fonts. Especially since your font rendering is almost certainly doing grayscale anti-aliasing already, which is going to look better than 2x supersampling alone. And supersampling will not do subpixel rendering.
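To make that concrete, here's a toy numpy sketch (my own illustration, not from the thread) of why plain 2x supersampling is a blunt instrument: a 1-pixel feature that isn't aligned to the 2x grid comes out as 50% grey after the box-filter downscale, while an aligned one survives intact.

```python
import numpy as np

def downscale_2x(hi):
    """Box-filter a 2x framebuffer down to physical resolution."""
    h, w = hi.shape
    return hi.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

hi = np.ones((4, 8))       # 2x framebuffer, white background (1.0)
hi[:, 3] = 0.0             # 1-hi-res-px black line, misaligned to the 2x grid
print(downscale_2x(hi)[0])   # crisp line becomes a 50% grey column: [1. 0.5 1. 1.]

hi2 = np.ones((4, 8))
hi2[:, 2:4] = 0.0          # same line, aligned to a device-pixel boundary
print(downscale_2x(hi2)[0])  # survives the downscale: [1. 0. 1. 1.]
```

A purpose-built grayscale (or subpixel) rasterizer can place that coverage deliberately per glyph, which is why supersampling alone doesn't match it.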
To be frank, it's kind of embarrassing if an entry-level Windows laptop with a decent integrated GPU handles this without much effort.
Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.
In my case it's a standard LG UltraFine 4K monitor plugged into a standard 16" M5 MacBook Pro via standard Thunderbolt (over USB-C); not sure what's not normal about this? I've confirmed it with other monitors and M5 MacBook Pros as well.
In macOS display settings, what scaling mode are you using? This bug appears to only affect 4K monitors that are configured to use the maximum amount of screen space (which makes text look uncomfortably tiny unless you have a very large monitor). Most people run at the default setting which gives you the real estate of a 1080p screen at 2x scale, hence the "not normal" part of this configuration.
Actually, I don't even think it's possible to run HiDPI mode at the native resolution scale from within the macOS Settings app; you'd need something like BetterDisplay to turn it on explicitly.
If you use the middle screen scaling you're given absolutely huge UI elements. That's the case for the built-in 16" screen as well as external displays, but when you get up to 32" displays it's almost comical how large the UI is on the middle/default setting.
Yeah, on larger monitors it's more common to run at the monitor's native resolution without scaling, but even so macOS will not turn on HiDPI mode; you'd still need to do this explicitly via another app. (I didn't even know it was possible to turn on HiDPI mode at native scaling until reading this article.)
I use a 43" 4K TV at the standard non-retina 4K with an M1 Pro. I tried your 8K supersampling but it doesn't seem to improve on the default 4:4:4 8-bit RGB non-retina for me (smoother, but not as crisp outside terminals?).
The TV is unusable without BetterDisplay because of Apple's default negotiation preference. I hope waydabber can figure something out with you.
This is what us proles on third-party monitors have to do to make text look halfway decent. My LG DualUps (~140ppi if I recall) run at 2x of a scaled resolution to arrive at roughly what would be pixel-doubled 109ppi, which is the only pixel density the UI looks halfway decent at. It renders an 18:16 2304 x something at 2x, scaled down by 2.
It's also why when you put your Mac into "More Space" resolution on the built-in or first-party displays, it tells you this could hurt performance. That's exactly what the OS is going to do to give you more space without making text unreadable aliased fuzz: it renders the "apparent" resolution pixel-doubled and scales it down, which provides a modicum of subpixel anti-aliasing's effect. Apple removed subpixel antialiasing a while back, and this is the norm now.
I have a 4K portable display (stupid high density but still not quite "retina" 218) on a monitor arm I run at, as you suggest, 1080p at 2x. Looks ok but everything is still a bit small. If you have a 4K display and want to use all 4K, you have the crappy choice between making everything look terrible, or wasting GPU cycles and memory on rendering an 8K framebuffer and scaling it down to 4K.
I'm actually dealing with this right now on my TV (1080p which is where I'm writing this comment from). My normal Linux/Windows gaming PC that I have hooked up in my living room is DRAM-free pending an RMA, so I'm on a Mac Mini that won't let me independently scale text size and everything else like Windows and KDE let me do. I have to run it at 1600x900 and even then I have to scale every website I go to to make it readable. Text scaling is frankly fucked on macOS unless you are using the Mac as Tim Cook intended: using the built-in display or one of Apple's overpriced externals, sitting with the display at a "retina appropriate" distance for 218ppi to work.
Send an email to Tim Cook. It worked for me for fixing a DisplayPort DSC bug. After Catalina, later macOS versions lost the ability to drive monitors at higher than 60Hz refresh.
Apple support tortured me with all kinds of diagnostics, ending with a WontFix a few weeks later. I wrote the email and it got fixed in Sonoma :)
Fucking with DP 1.4 was how they managed to drive the ProDisplay XDR.
If your monitor could downgrade to DP 1.2 you got better refresh rates than on 1.4 (mine could do 95Hz SDR and 60Hz HDR, but if my monitor said it could only do 1.2, that went to 120/95 on Big Sur and above, when it could do 144Hz HDR on Catalina).
I would be absolutely unsurprised if their fix was to lie to the monitor in negotiation if it was non-Apple and say that the GPU only supported 1.2, and further, I would be also unsurprised to learn that this is related to the current issue.
Ahh, true. I now have 120Hz tops, but it's fine; that's why I said fixed :) I now recall that on Catalina I had full 144Hz and VRR options! The monitor is a Dell G3223Q via a CalDigit TS4 DP.
I was using two 27" LG 27GM950-Bs (IIRC) that could do up to 165Hz and VRR, on a 2019 cheesegrater Mac Pro. It wasn't the cables, or the monitors, or the card.
People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?" and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The ProDisplay XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us" (which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for a long time saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones").
I don't expect emails to get through to busy CEOs of huge companies like Apple unless you're really lucky and they make it through some automation, but I have dropped him an email just in case. I guess you never know.
This was maybe 20 years ago. I was looking for a job as a recruiter and just called him. He referred me to an HR rep and I did get an interview from it. Didn’t get the job, but hey, I got a shot!
I think you'd want to offer more than a problem statement when taking CEO time. Yes, it's broken because shareholders demand M$ products have AI features so that the share price gets the 'AI' multiple.
It's pretty hard to justify the stock price, even with the current high earnings from the cloud, so they are looking for the next golden goose.
I once had a terrible experience dealing with my local Apple Store and then a hostile call with an Apple Retail manager after I left critical feedback.
I emailed Cook, mostly just to shout into the void. Within a week I got a call from Apple Corporate, they gave me an appointment the next day and my hardware issue was suddenly solved over-night.
Well, it sounds like a real issue, but the diagnosis is AI slop. You can see, for example, how it takes the paragraph quoted from waydabber (attributing the issue to dynamic resource allocation) and expands it into a whole section without really understanding it. The section is in fact self-contradictory: it first claims that the DCP firmware implements framebuffer allocation, then almost immediately goes on to say it's actually the GPU driver and "the DCP itself is not the bottleneck". Similar confusion throughout the rest of the post.
Agree. I started reading the article until I realized it wasn’t even self-coherent. Then I got to the classic two-column table setup and realized I was just reading straight LLM output.
There might be a problem but it’s hard to know what to trust with these LLM generated reports.
I might be jaded from reading one too many Claude-generated GitHub issues that look exactly like this that turned out to be something else.
And prior to Apple’s re-entry into the display market, everybody internally was likely on 2x HiDPI LG UltraFine displays or integrated displays on iMacs and MacBooks.
Fractional scaling (and lately, even 1x scaling “normal”) displays really are not much of a consideration for them, even if they’re popular. 2x+ integer scaling HiDPI is the main target.
But to be fair, until last year there were no retina monitors on the market except the Apple ones. In 2025 the tide turned; there are now way more options for both 5K and 6K retina displays.
Tbh I'm not even sure what the issue is here. I have a personal M1 macbook and a work M4 and a 4k display. I don't see any issues or differences between them on my display. The M4 seems to be outputting a 4k image just fine.
The article could just be AI slop, since it contains hyper-in-depth debugging without articulating what the problem is.
Right, I just went through all of the scale options on my M4 with a 4K monitor and none of them rendered blurry. Might be a very situational bug. It doesn't seem as widespread as the title makes it out to be.
- 24" you need 4K.
- 27" you need 5K.
- 32" you need 6K.
Windows subpixel antialiasing (ClearType) manages a lot better with lower pixel density. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really.
This still baffles me. Never mind Windows; I can get sub-pixel font rendering with the ability to fine-tune it on virtually any major Linux distro since around 2010.
Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "hiDPI everywhere" Retina or Retina-like displays. Which would be great...except "everywhere" turned out to be "very specific monitors support specific resolutions".
Totally agree with those resolution suggestions. Personally I have a 32" 4K; I wanted a 5K or 6K back then (just too expensive), but now I wish I had just got a 27", which is better suited to 4K. Regardless, it was a LOT better on the M2 Max with HiDPI working.
OP dances around the key context that this isn’t hidpi, but rather a 3rd party hack that uses hidpi rendering to supersample their “native” 4k resolution by 2x, since the end result looks more pleasing to them than the native 4k render.
It’s actually around 1.5x for the default resolution out of the box and 1.3x for the “More Space” setting on the M1/M2 MacBook Air. 1.1x supersampling on Macs makes it worse, because downsampling to pixel alignment becomes a hot mess.
Thanks for the feedback. I'll try to take some photos; it's not an easy thing to do accurately without a good camera setup, but I'll reply here after work if I get something set up and added to the post.
I'm sure you've already given this a crack via some other technique (I just Cmd-F'd for it and didn't find it), but I have had monitors with confusing EDIDs before that macOS didn't handle well, and the "screenresolution" CLI app https://github.com/jhford/screenresolution always let me set an arbitrary one. It was the only way to get some monitors to display at 100Hz for me, and it worked very well for that since the resolution is mostly sticky.
Hey, thanks. I hadn't tried screenresolution, but it seems to simply set the resolution and refresh rate without controlling the scaling, which is what's needed for configuring HiDPI mode scaling.
Sadly I have the issue on a new M5 Air. I have a 60Hz 4K work monitor and two high-refresh 4K gaming displays. The 60Hz pairs fine with either gaming monitor, but with the two gaming ones together, one just doesn't get recognized. I spent way too long trying new cables before realizing it's a bandwidth limitation.
I use a 4K 32'' Asus ProArt monitor and didn't notice any difference between my M2 Pro and my M4 Pro (on Sequoia). I will admit my eyesight is not the best anymore but I think I would notice given I'm a bit allergic to blurry monitors.
Anyway I will run the diagnostic commands and see what I get.
The ideal work/coding resolutions and sizes for macOS that I would suggest if you are going down this rabbit hole.
24 inch 1080p
24 inch 4K (2x scaling)
27 inch 1440p
27 inch 5K (2x scaling)
32 inch 6K (2x scaling)
Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.
Given that 4k is common in 27/32 inches and those are cheap displays these kinds of problems are expected. I have personally refused to accept in the past that 27 inch 4k isn’t as bad as people say and got one myself only to regret buying it. Get the correct size and scaling and your life will be peaceful.
I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
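The pairings above fall out of simple pixels-per-inch arithmetic. A rough sketch (6016x3384 is the Pro Display XDR's 6K resolution; ~218ppi is the density Apple's own 2x displays target, and ~110ppi is comfortable at 1x):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch from pixel dimensions and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(5120, 2880, 27)))  # 5K 27":   218, clean 2x
print(round(ppi(6016, 3384, 32)))  # 6K 32":   216, clean 2x
print(round(ppi(3840, 2160, 24)))  # 4K 24":   184
print(round(ppi(3840, 2160, 27)))  # 4K 27":   163, awkward middle ground
print(round(ppi(2560, 1440, 27)))  # 1440p 27": 109, clean 1x
```

The 27" 4K case lands between the clean 1x and 2x densities, which is why it forces fractional scaling.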
A 32" 4K display at fractional scaling of 1.5 (150%) is fine for my day-to-day work (Excel, VS Code, Word, web browsing, Teams, etc.). It delivers sharp enough text at an effective resolution of 2560x1440 px. There are many 32" 4K displays that are affordable and good enough for office workers. I work in a brightly lit room, so I find that monitor brightness (over 350 nits) is the most important monitor feature for me, over text sharpness, color accuracy, or refresh rate.
For me, 16-27" at 4K is fine, but as you go up to 32" I'd ideally want 5K or 6K, as the difference is quite noticeable for text (even when HiDPI scaling is working, and across operating systems).
I have dual 27" monitors, both at work and at home. At work, they're 4K monitors, because that's all they have in this size for some reason (LG if it makes a difference). At home, my own monitors are ASUS ProArt 1440p monitors. I run Linux in both places.
I really like my 1440p monitors at home more than the 4K monitors at work. At work, I'm always dealing with scaling and font size issues, but at home everything looks perfect. So I think you're onto something here: 1440p just seems to be a better resolution on a 27" panel.
If you actually care about this stuff you are going to run something like https://github.com/waydabber/BetterDisplay, which easily allows for HiDPI @ 4K resolution; it does not "look bizarre" or "require fractional scaling". This is what the OP is about. I do the same thing: I run native res w/ HiDPI on a 27" 4K screen as my only monitor, and it works great.
Sure, and that is the real tragedy here. The person I'm replying to is just pointing out that native support for high res sucks, which is true, but the real problem is what limits there are on 3rd party support.
That's what I'm pointing out. The person I replied to thinks it does: "I have personally refused to accept in the past that 27 inch 4k isn’t as bad as people say and got one myself only to regret buying it. Get the correct size and scaling and your life will be peaceful. I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions."
TFA doesn't say -- does anyone know if this applies to 5k and 6k monitors? On my 5k display on a M4 Max, I see the default resolution in system settings is 2560x1440. Which is what I'd expect.
If the theory about framebuffer pre-allocation strategy is to hold any water, I would think that 5k and 6k devices would suffer too, maybe even more. Given that you can attach 2x 5k monitors, the pre-allocation strategy as described would need to account for that.
I believe it will; it won't be until you push up to an 8K display that you'll get the old level of scaling back (I could be wrong though, as I don't have a way to test this).
Just another case of Apple intentionally going against established open standards to price gouge their users.
I wouldn't mind it as much if I didn't have to hear said users constantly moaning in ecstasy about just how much better "Apple's way" is.
High quality desktop Linux has been made real by KDE, and the AI-fueled FOSS development boom is accelerating this eclipse of proprietary nonsense like this.
If you're a developer, you should be using a system that isn't maintained by a company that intentionally stabs developers in the back at every turn. (Unless you're into that. U do u.)
This might be a dumb question: Is the author looking to run 4k display at HiDPI 8k framebuffer and then downscale? What's the advantage of doing so versus direct 4k low-DPI? Some sort of "free" antialiasing?
From what I understand, the main goal is to fix the problem that non-native (1:1 pixel mapping) resolutions and scaling look worse than native. This is a problem when you ship high-dpi displays that need UI scaling in order for things to be readable. Apple's solution was to render everything at a higher, non-native resolution so that images were always downscaled to fit the display.
So to oversimplify, Windows can have a problem where if you are running 1.5X scaling so text is big enough, you can't fit 4K of native pixels on a 4K display so videos are blurry. If instead you were rendering a scaled image to a 6K framebuffer and then downscaling to 4K, there would be minimal loss of resolution.
> From what I understand, the main goal is to fix the problem that non-native (1:1 pixel mapping) resolutions and scaling look worse than native.
That would be my instinct as well, but the author seems to be deliberately doing the exact opposite: forcing 2x HiDPI and then downscaling to the native display resolution, whereas they could have just done 1:1 LoDPI rendering. What you get in the end is some equivalent of hacky brute-force smoothing/antialiasing applied in the downsample.
The author said that the problem is that Apple has introduced a size limit for the display (3360x1890) that is lower than the size of the actual display, which is a standard 4k display (3840x2160).
So 1:1 rendering can cover only a part of the screen, while the remainder remains unused.
If the maximum size limit is used but applied to the entire screen, it does not match the native resolution so interpolation is used to convert between images with different resolutions, blurring the on-screen image.
All the attempts were done with the hope that there is some way to convince the system to somehow use the greater native image size instead of the smaller size forced by the limits.
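The blur in the middle case comes down to the non-integer factor between the capped framebuffer and the panel (using the two resolutions quoted above):

```python
from fractions import Fraction

capped = (3360, 1890)   # the limit the author reports hitting
native = (3840, 2160)   # standard 4K panel

ratio = Fraction(native[0], capped[0])
print(ratio)  # 8/7: every 7 framebuffer pixels get stretched across
              # 8 physical pixels, so almost every output pixel is an
              # interpolated blend of neighbours rather than a 1:1 copy.

# The same factor applies vertically, so the whole image is resampled.
print(Fraction(native[1], capped[1]))
```

With a 1:1 or exact 2:1 ratio the scaler can copy or average whole pixels; at 8/7 it can't, hence the fuzz.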
I do not know who was the moron that first used scaling in conjunction with displays having a higher resolution, but this is a non-solution that should have never been used anywhere.
Already more than 35 years ago the correct solution was in use. For text and for graphics, sizes must be specified only in length units, e.g. in typographic points or millimeters or inches, e.g. by configuring a 12-point font for a document or for a UI element. Then the rasterizer for fonts and for graphics renders everything correctly at a visual size that is independent of the display resolution, so it is completely irrelevant whether a display is HiDPI or not.
To combat the effect of rounding to an integer number of pixels, besides anti-aliasing methods, the TTF/OTF fonts have always included methods of hinting that can produce pixel-perfect characters at low screen resolutions, if that is desired (if the font designer does the tedious work required to implement this). Thus there never exists any reason for using scaling with fonts.
For things like icons, the right manner has unfortunately been less standardized, but it should have been equally easy to always have a vector variant of the icons that can be used at arbitrary display resolutions, supplemented by a set of pre-rendered bitmap versions of the icons, suitable for low screen resolutions.
I am always astonished by the frequent discussions about problems caused by "scaling" on HiDPI displays in other operating systems. I have been using only HiDPI displays for more than a dozen years and have had no problems with them, using typefaces that are beautifully rendered at high resolution, because I use X11 with XFCE, where there is no scaling: I just set the true DPI value of the monitors and everything works fine.
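A minimal sketch of the size-in-length-units approach described above, assuming the classic convention of 1pt = 1/72 inch: specify type in points, let the rasterizer derive pixels from the display's true DPI.

```python
def pt_to_px(points, dpi):
    """Convert a typographic size (1pt = 1/72 inch) to device pixels."""
    return points * dpi / 72

# A 12pt font occupies the same physical size regardless of panel density:
print(pt_to_px(12, 96))             # classic desktop DPI: 16.0 px
print(round(pt_to_px(12, 163), 1))  # 27" 4K (~163ppi):    27.2 px
print(round(pt_to_px(12, 218), 1))  # Apple 2x (~218ppi):  36.3 px
```

No scale factor appears anywhere; the DPI value carries all the density information, which is exactly the X11/XFCE setup the comment describes.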
> This aligns with our findings. The M4/M5 DCP firmware implements a conservative framebuffer pre-allocation strategy that:
> Caps the HiDPI backing store to approximately 1.75x the native resolution (6720x3780 for 3840x2160 native), rather than the 2.0x needed for full HiDPI (7680x4320)
So, could that be an off-by-one bug? That might be testable by tweaking the system to think the display supports an even higher resolution.
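If the quoted numbers are right, the cap is easy to sanity-check, and the memory saving versus a full 2x backing store is fairly modest (assuming 4 bytes per pixel):

```python
native = (3840, 2160)

cap_175 = (int(native[0] * 1.75), int(native[1] * 1.75))
full_2x = (native[0] * 2, native[1] * 2)
print(cap_175)  # (6720, 3780): the reported cap, whose 2x "looks like"
                # size is 3360x1890, exactly the limit the author hit
print(full_2x)  # (7680, 4320): what full 2x HiDPI would need

# Backing store sizes at 4 bytes/pixel:
mb = lambda wh: wh[0] * wh[1] * 4 / 2**20
print(round(mb(cap_175)), round(mb(full_2x)), "MiB")  # ~97 vs ~127 MiB
```

So the cap saves on the order of 30 MiB per 4K display, which makes the "conservative pre-allocation" theory at least arithmetically plausible.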
Also, instead of messing with the Display Override Plist, patching drivers, etc, did they try using the “Advanced…” button in the “Displays” UI? They don’t mention they did.
For me (with a 27 inch 4K monitor, not on M4 or M5) that replaces the 5-way choice with a list of 11 choices. With the then-appearing “Show all resolutions” toggle, that becomes 18.
I’m not talking of BetterDisplay. I’m talking about Apple’s UI. Scroll down the content of the “Displays” pane, and you find a row of buttons “Advanced…”, “Night Shift…” and “?”
Unlike the article, I'd assume it's hardware related rather than software.
Assuming the article is correct and the hardware can do 7680x4320 @60Hz, which requires 8GB/s of memory bandwidth, in theory it should be able to read that same memory and interleave every other line for the down-sampling. However, it's possible that the new memory controller can't support 2 simultaneous burst streams (because the 2 lines are 30KB apart in memory), or, if it's doing a single burst and buffering the first line until the second line is available, maybe the cache is smaller than 30KB.
Another possibility is that previously the scaler averaged pairs of pixels horizontally and cached them until the next line was available to average with, and for some reason it was changed to average all 4 at the same time, so the cache isn't sufficient (although that'd be weird, as 25.25KB is a fairly odd size to limit a cache to).
Alternatively, looking at the clock rates needed for the sampler: 3360x1890 @60 is 381MHz, 3840x2160 @60 is 497MHz. It's quite possible that they've lowered the base clock on some hardware and not considered that it'd limit the scaler's maximum rate.
But whatever; IMHO it's unlikely to be a software bug with an easy fix.
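The figures in that comment check out as back-of-envelope arithmetic (assuming 4 bytes per pixel and ignoring blanking intervals, so these are active-pixel numbers only):

```python
def bandwidth_gbps(w, h, hz, bytes_per_px=4):
    """Scanout read bandwidth in GB/s, no blanking overhead."""
    return w * h * hz * bytes_per_px / 1e9

def pixel_clock_mhz(w, h, hz):
    """Active-pixel clock in MHz, no blanking overhead."""
    return w * h * hz / 1e6

print(round(bandwidth_gbps(7680, 4320, 60), 1))  # ~8.0 GB/s to read 8K @60
print(round(pixel_clock_mhz(3360, 1890, 60)))    # ~381 MHz (capped mode)
print(round(pixel_clock_mhz(3840, 2160, 60)))    # ~498 MHz (native 4K)
```

Real link rates are higher once blanking is included, but the ratio between the two scaler clocks is what matters for the lowered-base-clock theory.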
I have a 32:9 Ultrawide I would love to use on macOS but the text looks awful on it.
TheTon | 6 hours ago
wpm | 5 hours ago
TheTon | an hour ago
If you haven’t personally filed a bug report at feedbackassistant.apple.com, I recommend that you do so. Title it something like “Poor text quality on LoDPI display”, file it in the Displays component, and in the description explain what you’re seeing. Here’s the critical part: you want to attach images showing what looks bad and what looks better, and why the current behavior is a regression and since when (earlier macOS versions for subpixel AA, earlier GPUs for 2x 1x mode). If possible, use the same display, but get an image of historical macOS when it had subpixel AA, macOS with this 2x 1x mode, Windows 11, and then current macOS at the standard 1x mode. I’m not sure screenshots will capture it, you’ll probably need to use a camera.
I know how they think at Apple. If you come at them with a bug written like OP’s blog, they are going to say it behaves as designed. To get them to fix something, you have to be descriptive about what the real problem actually is: the text rendering looks bad. Then you have to explain what used to work and what you’ve tried and bring receipts (the images). Don’t write a novel; write the shortest bug that fully describes the real problem, includes all of the relevant information including macOS versions, hardware info, and display model, and the evidence of the problem, but don’t include a bunch of emotional text or extraneous information (like SkyLight framework reverse engineering stuff).
Now you might say, “I’m not Apple’s free QA”, and you’ll be right. But consider that you’re spending this time complaining about a problem online, and you’ve spent good money on a display you’d like to use that isn’t working the way you want. Fair or not, you care about the outcome, and at this point you might as well take my advice and file a strong bug to make your case. Dupes help, so OP should file one too, but be descriptive about the real problem, not prescriptive about bringing back the crazy workaround that they likely intentionally disabled because, on the face of it, it makes no sense.
I do know that they read user bugs in the Displays component, because I have filed a few in there recently and they got fixed and they followed up with me about where they were fixed.
[OP] smcleod | 4 hours ago
armadyl | 7 hours ago
tom_ | 7 hours ago
(I use my M4 Mac with 4K displays, and 5120x2880 (2560x1440@2x) buffers. That sort of thing does work, though if you sit closer than I do then you can see the non-integer scaling. Last time I tried a 3840x2160 buffer (1920x1080@2x), that worked. I am still on macOS Sequoia though.)
kalleboo | 7 hours ago
Text rendering looks noticeably better rendered at 2x and scaled down. Apple's 1x font antialiasing is not ideal.
Especially in Catalyst/SwiftUI apps that often don't bother to align drawing to round points, Apple's HiDPI downscaling has some magic in it that their regular text rendering doesn't.
halapro | 7 hours ago
wpm | 6 hours ago
metabagel | 5 hours ago
tonyedgecombe | an hour ago
TheTon | 7 hours ago
[OP] smcleod | 4 hours ago
wmf | 7 hours ago
sgerenser | 7 hours ago
mlyle | 6 hours ago
eptcyka | 6 hours ago
Rohansi | 5 hours ago
phonon | 6 hours ago
wmf | 6 hours ago
NBJack | 6 hours ago
Apple is free to make its own choices on priority, but I'm disappointed when something that's considered the pinnacle of creative platforms sporting one of the most advanced consumer processors available can't handle a slightly different resolution.
[OP] smcleod | 6 hours ago
petersellers | 5 hours ago
Actually, I don't even think it's possible to run HiDPI mode at the native resolution scale from within the macOS settings app; you'd need something like BetterDisplay to turn it on explicitly.
[OP] smcleod | 5 hours ago
petersellers | 5 hours ago
big_toast | 4 hours ago
The TV is unusable without BetterDisplay because of Apple's default negotiation preference. I hope waydabber can figure something out with you.
wpm | 6 hours ago
It's also why, when you put your Mac into "More Space" resolution on the built-in or first-party displays, it tells you this could hurt performance, because that's exactly what the OS is going to do to give you more space without making text unreadable aliased fuzz: it renders the "apparent" resolution pixel-doubled and scales it down, which provides a modicum of sub-pixel anti-aliasing's effect. Apple removed subpixel antialiasing a while back and this is the norm now.
I have a 4K portable display (stupid high density but still not quite "retina" 218) on a monitor arm I run at, as you suggest, 1080p at 2x. Looks ok but everything is still a bit small. If you have a 4K display and want to use all 4K, you have the crappy choice between making everything look terrible, or wasting GPU cycles and memory on rendering an 8K framebuffer and scaling it down to 4K.
I'm actually dealing with this right now on my TV (1080p which is where I'm writing this comment from). My normal Linux/Windows gaming PC that I have hooked up in my living room is DRAM-free pending an RMA, so I'm on a Mac Mini that won't let me independently scale text size and everything else like Windows and KDE let me do. I have to run it at 1600x900 and even then I have to scale every website I go to to make it readable. Text scaling is frankly fucked on macOS unless you are using the Mac as Tim Cook intended: using the built-in display or one of Apple's overpriced externals, sitting with the display at a "retina appropriate" distance for 218ppi to work.
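The memory cost of that "8K framebuffer for a 4K display" trade-off is easy to put a number on. A rough sketch, assuming 4 bytes per pixel and ignoring any framebuffer compression the GPU may apply:

```python
def framebuffer_mb(width, height, bytes_per_px=4):
    """Raw size of one framebuffer in MiB (assumes 32-bit pixels)."""
    return width * height * bytes_per_px / 2**20

# Render 4K directly vs. render 8K and downscale to 4K:
print(framebuffer_mb(3840, 2160))  # ~31.6 MiB
print(framebuffer_mb(7680, 4320))  # ~126.6 MiB, 4x the memory (and fill cost)
```

The 4x memory (and pixel-fill) cost is per buffer, so double- or triple-buffering multiplies it again, which is why the "More Space" performance warning exists.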
toxik | 5 hours ago
nuker | 7 hours ago
Apple support tortured me with all kinds of diagnostics, only to WontFix it a few weeks later. Wrote an email and it got fixed in Sonoma :)
https://egpu.io/forums/mac-setup/4k144hz-no-longer-available...
nerdsniper | 7 hours ago
FireBeyond | 7 hours ago
Fucking with DP 1.4 was how they managed to drive the ProDisplay XDR.
If your monitor could downgrade to DP 1.2, you got better refresh rates than on 1.4 (mine could do 95Hz SDR / 60Hz HDR, but if my monitor said it could only do 1.2, that went to 120/95 on Big Sur and above; it could do 144Hz HDR with Catalina).
I would be absolutely unsurprised if their fix was to lie to the monitor in negotiation if it was non-Apple and say that the GPU only supported 1.2, and further, I would be also unsurprised to learn that this is related to the current issue.
nuker | 7 hours ago
FireBeyond | 6 hours ago
People at the time were trying to figure out the math of "How did Apple manage to make 6K HDR work over that bandwidth?", and the answer was simply "by completely fucking the DP 1.4 DSC spec" (it was broken in Big Sur, which was released at the same time). The ProDisplay XDR worked great (for added irony, I ended up with one about a year later), but at the cost of Apple saying "we don't care how much money you've spent on your display hardware if you didn't spend it with us". Which tracks perfectly with, I think, Craig Federighi spending so much time and effort shooting down iMessage on Android and RCS for a long time, saying, quote, "It would remove obstacles towards iPhone families being able to give their kids Android phones".
[OP] smcleod | 6 hours ago
MikeNotThePope | 6 hours ago
You could always try calling, too! I cold called Marc Benioff at Salesforce and he actually picked up the phone.
krackers | 6 hours ago
harikb | 6 hours ago
not_your_vase | 5 hours ago
MikeNotThePope | 3 hours ago
nuker | 5 hours ago
sph | 3 hours ago
What if that was all it took.
Temporary_31337 | an hour ago
_diyar | 2 hours ago
I emailed Cook, mostly just to shout into the void. Within a week I got a call from Apple Corporate; they gave me an appointment the next day, and my hardware issue was suddenly solved overnight.
extr | 5 hours ago
arvinsim | 4 hours ago
jwong_ | 3 hours ago
This was also going from Sequoia to Tahoe.
PedroBatista | 7 hours ago
Tim Apple's Apple has been fu#$%&ing me again..
comex | 7 hours ago
xbar | 7 hours ago
As an article, it is not 100% coherent, but there is valid data and a real problem that is clear.
Aurornis | 6 hours ago
There might be a problem but it’s hard to know what to trust with these LLM generated reports.
I might be jaded from reading one too many Claude-generated GitHub issues that look exactly like this that turned out to be something else.
[OP] smcleod | 5 hours ago
whatever1 | 7 hours ago
MBCook | 7 hours ago
They’re likely all on Studio Displays.
cosmic_cheese | 7 hours ago
Fractional-scaling (and lately, even 1x "normal"-scaling) displays really are not much of a consideration for them, even if they're popular. 2x+ integer-scaled HiDPI is the main target.
robertoandred | 7 hours ago
whatever1 | 5 hours ago
But to be fair, until last year there were no retina monitors on the market except the Apple ones. In 2025 the tide turned: there are now way more options for both 5k and 6k retina displays.
Gigachad | 7 hours ago
The article could just be AI slop since it just contains hyper in depth debugging without articulating what the problem is.
whatever1 | 6 hours ago
Gigachad | 6 hours ago
jiveturkey | 6 hours ago
https://bjango.com/articles/macexternaldisplays/
Windows subpixel anti-aliasing (ClearType) manages a lot better with lower pixel density. Since Windows still has a commanding market share in enterprise, you might be right about the industry standard for HiDPI, but for Apple-specific usage, not really.
NBJack | 6 hours ago
Meanwhile, Apple had this but dropped it in 2018, allegedly under the assumption of "HiDPI everywhere", i.e. Retina or Retina-like displays. Which would be great... except "everywhere" turned out to be "very specific monitors supporting specific resolutions".
[OP] smcleod | 6 hours ago
brigade | 4 hours ago
whatever1 | 3 hours ago
brigade | 3 hours ago
raihansaputra | 2 hours ago
brigade | 2 hours ago
whatever1 | an hour ago
When you set it to "more space" it becomes noticeably slower, but not blurry.
ErneX | 2 hours ago
5K at 27 inch or 6K at 32 inch would though, especially on a Mac.
mil22 | 7 hours ago
[OP] smcleod | 6 hours ago
pier25 | 7 hours ago
The article doesn't mention it.
[OP] smcleod | 6 hours ago
pier25 | 6 hours ago
[OP] smcleod | 5 hours ago
arjie | 7 hours ago
[OP] smcleod | 5 hours ago
LuxBennu | 7 hours ago
keyle | 7 hours ago
They've got a good thing going, but they keep finding ways to alienate people.
compounding_it | 7 hours ago
saagarjha | 2 hours ago
pier25 | 7 hours ago
Anyway I will run the diagnostic commands and see what I get.
compounding_it | 6 hours ago
24 inch 1080p
24 inch 4k (2x scaling)
27 inch 1440p
27 inch 5k (2x scaling)
32 inch 6k (2x scaling)
Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.
Given that 4k is common at 27/32 inches and those are cheap displays, these kinds of problems are expected. In the past I personally refused to accept that 27 inch 4k is as bad as people say, got one myself, and regretted buying it. Get the correct size and scaling and your life will be peaceful.
I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.
jbellis | 6 hours ago
wmf | 6 hours ago
stefanfisk | 6 hours ago
danny8000 | 6 hours ago
[OP] smcleod | 6 hours ago
shiroiuma | 5 hours ago
I really like my 1440p monitors at home more than the 4K monitors at work. At work, I'm always dealing with scaling and font size issues, but at home everything looks perfect. So I think you're onto something here: 1440p just seems to be a better resolution on a 27" panel.
tern | 5 hours ago
extr | 5 hours ago
extr | 5 hours ago
[OP] smcleod | 5 hours ago
extr | 3 hours ago
mkl | 5 hours ago
polyterative | 2 hours ago
mkl | an hour ago
jiveturkey | 6 hours ago
If the theory about framebuffer pre-allocation strategy is to hold any water, I would think that 5k and 6k devices would suffer too, maybe even more. Given that you can attach 2x 5k monitors, the pre-allocation strategy as described would need to account for that.
[OP] smcleod | 6 hours ago
lovegrenoble | 6 hours ago
spoaceman7777 | 6 hours ago
Just another case of Apple intentionally going against established open standards to price gouge their users.
I wouldn't mind it as much if I didn't have to hear said users constantly moaning in ecstasy about just how much better "Apple's way" is.
High quality desktop Linux has been made real by KDE, and the AI-fueled FOSS development boom is accelerating the eclipse of proprietary nonsense like this.
If you're a developer, you should be using a system that isn't maintained by a company that intentionally stabs developers in the back at every turn. (Unless you're into that. U do u.)
chaostheory | 5 hours ago
jval43 | 5 hours ago
That one also wasn't a hardware limitation as it ran my displays just fine in bootcamp, but macOS would just produce fuzzy output all the way.
It's infuriating.
tgma | 5 hours ago
mono442 | 5 hours ago
LarsAlereon | 5 hours ago
So to oversimplify, Windows can have a problem where, if you are running 1.5x scaling so text is big enough, you can't fit 4K of native pixels on a 4K display, so videos are blurry. If instead you were rendering a scaled image to a 6K framebuffer and then downscaling to 4K, there would be minimal loss of resolution.
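A rough sketch of the arithmetic behind that oversimplification, for a 3840x2160 display at 1.5x UI scaling:

```python
# Two ways to handle 1.5x UI scaling on a 4K display.
native = (3840, 2160)
ui_scale = 1.5

# Direct approach: the logical resolution shrinks, so 4K video content
# can no longer map 1:1 onto native pixels and comes out blurry.
logical = (int(native[0] / ui_scale), int(native[1] / ui_scale))
print(logical)  # (2560, 1440)

# Oversized-framebuffer approach: render everything 1.5x larger, then
# downscale the whole buffer uniformly back to native. Video loses far
# less detail because the single downscale pass is uniform.
framebuffer = (int(native[0] * ui_scale), int(native[1] * ui_scale))
print(framebuffer)  # (5760, 3240), roughly the "6K" buffer mentioned
```

The cost, of course, is rendering and scanning out 2.25x as many pixels as the display actually has.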
tgma | 2 hours ago
That would be my instinct as well, but the author seems to be deliberately doing the exact opposite: trying to force 2x HiDPI and then downscaling to the native display resolution, whereas he could have just done 1:1 LoDPI rendering. What you get in the end is some equivalent of hacky, brute-force smoothing/antialiasing of what was rendered, applied in the downsample.
adrian_b | an hour ago
So 1:1 rendering can cover only part of the screen, while the rest remains unused.
If the maximum size limit is used but applied to the entire screen, it does not match the native resolution so interpolation is used to convert between images with different resolutions, blurring the on-screen image.
All the attempts were done with the hope that there is some way to convince the system to somehow use the greater native image size instead of the smaller size forced by the limits.
adrian_b | an hour ago
The correct solution was already in use more than 35 years ago. For text and for graphics, sizes must be specified only in length units, e.g. in typographic points or millimeters or inches, such as configuring a 12-point font for a document or for a UI element. Then the rasterizer for fonts and graphics renders everything correctly at a visual size that is independent of the display resolution, so it is completely irrelevant whether a display is HiDPI or not.
To combat the effect of rounding to an integer number of pixels, besides anti-aliasing methods, the TTF/OTF fonts have always included methods of hinting that can produce pixel-perfect characters at low screen resolutions, if that is desired (if the font designer does the tedious work required to implement this). Thus there never exists any reason for using scaling with fonts.
For things like icons, the right manner has unfortunately been less standardized, but it should have been equally easy to always have a vector variant of the icons that can be used at arbitrary display resolutions, supplemented by a set of pre-rendered bitmap versions of the icons, suitable for low screen resolutions.
I am always astonished by the frequent discussions about problems caused by "scaling" on HiDPI displays in other operating systems, because I have been using only HiDPI displays for more than a dozen years and I had no problems with them while using typefaces that are beautifully rendered at high resolution, because I use X11 with XFCE, where there is no scaling, I just set the true DPI value of the monitors and everything works fine.
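The length-unit approach described above is simple arithmetic: specify sizes in points (1/72 inch by typographic convention) and let the display's true DPI determine the pixel count. A minimal sketch:

```python
# Resolution-independent sizing: a 12 pt font is the same physical size
# on any display, because the pixel count scales with the true DPI.

def points_to_pixels(points, dpi):
    """Convert typographic points (1/72 inch) to pixels at a given DPI."""
    return points * dpi / 72

print(points_to_pixels(12, 96))   # 16.0 px on a typical LoDPI display
print(points_to_pixels(12, 218))  # ~36.3 px on a 218 PPI "retina" panel
```

This is essentially what setting the true DPI in X11 does: the rasterizer picks the pixel count per glyph, and no framebuffer scaling is ever needed.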
Someone | 5 hours ago
> Caps the HiDPI backing store to approximately 1.75x the native resolution (6720x3780 for 3840x2160 native), rather than the 2.0x needed for full HiDPI (7680x4320)
So, that could be an off-by-one bug? That might be testable by tweaking the system into thinking the display supports an even higher resolution.
Also, instead of messing with the Display Override Plist, patching drivers, etc, did they try using the “Advanced…” button in the “Displays” UI? They don’t mention they did.
For me (with a 27 inch 4K monitor not on M4 or M5) that replaces the 5-way choice by one with a list of 11 choices. With the then appearing “Show all resolutions” toggle, that becomes 18.
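The numbers in the quoted cap are easy to verify; a quick sketch (the 1.75x figure and the backing-store dimensions are taken from the article's claim):

```python
# Verify the quoted backing-store cap: 6720x3780 is exactly 1.75x the
# native 3840x2160, versus the 7680x4320 (2.0x) a full HiDPI mode needs.
native = (3840, 2160)
capped = (6720, 3780)
full_hidpi = (native[0] * 2, native[1] * 2)

scale_x = capped[0] / native[0]
scale_y = capped[1] / native[1]
print(scale_x, scale_y)  # 1.75 1.75
print(full_hidpi)        # (7680, 4320)
```

The fact that the cap is exactly 1.75x on both axes suggests a deliberate limit rather than an off-by-one, but testing with a spoofed higher resolution, as suggested, would settle it.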
[OP] smcleod | 4 hours ago
Someone | 26 minutes ago
tmsh | 5 hours ago
tonyedgecombe | an hour ago
ralferoo | 22 minutes ago
Assuming the article is correct and the hardware can do 7680x4320 @60, which requires 8GB/s of memory bandwidth, in theory it should be able to read the same memory and interleave every other line for the down-sampling. However, it's possible that the new memory controller can't support 2 simultaneous burst streams (because the 2 lines are 30KB apart in memory), or, if it's doing a single burst and buffering the first line until the second line is available, maybe the cache is smaller than 30KB.
Another possibility is that previously the scaler averaged pairs of pixels horizontally and cached them until the next line was available to average with, and for some reason it was changed to average all 4 at the same time, so the cache is no longer sufficient (although 25.25KB would be a fairly odd size to limit the cache to).
Alternatively, looking at the clock rates needed for the scaler: 3360x1890 @60 is 381MHz, while 3840x2160 @60 is 497MHz. It's quite possible that they've lowered the base clock on some hardware and not considered that it would limit the maximum resolution the scaler can handle.
But whatever the cause, IMHO it's unlikely to be a software bug with an easy fix.
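The clock and bandwidth figures above can be sanity-checked with a little arithmetic. A rough sketch; real display timings include blanking intervals, which these active-pixel numbers ignore:

```python
# Active-area pixel clock and scan-out bandwidth for the modes discussed.
# Blanking intervals are ignored, so real pixel clocks are somewhat higher.

def pixel_clock_mhz(width, height, hz):
    """Active-pixel clock in MHz."""
    return width * height * hz / 1e6

def bandwidth_gbs(width, height, hz, bytes_per_px=4):
    """Memory bandwidth in GB/s to scan out frames of this size."""
    return width * height * hz * bytes_per_px / 1e9

print(pixel_clock_mhz(3360, 1890, 60))  # ~381 MHz, the capped 1.75x mode
print(pixel_clock_mhz(3840, 2160, 60))  # ~498 MHz, full native 4K
print(bandwidth_gbs(7680, 4320, 60))    # ~8 GB/s for the 8K backing store
```

The ~30% gap between 381MHz and 498MHz is consistent with the lowered-base-clock theory: a clock budget sized for the capped mode would fall just short of driving the scaler at full 4K.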