zokier 4 days ago

The problem with all these font rendering discussions is that we are missing some ultra-high-quality (offline) reference rendering system to compare against. Not only does that make these discussions pretty unproductive (just subjective back and forth), but practically it also drives font designers to build fonts that look good on their preferred platform(s), rather than something that is built to look good on spec. This then drives a feedback loop where other platforms need to start emulating the major popular platforms, flaws and all, instead of aiming for the highest quality; for example, this stem-darkening is almost certainly just inspired by macOS but doesn't really have justification outside that.

  • winternewt 4 days ago

    On a related note, I thought that by 2024 we would have displays with 250+ dpi resolution, but to my disappointment I'm still waiting for this to be a reality on anything but small laptop screens. A lot of the rendering tricks that corrupt the appearance of fonts have to do with how few pixels are available for rasterisation. We should have been getting print-quality text by now.

    • musicale 3 days ago

      > I thought that by 2024 we would have displays with 250+ dpi resolution,

      The "retina" iMac 27" (5K display, ~218 PPI) came out in 2014.

      I've been using this resolution since 2016. 32" 8K seems like the next logical step but it's disappointingly expensive/unavailable.

    • craftkiller 4 days ago
      • winternewt 3 days ago

        Right, I have seen the Dell screen before. But it is currently an extremely niche product which means it is wildly outside my price range. And the VG3281 is still not available in my country even though it was released in 2023.

      • bartvk 3 days ago

        Is that ViewSonic monster actually for sale?

        • moe_sc 2 days ago

          > The monitor is currently listed on Chinese retailer Taobao for the equivalent of ~$2400 USD.

          Seems like it is

    • mproud 2 days ago

      The future is now.

      Apple and Microsoft both make high DPI displays over 260 ppi on their largest 15-inch+ notebooks and tablets.

      • t0bia_s 2 days ago

        As a graphic designer I'm able to work on 106-120 PPI displays. Anything higher means upscaling, which is problematic for UI design, especially on the web.

        • ttarr a day ago

          We can't hold back the world because of some lazy UI devs. I can't believe we're still facing scaling issues in 2024!

          My solution under Linux: I get laptop screens suited to 2x scaling, i.e. 2.8K, or 3.2K for larger laptops.

      • winternewt a day ago

        I'm well aware, but that hasn't translated to similar resolutions on desktops even though laptops have had them for almost a decade.

hyperhello 4 days ago

Why can’t I see the difference? Does this only work for older, standard resolution screens?

  • Cort3z 4 days ago

    Click the picture to get a full resolution version. On my phone it's quite obvious. But to my eyes it just looks like someone made it bolder, not less blurry.

    • KronisLV 4 days ago

      Off: https://blog.aktsbot.in/img/stem-darkening-off.png

      On: https://blog.aktsbot.in/img/stem-darkening-on.png

      Personally, I never had that many issues with fonts on any OS, except when I connected my M1 MacBook to a 1080p monitor; then it felt like the fonts had no anti-aliasing at all.

      • setopt 4 days ago

        That’s because MacOS got rid of subpixel antialiasing sometime after launching Retina screens, which makes non-hiDPI screens have quite awful font rendering.

        I sometimes switch to a bitmap font like Fixedsys Excelsior or GNU Unifont when using MacOS with a low-resolution monitor to compensate (with antialiasing off so the bitmap font looks crisp).

        Also, JetBrains Mono somehow looks good on lowres screens even though it’s not a bitmap font, it seems to not blur as much as other fonts when it gets antialiased.

        • zozbot234 4 days ago

          Subpixel antialiasing is going to be problematic anyway on newer displays that don't necessarily feature a plain RGBRGB (or similar) subpixel arrangement. For example, many OLED screens use RGBG/BGRG or even more complex "PenTile" subpixels.

          • delta_p_delta_x 4 days ago

            > Subpixel antialiasing is going to be problematic anyway on newer displays that don't necessarily feature a plain RGBRGB (or similar) subpixel arrangement.

            This will then mean making the subpixel anti-aliasing algorithm aware of different subpixel layouts. And this ought to be done anyway, because most anti-aliasing is usually at least somewhat hardware-aware. In my opinion, regardless of how subpixels are laid out, more resolution is always better.

          • necovek 3 days ago

            There was a quick option to change subpixel layout way-back-when in Linux (GNOME 2.x series), eg. a quick search gave me a screenshot at this page: https://askubuntu.com/questions/88528/how-to-switch-on-sub-p...

            Today it seems it's hidden as a dconf option:

              $ dconf read /org/gnome/desktop/interface/font-antialiasing
              'rgba'
            
            But this is an issue that applies to VA panels as well (cheaper than IPS, with worse viewing angles but better contrast ratio), and I have a 27" 4k VA screen that works just fine with it turned on in Linux: text is so much clearer with it on than off. Attaching a MacBook to a 27" or 32" 4k IPS screen makes me hate MacOS for killing subpixel rendering off.
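
            For completeness, the writable counterpart, if anyone wants to flip it from a terminal on a recent GNOME (the value names below are from memory, so treat this as a sketch):

              # subpixel ('rgba'), plain grayscale ('grayscale'), or off ('none')
              $ gsettings set org.gnome.desktop.interface font-antialiasing 'rgba'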

            As for "retina" resolutions, I've tried 24" at 4K as soon as it came out (with that Dell monitor that required 2 DP 1.1 connections for 60Hz IIRC), and turning subpixel rendering off made text and lines jagged — that was ~190 ppi at a normal viewing distance with vision corrected to better than 20/20 (which is what I usually have — can't really work without glasses anyway, and worse correction leaves me with headaches). For the record, 5k at 27" and 6k at 32" is roughly ~216 ppi, so not much better than ~190 ppi: subpixel rendering probably achieves 2x the increase in text clarity for those not sensitive to colour fringing (I am not).

            So, subpixel rendering is really not an issue on any display, but Apple will happily tell you what the limit of your vision is and upsell you on their monitors.

          • RealStickman_ 3 days ago

            Fontconfig on Linux has an option to set the subpixel layout, though currently only rgb, bgr, vrgb and vbgr are supported. Maybe this could be extended for OLED monitors.
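
            For reference, a minimal sketch of what that looks like in a user fontconfig file (this goes inside the usual <fontconfig> root; bgr is just an example value):

              <!-- ~/.config/fontconfig/fonts.conf: force a BGR subpixel layout -->
              <match target="font">
                  <edit name="rgba" mode="assign">
                      <const>bgr</const>
                  </edit>
              </match>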

          • lostmsu 4 days ago

            For this, Windows has had the ClearType Tuner since before PenTile existed.

        • NoGravitas 3 days ago

          Subpixel antialiasing is kind of overrated anyway. On screens that are low enough DPI to benefit from it, it can cause color fringing (especially for people with astigmatism) that is worse than the blur from grayscale antialiasing.

          • necovek 3 days ago

            I disagree: I am not susceptible to colour fringing, and I can tell whether subpixel rendering is on or off on 24" 4k screens (~190 ppi) at a regular or even longer viewing distance (~70cm/27"). I specifically got that display hoping I could turn subpixel rendering off.

            Haven't tried Apple's big "retina" screens, but considering they are ~215 ppi, I'm pretty confident a 10% increase in PPI wouldn't make the difference that subpixel rendering does. Laptop screens have higher resolution, but I haven't really paid attention to whether the M1 Air 13" or the 4K 14" X1 Carbon works for me without subpixel rendering (I prefer to be docked).

            Before anyone jumps on "you've got incredible vision": I wear either glasses or contacts, and with that my vision corrects to better than 20/20 — slightly lower correction induces headaches for me. Without glasses, I'd probably be happy with 640x480 on 32" so they are kind of a must. :)

          • setopt 3 days ago

            On medium-DPI screens, I find that subpixel antialiasing makes fonts significantly less blurry than grayscale antialiasing without causing obvious color fringing. On actual low-DPI screens, bitmap fonts are IMO the only really usable option. (YMMV, but I have mild astigmatism and use glasses.)

        • WesolyKubeczek 4 days ago

          The “sometime” happened in macOS Big Sur. Prior to that, in Mojave and Catalina, you could turn it back on by twiddling a hidden preference with the word “legacy” in it. It was somehow worse than what you got in High Sierra and earlier anyway.

        • jillesvangurp 4 days ago

          They haven't sold anything without HiDPI for quite some time now (a decade?). Making their software look good on obsolete screens is understandably not a priority for them. And if you are happy to plug in something that old, you are kind of signaling that you don't really care about what things look like anyway. So why bother to make that look good?

          • KronisLV 4 days ago

            > They haven't sold anything without HiDPI for quite some time now (a decade?). Making their software look good on obsolete screens is understandably not a priority for them. And if you are happy to plug in something that old, you are kind of signaling that you don't really care about what things look like anyway.

            My apologies for buying 1080p monitors that had no issues with either my Linux or my Windows computers, I guess. I can understand that they might not care about what I care about (supporting the hardware that I have, rather than going out of my way to buy a new monitor just because a new computer decided not to work well with it), and I'd argue that maybe that's even fine because it's their device and ecosystem, but jeez, that tone is super uncalled for.

            As an aside, I use the M1 MacBook at a scaled resolution of 1440x900 because anything finer is hard for me to see. That's a visible PPI of around ~130 because of the 13.3 inch screen. A 1080p monitor of 21.5 inch diagonal size would have a physical PPI of around ~100, so that's around 80% of the pixel density. That's not to say that the panel on the MacBook is not much nicer, but rather that with software anti-aliasing it could definitely be okay. Somehow I don't want to buy a new monitor just for the weekends when I visit the countryside.

          • baq 4 days ago

            Reality distortion field is strong with this one.

            I have a perfectly good normie-DPI 25x16 display which is extra crisp on Windows. On macOS I had to install BetterDisplay just to make it not miserably bad; it’s just plain bad now. As far as I can tell, Apple removed the feature because of greed and laziness.

          • setopt 4 days ago

            There are plenty of non-hiDPI screens from other vendors on the market, especially “large” screens that are “medium” in price. In an office you’re not always free to order a screen from any vendor you want (due to their framework agreements), unless of course you’re paying for that hardware privately.

            I care about how things look, and have spent more time than I want to admit configuring MacOS apps to look good on the screens available to me. I just don’t care enough to buy an expensive office screen with my own cash if my employer can’t provide one.

            • jillesvangurp 4 days ago

              I was talking about Apple. Apple stopped selling non-HiDPI screens some time last decade.

              • ffsm8 4 days ago

                So in a nutshell:

                Apple specifically wants you to not be able to use non-Apple displays, artificially worsening the experience for the user while strengthening the illusion that Apple's hardware looks better - even though the only reason it does is that Apple themselves made sure other displays look unnecessarily bad.

                It's hilarious there are people that actually think this is totally okay and not just plain anti-competitive, with just enough plausible deniability to get away with it.

                • lproven 3 days ago

                  Well, in a word, no.

                  In a few more words: not at all, not even slightly.

                  To explain briefly:

                  > Apple specifically wants you to not be able to use non-Apple displays

                  No. Apple does not make or sell or offer non-HiDPI displays and has not done so for over a decade. Apple mainly sells phones and laptops with built-in HiDPI screens. Desktop computers that use external screens are a small part of its range, and it sells its own very high-quality screens for those.

                  Because font antialiasing is pointless on a hiDPI screen, and it only offers hiDPI screens, it removed antialiasing from its OSes.

                  However, the kit does still support old screens and you are free to use them. The antialiasing feature is gone, but to my (not very strong) eyesight it doesn't matter and stuff looks fine.

                  > artificially worsening the experience for the user

                  No. This is paranoia.

                  > It's hilarious there are people that actually think this is totally okay

                  People think it's okay because your interpretation is paranoid.

                  > not just plain anti-competitive

                  How is REMOVING features anti-competitive? In what universe does taking something out of your products hurt your competition? That is absurd.

                  • ffsm8 2 days ago

                    > How is REMOVING features anti-competitive? In what universe does taking something out of your products hurt your competition? That is absurd.

                    You're unironically arguing that EEE isn't anti-competitive?

                    The whole strategy is about removing support/features at the right time, when users cannot realistically leave, putting the nail in the competitor's coffin.

                    Simply put:

                    1. Initial product supports both equally

                    2. People start using your product

                    3. Competitors' products work less well

                    4. People will use the better-working product, despite the fact that the downgrade in quality is artificial.

                    Or is it only anti-competitive if Microsoft does it, Apple being the last bastion of healthy competition on the market, with groundbreaking examples like the AppStore and the green/blue bubbles in their chat app?

          • gond 4 days ago

            > you don't really care about what things look like anyway.

            That statement has no connection to the premise.

            There are multiple reasons to use an old screen besides the mentioned reason of not caring for x.

      • pjerem 4 days ago

        macOS has always had the best font rendering on HiDPI screens and also the worst on low DPI.

        Windows is historically very good on low DPI, and they also managed to be great on HiDPI.

        Linux, well, it depends on so many things … you can achieve good results on both, but you’d better be OK with integer scaling and not have multiple displays with different DPIs.

        • cehrlich 4 days ago

          Macs used to have good font rendering on Low-DPI displays (I would say the best, but I suppose that’s a matter of opinion)

          Then Mojave switched to Metal and removed subpixel AA, and now it’s the worst.

          Thread from when it happened: https://news.ycombinator.com/item?id=17476873

          • kelnos 4 days ago

            It's funny to read that thread (from 6 years ago, wow, time flies) and see complaints that a lot of people have low-dpi external displays. But I think some of the rebuttal comments in that thread rang true even then, and certainly do now: if you are spending money on an Apple laptop or workstation, Apple is going to expect you to spend money on hi-dpi external monitors.

            • necovek 3 days ago

              I'll reiterate a comment I made elsewhere in this thread. With my vision corrected, 4k at 24" (~190ppi) needs subpixel rendering for crisp fonts. I would expect 5k at 27" or 6k at 32" (both around ~215ppi) would be the same, so my only option for comfortable external use with a Mac is really 8k at 32". I know that I am an outlier as I get my vision corrected with glasses/contacts to better than 20/20 (outside that 5%-95% group I guess), but I was perfectly well served by subpixel rendering (and continue to be).

              Luckily, Mac is only for work, and it is passable with 32" at 4k, but I can use Linux everywhere else for much nicer fonts. Unluckily, work is 8+h :)

              • shiroiushi 3 days ago

                Sounds like you need to get a new job where you can use Linux. (For reference, I'm typing this at work, on a Linux laptop.)

            • nasretdinov 4 days ago

              Yeah, the issue is that 6 years ago your only option for a highdpi monitor with the correct scale (e.g. for 27'' it needs to be 5k, not 4k) would be the iMac or the XDR display that costs over $5k...

              Now that Apple sells their own (very decent) monitor at somewhat more affordable price it makes sense to use it as an external display, I agree.

        • lproven 3 days ago

          > macOS has always had the best font rendering on HiDPI screens and also the worst on low DPI.

          No it hasn't.

          Maybe to you "always" means "since 2014" but if so that means you are very young and you should not generalise from that.

          I've been using Macs since 1988 and Mac OS X since 2001 and it used to be great on SD screens. I used to use Safari on Windows XP because its font rendering was so much better than the built-in Windows Truetype renderer.

          This change is new and recent.

          It is absolutely not "always".

        • orangeboats 4 days ago

          You can have displays with different DPI on Linux and achieve good font rendering. But it requires the latest versions of your favourite DE (like GNOME 45+ and KDE 6.x) and you'd need to give up X11 (which does not support mixed DPI very well).

          • michaelmrose 4 days ago

            X11 handles mixed DPI fine. The monitor configuration GUIs of DEs don't support the required option; however, both X and nvidia-settings do.

            • orangeboats 2 days ago

              X11 quite literally doesn't have the notion of DPI, there is only a single coordinate space. Xrandr is per-output and not per-window.

              • michaelmrose 2 days ago

                I'm literally sitting at a station with 3 monitors, 2 high-DPI and 1 low-DPI, where UI elements of a window moved from one monitor to another are identically sized; this was the case when all 3 monitors were different sizes and DPIs as well.

                In this case the UI is scaled up by an integer factor on all screens so as to look nice on the highest-DPI screen. The lower-DPI monitor is then scaled down from the higher resolution by a decimal factor that could be, but needn't be, an integer. If the factor is proportional to the difference in DPI, the result is UI elements being sized precisely the same across monitors of different sizes and DPIs.

                All monitors share a single scaling factor and DPI. Apps thus need to support high DPI but needn't do anything smart to support scaling, because it happens outside of the app's remit.

                This can be achieved, again, with xrandr --scale, or in the nvidia-settings GUI by setting ViewPortIn to a higher resolution than ViewPortOut. No, the result isn't blurry.
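
                Roughly like this (output names and modes are only examples; pick the factor to match your monitors):

                  # apps render at 2x everywhere (e.g. GDK_SCALE=2); the 1080p output is then
                  # told to cover a 3840x2160 region, which X downsamples to 1920x1080
                  $ xrandr --output DP-0 --mode 3840x2160 --pos 0x0 \
                           --output HDMI-0 --mode 1920x1080 --scale 2x2 --pos 3840x0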

                • orangeboats 2 days ago

                  First things first: the X11 protocol is not aware of DPI. Period. And that has implications. You can apply all sorts of hackish solutions on top of it, but (1) the solutions will most likely be out-of-band (not pure X11) and involve DBus/envvars/Xft.dpi, and (2) per-output.

                  Out-of-band solutions are effectively bandaids, and they unnecessarily increase the development difficulty of GUI programs on Linux, since developers now have to be aware of the various side channels (fragmentation!) that communicate the DPI scale.
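
                  Concretely, these are the kinds of side channels I mean (a sketch, not a recommendation):

                    # X resources: one global DPI hint, picked up by Xft-based toolkits
                    $ echo "Xft.dpi: 192" | xrdb -merge
                    # plus per-toolkit environment variables
                    $ export GDK_SCALE=2 GDK_DPI_SCALE=0.5
                    $ export QT_SCALE_FACTOR=2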

                  This is why SDL2 for the longest period did not support HiDPI on X11, but does on Wayland, macOS, and Windows. A COSMIC dev just recently made a complaint about XSettings too! [0] You can't just ignore those problems; I am sure you have heard the "Linux is hard to develop for, blah blah fragmentation blah blah" refrain.

                  Another thing: per-output HiDPI is fine when all your programs support high DPI, but it's unworkable if you want to mix LoDPI and HiDPI applications on a single screen, i.e. if an application has a better user experience when it is upscaled and blurry (!), you are SOL unless you want to apply scaling to your entire desktop.

                  You also lose the opportunity to implement some neat features like temporarily scaling up a window if it is being magnified or screenshot. (The idea's been floating about in the KDE community)

                  Finally, we can argue for days, but the HiDPI page on the Arch wiki already says a lot when a good 90% of the article is about achieving good DPI scaling on X11 [1]. Even the Wayland sections have an XWayland subsection in them...

                  [0]: https://tech.lgbt/@drakulix/113029499719670970

                  [1]: https://wiki.archlinux.org/title/HiDPI#Xorg

                  • michaelmrose 2 days ago

                    Scaling has been a built-in feature of X for over 20 years, intended for this use case: no bandaids of any sort, and no awareness by the applications of differing DPI.

                    From the perspective of apps there is only one DPI, with all scaling down handled by X at a layer below the app's level. There really aren't any LoDPI apps on a modern X system; only Wayland has issues scaling X apps correctly. Gtk apps not updated this decade can handle scaling up by integer factors, and because X scales the lower-DPI monitor down to the appropriate size, the app doesn't need to handle anything else.

                    It's very odd for folks to argue that my mixed-DPI system, using only basic boring old xorg.conf, somehow doesn't exist. I mean, would you like to come to my apartment and open a few different apps?

                    It's in a way ironic that, between pure X, pure Wayland, and Wayland + XWayland, only the last can't handle mixed DPI.

                    • orangeboats 2 days ago

                      > There really aren't any LoDPI apps on a modern X system; only Wayland has issues scaling X apps correctly.

                      The problem starts at XWayland, not Wayland. And XWayland is just your typical X server with all its X intricacies.

                      If only we could somehow communicate our intended scale to the X clients through XWayland... oh, we can't!

                      > It's very odd for folks to argue that my mixed-DPI system, using only basic boring old xorg.conf, somehow doesn't exist.

                      You keep confusing per-window and per-output DPI. I can only assume you are feigning ignorance at this point. Good luck with that attitude.

                      I still have a couple of GTK2 apps installed that don't support HiDPI; those are best used upscaled, but... they can't be. Those apps are unable to communicate to the server that they are rendered at 1x (rather, the server doesn't have the notion of per-window scales at all).

                      • michaelmrose a day ago

                        The desirable quantity is the result, not the mechanics: specifically, that one can simply open windows on any display and move them between monitors without worrying about scaling. Ideally an element ought to measure the same number of mm on each, should you pull out a ruler.

                        This is easily achievable on X11, and it's easily done with Wayland.

                        You mentioned not being able to do scaling per window. Well, only with XWayland do you ever need to do so. You need it to be smarter than it is, and so scaling on XWayland sucks in a way that isn't a factor in X or Wayland.

                        Since Wayland is dependent on XWayland for unsupported apps, and mixed DPI + XWayland sucks, effectively only Wayland sucks at mixed DPI.

                        • orangeboats a day ago

                          Sigh.

                          > The desirable quantity is the result, not the mechanics: specifically, that one can simply open windows on any display and move them between monitors without worrying about scaling.

                          Then next time don't bring up compatibility when you talk about the benefits of staying on X11. :) You are breaking the legacy programs anyway by forcing LoDPI programs to remain un-upscaled and therefore unusable on HiDPI displays.

                          > Well, only with XWayland do you ever need to do so.

                          I am sorry, but the problem I mentioned exists as well on a pure X11 setup.

                          And the "solution" for this problem, on this setup, is to scale up the entire desktop (blurriness everywhere) which is decidedly worse than XWayland where only a single window is affected, assuming your other HiDPI-aware programs run natively on Wayland (which, from my experience, is pretty much true -- after all, HiDPI programs most likely are still actively developed).

                          The only other viable choice on a pure-X11 setup is a non-solution: just live with the tiny UI whenever you are using HiDPI-unaware programs.

                          Either you bring up documentation showing people that X11 has something akin to Wayland's buffer_scale event, or consider this discussion finished. It is tiring to talk to a brick wall.

                          • michaelmrose a day ago

                            >The only other viable choice on a pure-X11 setup is a non-solution: just live with the tiny UI whenever you are using HiDPI-unaware programs.

                            What programs? Something from 2003? There are a plethora of X-only apps, but a microscopic number at this point that aren't suitable for HiDPI.

                            > blurriness everywhere

                            Nope, I'm sorry, I have a machine on my desk which disagrees.

        • lostmsu 4 days ago

          Does it have anything to do with DPI? I thought the state of things was basically: Windows rules; macOS and Linux suck at non-integer scaling ratios. For integer scaling nobody has problems, AFAIK.

        • michaelmrose 4 days ago

          X has supported scaling forever. Integer-scale everything up, then scale down the lower-DPI monitor or monitors.

          xrandr --scale, xorg.conf, or the nvidia-settings GUI (and save to an xorg config).

      • rubslopes 4 days ago

        BetterDisplay's HiDPI feature (available in the free version) greatly improves font rendering on external displays: https://github.com/waydabber/BetterDisplay

        This helped me a lot, I was about to ditch my external monitor because of blurriness.

    • hedora 3 days ago

      Yeah; something seems wrong with their setup.

      I’ve seen Linux render fonts like that, but not recently. I’d probably look into debugging my setup if that’s what my desktop looked like.

  • __mharrison__ 4 days ago

    I could only see the difference when I viewed the images at 250x.

  • konart 4 days ago

    Look at the 'e' for example. Night and day difference.

    PS: talking about screenshots in the article.

bad_username 4 days ago

Am I the last person on the Earth to turn off font smoothing completely and use fonts perfectly hinted to the pixel grid? Can't get any sharper than that, and really helps with eye strain.

  • nielssp 4 days ago

    No, at least that's what I'm going for as well, at least for monospace fonts in editors and such. I can't stand the slightly blurry look of antialiased fonts and absolutely prefer the crisp edges of traditional bitmap fonts.

  • NoGravitas 3 days ago

    I do this only for specific fonts at specific sizes.

  • globular-toast 4 days ago

    Do you have a working config for this on Linux?

    • arexxbifs 4 days ago

      Try this:

        $ cat $HOME/.config/fontconfig/fonts.conf
        <?xml version="1.0"?>
        <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
        <fontconfig>
            <match target="font">
                <edit name="rgba" mode="assign">
                    <const>none</const>
                </edit>
                <test qual="any" name="size" compare="more">
                    <double>1</double>
                </test>
                <test qual="any" name="size" compare="less">
                    <double>22</double>
                </test>
                <edit name="antialias" mode="assign">
                    <bool>false</bool>
                </edit>
                <edit name="hintstyle" mode="assign">
                    <const>hintfull</const>
                </edit>
            </match>
        </fontconfig>
    • bad_username 4 days ago

      This is a regular control panel setting in Linux Mint (Cinnamon). Not sure about other distros.

elwebmaster 4 days ago

Second screenshot looks blurrier to me than the first. Both are bad. It’s not a Mac.

  • Lorkki 4 days ago

    Apple have always preferred preserving letter forms over hinting, so Macs are therefore also "blurry" on lower DPI displays. The reason they aren't these days is because the DPI is higher.

    Usually when people complain about this, the comparison is to Windows, which prefers strong hinting, i.e. snapping the font shapes to within (sub)pixel boundaries.

    There also used to be patent encumbrance issues around subpixel rendering, making Linux font rendering on TFTs overall worse by default, but as of 2019 those have expired. Some distributions had already enabled those features anyway.

    • zozbot234 4 days ago

      With any reasonable resolution screen (1080p or more), you can have both good preservation of letter forms and a non-blurry image, simply by applying a sharpening filter after antialiasing. Some image downsampling methods, such as Lanczos filtering, effectively do this for you. The tradeoff is one can detect some very fine 'ringing' artifacts around the sharpened transitions, but these don't impact readability in any way.
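
      A rough way to see the effect with ImageMagick, assuming a screenshot of text rendered at 2x (file names are placeholders):

        # Lanczos downsampling slightly sharpens edge transitions as it scales down;
        # compare with a softer filter such as Triangle to see the difference
        $ convert text-2x.png -filter Lanczos -resize 50% text-lanczos.png
        $ convert text-2x.png -filter Triangle -resize 50% text-triangle.png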

  • MindSpunk 4 days ago

    Mac font rendering on non-Retina displays is pretty awful. The Mac 'cheats' somewhat with HiDPI screens + aggressive supersampling, as macOS doesn't do fractional scaling and instead rounds up to the nearest integer multiple. At my current display scaling settings my MBP is using a 5120x2880 buffer for my 4K monitor (3840x2160), as it's set to scale to a logical size of 2560x1440.

    In a fair comparison on a 1080p display with no scaling, even Windows demolishes macOS these days. Apple dropped support for subpixel AA years ago, which really hurts standard-DPI rendering.

    • funcDropShadow 4 days ago

      This idea of scaling graphics up and down, instead of rendering at the display's native resolution with size measurements in DPI-independent units, was so bad. I understood it to be a neat trick when the first Retina displays were released. But everything afterwards is just awful.

      • nasretdinov 4 days ago

        With native fractional scaling you can get objects misaligned, and I personally found it very annoying, e.g. in VSCode with 1.25x zoom, where objects would move slightly when you interacted with them due to imperfect and inconsistent size calculations.

        IMO the way Apple does this is quite brute-force and unconventional, but at least the picture doesn't drift at different scales.

  • ahoka 4 days ago
    • omnimus 4 days ago

      When I look at the first picture, the Mac is blurrier, but at the same time Windows doesn't keep the balance of the font. Look at all the horizontal vs vertical strokes in ZLF: they are surely drawn to be the same optical thickness, but on Windows they are very wrong.

      I don't think Windows is such a clear winner here. Seems like different philosophies.

      • lostmsu 4 days ago

        Just hold the phone at some distance and you will see how bad macOS's one is in reality.

      • ahoka 3 days ago

        You don’t see the obvious artifacts?

    • ack_complete 2 days ago

      Forget the font rendering, it looks like a poorly tuned scaling filter with excessive ringing is being applied in the macOS case. That halo around the letters is the result of a sharpening or bicubic+ resampling filter that's turned up too high.

makz 4 days ago

I kind of prefer the fonts in the first screenshot

  • larschdk 4 days ago

    Both use subpixel rendering; the thing is that it is specific to the display where it is shown. You need the same subpixel ordering, and you can't really scale it. Your preference may be correct on your display, while mine is correct here: I strongly prefer the second image.

    The first image has less hinting, causing the top of the x-height to be rendered at a half pixel. This makes it seem less intense than the rest of the letters. The second image aligns way better and has a more consistent intensity and sharp edges, but gives an overall slightly bolder appearance, and also compromises a tiny bit on the actual shape of the letters.

  • otabdeveloper4 4 days ago

    This, I like the blurry fonts. They should fix kerning bugs for certain popular fonts instead.

  • qwertox 4 days ago

    I agree. The second one looks like it has `font-weight` increased, making it all somewhat bolder, reducing the distance to the bold font.

  • muppetman 4 days ago

    Same I wonder if they're mixed up

mjevans 4 days ago

On the one hand... OK great for consumption.

On the other... Wouldn't this lead to terrible choices in font selection for anyone NOT using these settings?

  • bongobingo1 4 days ago

    The "on" image looks closer to how I see fonts on macOS, so probably optimising that way has a bigger win. I don't use the system often though.

Flex247A 4 days ago

Can someone comment on the effect lcdfilter has?

Seems like full hinting + RGB antialiasing is the way to go on non 4K displays.

anothername12 4 days ago

If I have a 4K display, do I still need to worry about font rendering settings?

  • vardump 4 days ago

    If it's small like 15", probably not. If it's 32" or more, probably yes.

  • Flex247A 4 days ago

    Quite likely no, as even grayscale antialiasing is going to look amazing in 4K.

    macOS has grayscale antialiasing on by default.

konart 4 days ago

What kind of sadist sets `font-size: 10.5pt;`?

Sorry about that rant.

  • matejn 4 days ago

    I wish I could set my Visual Studio coding font to 10.5pt. Now I have to use 10pt and 105% zoom!

    But then again, VS doesn't allow font size in px ...

    • konart 4 days ago

      10(.5) looks tiny though. I can place 6 copies of the article on my monitor, lol.

gjvc 4 days ago

This works very well on my 4k screen.

Every discussion of font rendering technology must include a statement to the effect of "Acorn RISC OS fonts from 1990 have not been bettered". :-)

peter_retief 4 days ago

I can't see the difference?

  • peter_retief 4 days ago

    I just opened a file in VIM and it does look denser/clearer. Cool.

zhenyi 4 days ago

Video showing the before/after: https://streamable.com/904w6l

  • planb 4 days ago

    They are both blurry. One is just a bit bolder. Nothing beats HiDPI displays for crisp font displaying imho.

    • necovek 3 days ago

      HiDPI displays combined with subpixel rendering for the total win. Unfortunately, that's dead on macOS today, and for a time it was unachievable with Wayland on Linux.

pjmlp 4 days ago

Kind of, as usual, given the desktop fragmentation.

bamboozled 3 days ago

The screenshots look the same to me?

greenavocado 4 days ago

Both are garbage if you are on a low DPI display

  • dartharva 4 days ago

    Font rendering on linux tends to be garbage even if you have a high DPI display

    • omnimus 4 days ago

      I have both side by side and I think this is more of a myth. They render essentially the same on HiDPI, except at very small sizes where the engines differ.

      On Linux the font selection is terrible, and people have problems with Wayland/X11 rendering and other settings (often opinionated defaults from distros). But when you are lucky :)) you can get pretty much the same HiDPI font rendering.

      • NoGravitas 3 days ago

        FreeType at least has knobs that are both configurable and understandable. On Windows, you only have the ClearType Tuner, and it can be hard to get what you want.

    • hollerith 4 days ago

      I used to prefer font rendering on Gnome to that of macOS and Windows, but as of Gnome 46 they changed it so that if you set a fractional scale factor, everything gets blurry (a different kind of blurry than the OP means), just like it does on macOS. That makes Windows font rendering the desktop font rendering I like best, as long as I stick to relatively modern apps like browsers, VSCode and the newer apps that come with Windows, like Settings. (Legacy apps on Windows are super, super blurry at fractional scale factors.)

      I use non-HiDPI 1920-pixel-wide displays at scale factors between 1.25 and 2.0. (Yes, I like big type and big UI elements: my eyesight is bad.)

      • omnimus 4 days ago

        Sounds like that might have something to do with Wayland?

        • mmcnl 4 days ago

          No, Linux (Gnome/Wayland) uses a method similar to macOS where everything is rendered at a higher resolution and then scaled down. If you have a very high DPI display it looks nice, but if the DPI is not that high to begin with you will notice that the result is a bit blurry. Only Windows has truly good scaling.

          • hollerith 3 days ago

            I agree. The blurriness at fractional scaling factors on Mac and in very recent versions of Gnome is obvious on a non-HiDPI display (at least a 24-inch one like the ones I use).

            Until a year ago, Gnome/Wayland did it the way Windows does it! I.e. the method the OS used to make the text and the other UI elements the size the user specified refrained from applying any blurriness-causing resolution-scaling algorithm as long as you avoided the XWayland compatibility layer (and the compatibility layer that lets modern Windows support legacy apps makes the legacy app blurry, too).

            Chrome's "zoom" feature (activated by the keyboard shortcuts Ctrl+plus and Ctrl+minus for example) does fractional scaling (i.e., allows the user to adjust the size of the text and the other UI elements) without introducing any blurriness. Installing Chrome and experimenting with that feature is probably the easiest way for the average reader to see what we are talking about.

            One parenthetical detail is that (unlike on Mac or Windows) in order to get fractional scaling at all on Gnome/Wayland you have had to use the command line to add a particular key to a list named 'experimental-features'. That will change next month when Gnome version 47 is released, at which point the user will be able to change the scaling factor in the Settings app, just like one can on Mac or Windows, without first having to configure anything or opt in to anything.
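
            For the curious, the incantation in question is (assuming I recall the key name correctly):

              $ gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"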

            I would love to know the reasoning behind the Gnome team's decision here, because even if you or I might not be able to notice the blurriness on a HiDPI display, or to say with confidence that the blurriness is there, that doesn't mean that the image is as sharp as it could be: the scaling algorithm used on Mac and on Gnome versions 46 and 47 is clearly throwing some "visual information" away regardless of the resolution of the display.

            • shiroiushi 3 days ago

              >I would love to know the reasoning behind the Gnome team's decision here

              Obviously, as with many things, the Gnome team is just aping MacOS, poorly.

              I wonder how KDE handles it.

    • mmcnl 4 days ago

      At least Windows and Linux try to render something usable. macOS simply removes all font anti-aliasing and expects you to use a high-DPI display.

Apreche 4 days ago

It's an improvement, but it's still absolute garbage. Just throw in the towel already. Copy whatever MacOS does and have it be the default.

  • p1necone 4 days ago

    Isn't "what MacOS does" just shipping extremely high DPI screens on everything + using regular anti-aliasing on fonts with no hinting or subpixel rendering?

    • konart 4 days ago

      I have Fedora Kinoite and an M1 Pro MBP connected to the same 4K Dell monitor. Font rendering is still much better on macOS than with any settings in Fedora.

      • MindSpunk 4 days ago

        Probably because macOS doesn't do fractional scaling. If you're not using a scaling mode that's an integer multiple of your monitor resolution, macOS oversamples the whole display. Kinoite is KDE IIRC, which can do fractional scaling (quite well, too) on KDE 6. So if you're comparing 1.5x scaling on KDE to macOS, you're actually comparing a 3840x2160 render to 5120x2880.

        • konart 4 days ago

          macOS was doing fractional scaling long before KDE even began to implement it.

          UPD: just to be clear - even on the MacBook's own screen, macOS uses a scaled resolution by default.

          • p1necone 4 days ago

            macOS always renders at an integer multiple of its internal "logical" resolution, and then just shrinks the framebuffer down at the end to whatever the native resolution of the target monitor is.

            Fractional scaling is rendering directly to a framebuffer at the target monitor's native resolution, and scaling your UI fractionally to look right at the target screen size.

            Apple's approach is more resource-intensive, but it probably makes UI layout implementation a lot simpler and more consistent + potentially looks better than rendering at "real" native res for a lot of content.

        • chupasaurus 4 days ago

          KDE got fractional rendering of fonts and UI back in Plasma 4.

          • MindSpunk 4 days ago

            I think you're right, I might've been confusing it with proper support for mixed DPI desktops (i.e. one display at 1.5, and another at 1x). I think Plasma 5 could do it but I had a lot of problems until Plasma 6.

            • chupasaurus 4 days ago

              Fractional scaling was introduced in a late Plasma 5 update and wasn't touched in 6 at all; the per-monitor setting is Wayland-only for an obvious reason.

        • silon42 4 days ago

          Why would anyone do fractional scaling... why not just natively render at the proper resolution. (barring old broken apps).

          • MindSpunk 4 days ago

            Given a 4K monitor you have 3 options:
            - No scaling: produces tiny text and UI elements
            - 2x scaling: a logically 1080p display, leading to very large text and UI elements
            - 1.5x scaling: logically a 1440p display

            4K at 1x produces UI that's too small, and 2x scaling is too large for a 27-inch monitor. 1.5x sizes everything like a 1440p display, but you still get the higher-resolution rendering of 4K.

            Fractional scaling _is_ rendering at the 'proper' resolution of the display. It can be challenging to work around some issues, like how to deal with window sizes that scale to a fractional native buffer (i.e. does a 501-logical-pixel-wide image become 751.5 physical pixels?). Apple says 'no' to native fractional scaling, and so does GNOME unless that's changed recently.

            • p1necone 4 days ago

              > so does GNOME unless that's changed recently.

              Which imo was a bad choice. It works pretty well on MacOS because Apple only ships machines with a fairly constrained, known set of resolution/physical dimension combos. And they sell their own monitors for stuff like the mac mini where that isn't the case. This means they can design all their UI elements to be the "right" size without needing fractional scaling for a very large percentage of their users.

              Gnome is running on all sorts of screens, with all sorts of resolution/dimension combos and on a lot of them you have to choose between "way too big" and "way too small".

            • kaba0 4 days ago

              I believe GNOME just hides the option for fractional values in their native settings menu, but it can be easily accessed through something like GNOME tweaks and similar.

          • necovek 3 days ago

            It's exactly for "old broken apps".

            What I generally do on my Gnome systems running 4k screens is to use 125% or 150% scaling, and then set larger default fonts for the system. Fractional scaling helps keep old apps legible even if imperfect, but updated apps which respect system font size render great (Gtk+ widgets auto-scale to fit the text).

            Unfortunately, this approach isn't going to work as well with Wayland.

          • konart 4 days ago

            Working on a 4K+ monitor without scaling is difficult (at least if we are talking about text).

            • necovek 3 days ago

              Gnome has another set of hidden settings to set default system font sizes: for UI elements and the "document" font. Some non-GNOME apps used to respect that too (like Firefox and LibreOffice), but others were not scaled.
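
              (They're plain gsettings keys, something like the following; the font names are just examples:)

                $ gsettings set org.gnome.desktop.interface font-name 'Cantarell 12'
                $ gsettings set org.gnome.desktop.interface document-font-name 'Cantarell 12'
                $ gsettings set org.gnome.desktop.interface monospace-font-name 'Source Code Pro 11'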

              Still, if you mostly relied on Gtk+ apps, where Gtk+ widgets scale to fit the text they are showing, this produced pretty good results even if some spacing was a bit inconsistent (i.e. spacing was too small compared to widget and text sizes).

              Unfortunately, this approach seems to work less and less well as Gnome devs are following Apple's lead.

        • mmcnl 4 days ago

          KDE works the same.

    • shiroiushi 4 days ago

      That sounds exactly like what I do in Linux.

    • pjerem 4 days ago

      It’s probably what macOS does, but they manage to do it cleanly whatever display scaling you choose, including arbitrary scaling values that are unsupported by the OS (but that you can force with some software).

      On Linux, good luck using anything other than integer scaling. And it’s a shame, because with a 4K screen and fractional scaling you can get both more definition AND more real estate.

  • adgjlsfhk1 4 days ago

    the macos approach is to only ship weird and nonstandard monitor resolutions to convince people to buy your fancy display that costs 5x as much

  • Apreche 3 days ago

    In the replies people are talking about current issues regarding scaling and high dpi displays.

    That has little to do with it. Apple has had vastly superior font rendering since the day OS X launched, and has been in first place ever since. There's no point in my memory of the past 20+ years where this was not the case, regardless of display technology.

    Even though other systems implement a lot of the same techniques of sub-pixel rendering, fractional scaling, etc., the results speak for themselves. With the same monitors, the same display cables, the same fonts, and the same applications, text rendered by Apple operating systems is more crisp, clean, readable, and pleasing to the eye. Configuring settings on Linux improves the situation, but no configuration gets close to the quality Apple delivers by default. Apple is also superior to Windows in this regard, but the margin is much smaller.

    This is coming from a person who maintains a long list of MacOS gripes because of all the things they do wrong. Font rendering is one of the few things they have consistently done better than everyone else.

  • phito 4 days ago

    Am I the only one who absolutely doesn't care about this? Both are just fine to me.

  • abhinavk 4 days ago

    macOS uses grayscale anti-aliasing exclusively. But what makes it work that well is the HiDPI (200+ dpi) displays Apple ships.

    Use a 32-inch 4K display and it will have blurry/thick fonts too.

    • onli 4 days ago

      Absolutely. On my external display macos fonts looked horrible, way worse than on Linux.

    • doublepg23 4 days ago

      I might be blind, but I'm using 2x 4K 32" displays on macOS and the fonts still look like paper to me. It was funny, because every Apple commentator I had listened to made it seem like my retinas would melt out of sheer disgust if I didn't use a 5K display.

      It looks night-and-day better than the 2x 4K 27" setup I use with my Windows 11 work laptop, even with ClearType. (That was a letdown: I replaced 2x 1080p 21" displays and expected the upgrade to fix the woeful fonts on Windows.)

  • ben-schaaf 4 days ago

    Font rendering has been absolute garbage on macOS since they removed subpixel anti-aliasing. Everything is a blurry mess unless you have a high-DPI display (and even then it's pointlessly worse).

    • eviks 4 days ago

      How is it worse with a high dpi display?

      • lostmsu 4 days ago

        HiDPI is not magic. If you look closer, you will see artifacts which you won't see on OSes that care.

  • globular-toast 4 days ago

    Mac looks shit. Always has done. Turns out you can get used to anything. My standard is paper. The only way to get close to that is high DPI screens. Guess what Macs come with now? My Linux PC also has a high DPI display. Only difference is you have the choice to spend less and still get decent looking fonts if you don't use Mac.

  • ahartmetz 4 days ago

    Yeah, no. Mac font rendering is blurry; Windows font rendering is pixelated, or okay with the right ClearType settings; FreeType with slight hinting and subpixel rendering (lcdfilter) is between these two, and really fine IMO.
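
    For anyone who wants to try that combination, a sketch of those knobs in a user fonts.conf (inside the usual <fontconfig> root):

      <match target="font">
          <edit name="antialias" mode="assign"><bool>true</bool></edit>
          <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
          <edit name="rgba" mode="assign"><const>rgb</const></edit>
          <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
      </match>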

7e 4 days ago

Linux fonts have never been beautiful, and probably never will be. The talent is just not there. Give me a high DPI Mac any day. Now that is beauty.

  • mmcnl 4 days ago

    macOS doesn't do anything with fonts. They removed all the anti-aliasing features. The Apple solution is to use Retina displays for everything. That works great with Linux too.

    • kalleboo 3 days ago

      macOS absolutely does antialiasing, it's just grayscale antialiasing instead of sub-pixel antialiasing. This is obvious if you mess with the CSS "-webkit-font-smoothing: none".

  • omnimus 4 days ago

    I am not sure if you are talking about the fonts included with the OS or about rendering. But on a high-DPI screen all OSes render fonts well (and pretty much the same). It's how the OS renders small type on low-DPI screens where lots of guessing by the rendering engine has to happen.

    As for font choices, sure, the Mac has a better selection than the open-source typefaces… but you know you can take your San Francisco and install it on Linux if you want.