• 0 Posts
  • 52 Comments
Joined 1 year ago
Cake day: June 17th, 2023


  • Last time I asked around about this question, the answer was surprisingly “probably not much”! When a low-power x86 chip (like those mobile chips) is idling (which is pretty much all the time if all you are doing is hosting a server on it) it consumes very little power, about the same level as an idling Pi. It is when the frequency ramps up that performance-per-watt gets noticeably worse on x86.

    Edit: My personal test showed that my x86 laptop drew slightly more idle power than my Pi 3 (roughly 2 watts higher), but that laptop is oooooooold.





  • orangeboats@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 2 months ago

    Servo was, in some ways, an experimental ground for Mozilla (for example, testing out a new CSS engine and porting it back to Gecko if it worked). So it’s quite normal for people to be unaware of it; it was never really meant for the public.

    But later on it was abandoned by Mozilla and stuck in limbo, until it got picked up by the Linux Foundation. Now it’s a standalone project and I wish them well. We really need a new FOSS web engine.


  • One of the issues at hand is that X11, the predecessor of Wayland, has no standardized way to tell applications what scale they should use. Applications on X11 get their scale from environment variables (completely bypassing X11), or from Xft.dpi, or from in-application settings, or they guess it through unorthodox means, or they simply don’t scale at all. It’s a huge mess overall; the sketch at the end of this comment gives a rough idea of that guessing game.

    It is one of the more-or-less fundamentally unfixable parts of the protocol, since it wants everything to live in the same coordinate space (i.e. 1 pixel is 1 pixel everywhere), which is quite unsuitable for modern systems.

    Wayland does work the way you describe, and applications supporting Wayland will scale properly in HiDPI environments.

    However, a lot of people and applications are still on X11 for various reasons.
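
    Just to illustrate that guessing game, here is a rough sketch of the kind of fallback chain an X11 app might use. The ordering and defaults are made up; GDK_SCALE, QT_SCALE_FACTOR and Xft.dpi are real knobs, but every toolkit does its own thing.

    ```python
    import os
    import subprocess

    def guess_x11_scale() -> float:
        """Rough illustration only: one way an X11 app *might* guess its UI scale."""
        # 1. Toolkit-specific environment variables (bypassing X11 entirely).
        for var in ("GDK_SCALE", "QT_SCALE_FACTOR"):
            value = os.environ.get(var)
            if value:
                return float(value)
        # 2. Xft.dpi from the X resource database; 96 dpi corresponds to 1x.
        try:
            xrdb = subprocess.run(["xrdb", "-query"], capture_output=True, text=True)
            for line in xrdb.stdout.splitlines():
                if line.startswith("Xft.dpi:"):
                    return float(line.split(":", 1)[1]) / 96.0
        except FileNotFoundError:
            pass
        # 3. Otherwise: in-app settings, heuristics... or simply don't scale.
        return 1.0

    print(guess_x11_scale())
    ```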



  • Yeah, I get the display server part. What I meant was that 200% scaling on a 4K (3840x2160) panel gets you a 1920x1080 logical resolution for HiDPI applications. LoDPI applications stay blurry, just as if you had set your actual resolution to 1080p, but HiDPI applications get the full visual acuity of the panel (rough numbers at the end of this comment).

    Even on smaller screens like 14" ones, the benefit of a very high resolution (e.g. 4K) is still quite visible IMO, especially when it comes to text rendering. But it could very well just be my eyes.
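
    To put rough numbers on it (assuming a 3840x2160 panel purely for illustration, which is not what every 14" laptop ships with):

    ```python
    # Logical size seen by applications = physical pixels / scale factor.
    physical = (3840, 2160)  # assumed 4K panel, purely for illustration
    scale = 2.0              # 200% scaling

    logical = tuple(int(p / scale) for p in physical)
    print(logical)  # (1920, 1080): what HiDPI-aware apps lay their UI out against

    # A LoDPI app instead renders a 1920x1080 buffer that gets stretched back up
    # to 3840x2160 on screen, which is where the blur comes from.
    ```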




  • Agreed. HiDPI is the way to go, and we should appreciate Framework for putting that in their laptops instead of sticking with shitty 1366x768 screens.

    Xorg is the reason OP is facing the scaling issues. OP, try forcing the apps to run on native Wayland if they support it but don’t default to it. The Wayland page on the Arch wiki has instructions on that. It immensely improved my HiDPI experience.
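
    For example (just a sketch; normally you would export these in your shell or desktop entry, and the exact per-app variables are on that Arch wiki page):

    ```python
    import os
    import subprocess

    # Common switches for forcing the native Wayland backend.
    # Firefox is only an example here; swap in whatever app you are launching.
    env = dict(
        os.environ,
        GDK_BACKEND="wayland",      # GTK applications
        QT_QPA_PLATFORM="wayland",  # Qt applications
        MOZ_ENABLE_WAYLAND="1",     # Firefox specifically
    )

    subprocess.run(["firefox"], env=env)
    ```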



  • orangeboats@lemmy.world to Linux Gaming@lemmy.world · Just Switch Over · 3 months ago

    The “quit having fun” meme is ironically becoming as cringey as the thing it was originally complaining about.

    You will help the community more by telling non-Linux people why Linux gaming is better, and this meme does the exact opposite: “oh, Linux can’t play some games, yada yada. But we are still better! Switch over!” Like, what’s the logic there?

    What’s the purpose of this meme other than circlejerking?

    Disclaimer: I am a Linux user myself; I started with Debian and am now using Arch Linux.

    I will share some advantages I have experienced with Linux gaming:

    1. Alt-tabbing out of old fullscreen games doesn’t mess with my monitor.

    2. Wine’s compatibility with some older games is wild. SimCity 4 actually crashed less when I played it on Linux.

    3. Better performance across the board. Granted, it’s a mere 5% difference, but I’ll take it, why not.