  • You vastly misunderstand both what I am talking about and how updates work on Windows and Linux.

    On Linux you don’t press shut down and then get a blue updating screen that stops you from doing anything. Go and update a Linux system and you will see what I am talking about: you run the update just like any normal command or program (sketch below).

    Also yes, the package manager updates the files on the drive while the system is running.
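
    To make that concrete, here is a minimal sketch of what an update actually looks like, assuming a Debian-based distro (Void Linux shown too, since it comes up elsewhere in this thread):

        # Debian/Ubuntu: refresh package lists, then upgrade installed packages in place
        sudo apt update && sudo apt upgrade
        # Void Linux equivalent
        sudo xbps-install -Su

    Both run in an ordinary terminal while the rest of the system keeps working.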



  • People argue that systemd is too much like Windows NT. I argue that Windows NT has at least a few good ideas in it. And if one of those ideas solves a problem that Linux has, Linux should use that idea.

    It’s actually closer to how macOS’s init system launchd works anyway, not to anything in Windows (see the sketch below). And macOS is arguably closer to true Unix than Linux is, so I don’t think the Unix argument is a good one to use here.
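
    For illustration, a minimal systemd service unit (entirely hypothetical; the service name is made up). launchd expresses the same idea with XML plists:

        [Unit]
        Description=Example daemon

        [Service]
        ExecStart=/usr/bin/example-daemon
        Restart=on-failure

        [Install]
        WantedBy=multi-user.target

    Declarative service definitions under a supervising manager is the design the two share, as opposed to traditional shell-script init.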



  • This isn’t actually true. They offer both glibc and musl builds these days. Glibc is the libc most Linux distros use; musl doesn’t work with some things, but is still desirable to some people for various reasons. Flatpak can be used to work around this, since it pulls in whatever libc the program needs. Distrobox would also work (rough commands below). Though again, this only applies if you are using the musl version.

    Another potential sore point is not using systemd as the init. Some things are dependent on systemd, though generally there are packages that act as a replacement for whatever systemd functionality is needed.

    I still have no idea what’s wrong with Void’s fonts though. You are on your own there!
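
    A rough sketch of the workarounds mentioned above (the app name is a placeholder, and this assumes the Flathub remote is configured):

        # Flatpak: the runtime ships its own glibc, independent of the host libc
        flatpak install flathub org.example.App
        # Distrobox: run a glibc distro in a container on top of a musl host
        distrobox create --name glibc-box --image debian:stable
        distrobox enter glibc-box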



  • It’s not just about Retina displays. High-res and HDR displays aren’t uncommon anymore; pretty much every new TV anybody would want to buy is 4K. And it has to support Apple’s own 5K display anyway, because that’s one of their products.

    As we’ve discussed, two external displays are supported on the new MacBook base models. It was a bit of an oversight on the original, sure, but that’s been fixed now.

    Also, the same SoCs are used in iPads, so it’s not Mac-only. I can’t imagine wanting three displays on an iPad.


  • Sigh. It’s not just a fricking driver. It’s an entire framebuffer you plug into a USB or Thunderbolt port. That’s why those adapters are more expensive, and why they even need a driver in the first place.

    A 1080p monitor has one quarter of the pixels of a 4K monitor, and the necessary bandwidth scales with the pixels driven (rough numbers below). Apple chose to spend the bandwidth they have on supporting a couple of 5K and 6K monitors instead of supporting, say, 8 or 10 1080p monitors.

    That’s a design decision they presumably thought made sense for the product they wanted to build, and honestly I mostly agree with it. Most people don’t run 8 monitors, very few run even 3, and those who do can buy the higher-end model or get an adapter like you did. If you are the kind of person to use 3 monitors, you probably also want the extra performance.
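
    Rough numbers, assuming 60 Hz and ignoring blanking and compression:

        1080p: 1920 x 1080 =  2.07 Mpx
        4K:    3840 x 2160 =  8.29 Mpx  (4x 1080p)
        5K:    5120 x 2880 = 14.75 Mpx  (~7x 1080p)
        6K:    6016 x 3384 = 20.36 Mpx  (~10x 1080p)

    Two 5K displays plus a 6K display is therefore roughly the pixel (and bandwidth) budget of twenty 1080p displays.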



  • Well yeah, no shit Sherlock. They could have done that in the first generation; it takes four 1080p monitors to equal the resolution of one 4K monitor. Apple just doesn’t have a good enough reason to support many low-res monitors. That’s not their typical customer base, which mostly uses Retina or other high-res displays, and Apple themselves only sell high-res displays. The display in the actual laptops is way above 1080p. In other words, they chose quality over quantity as a design decision.


  • Not really. There is a compromise between output resolution, refresh rate, bit depth (think HDR), number of displays, and overall system performance (rough formula below). Another computer might technically have more monitor outputs, but it probably sacrificed something to get there, like resolution, HDR, power consumption, or cost. Apple is doing 5K HDR output on their lowest-end chips. Think about that for a minute.

    A lot of people like to blame AMD for high idle power usage when they are running multi-monitor setups with mixed refresh rates and resolutions. Likewise, I have seen Intel systems struggle to drive a single 4K monitor because their RAM was in single-channel mode. Apple probably wanted to avoid those issues on their lower-end chips, which have much less bandwidth to play with.
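
    As a rough rule of thumb for uncompressed output (again ignoring blanking):

        bandwidth ≈ width × height × refresh rate × bits per pixel

        1080p @ 60 Hz, 8-bit:  1920 × 1080 × 60 × 24 ≈  3.0 Gbit/s
        4K    @ 60 Hz, 10-bit: 3840 × 2160 × 60 × 30 ≈ 14.9 Gbit/s
        5K    @ 60 Hz, 10-bit: 5120 × 2880 × 60 × 30 ≈ 26.5 Gbit/s

    Bumping any one factor multiplies the total, which is why “how many monitors” on its own is a meaningless spec.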


  • Not necessarily. The base machines aren’t that expensive, and this chip is also used in iPads, yet it supports high-resolution HDR output. The more monitors, resolution, bit depth, and refresh rate you add, the more bandwidth display output requires and the more complex and expensive the framebuffers become. Another system might support 3 or 4 monitors but not the 5K output the MacBooks manage. I’ve seen Intel systems that struggled to do even a single 4K monitor at 60 Hz until I added another RAM stick to make them dual channel; Apple is doing 5K. So sure, other machines might technically support more monitors in theory, but in practice you will run into limitations if those monitors demand too much bandwidth.

    Oh yeah, and these systems also share bandwidth between the framebuffers, the CPU, and the GPU (back-of-the-envelope numbers below). It’s no wonder they didn’t put 3 or more very-high-resolution framebuffers into the lower-end chips, which have less bandwidth than the higher-end ones. Even if it did work, the performance hit probably isn’t worth it for a small number of users.
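
    Back of the envelope for the shared-bandwidth point (60 Hz scanout at 4 bytes per pixel, against theoretical peak RAM figures):

        4K scanout: 3840 × 2160 × 60 × 4 B ≈ 2.0 GB/s, continuously
        Single-channel DDR4-2400: ~19.2 GB/s peak
        Dual-channel DDR4-2400:   ~38.4 GB/s peak

    Roughly a tenth of single-channel bandwidth goes to just refreshing one display, before the CPU and GPU do any work, which is exactly the kind of contention that shows up as stutter.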