Oh, just ask an Arch user about Manjaro.
The 8 dependencies must be optional dependencies for some other package you already have installed. That said, that kind of stuff is the main reason I want to try NixOS - any time I install or configure something, I'm risking forgetting about it and getting tripped up by it down the line, with no good way to check.
Man, and here I put too much effort writing a reply to a troll 😔
Does windows come preinstalled and preconfigured with more potentially vulnerable software on open ports?
I personally don’t value an antivirus that much, since it can only protect you from known threats, and even then, it only matters when you’re already getting compromised - but fair point for Windows, I suspect most distros come without antivirus preinstalled and preconfigured.
A firewall, on the other hand, only has value if you already have insecure services listening on your system - and I'm pretty sure the default Windows settings aren't going to block those services anyway. All that said, most Linux distros do come with a firewall, something like iptables or firewalld, though I'm not sure which ones preconfigure it to block incoming connections by default.
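For anyone curious what "preconfigured to block by default" would look like, here's a rough sketch using firewalld - this assumes a firewalld-based distro (Fedora, for example) and root access, so treat it as illustrative config rather than something to paste blindly:

```shell
# Sketch, assuming a firewalld-based distro; requires root
firewall-cmd --get-default-zone     # see which zone applies to new connections
firewall-cmd --list-all             # see what the current zone actually allows
# switch the default zone to "drop", which discards all unsolicited
# incoming traffic while still allowing outgoing connections
firewall-cmd --set-default-zone=drop
```

The `drop` zone is the strictest of firewalld's built-in zones; most distros that ship firewalld default to the more permissive `public` zone instead.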
So while I would dispute both of those points as not being that notable, I feel like other arguments in favor of Linux still stand, like reduced surface area, simpler kernel code, open and auditable source.
One big issue with Linux security for consumers (which I have to assume is what you’re talking about, since on the server side a sysadmin will want to configure any antivirus and firewall anyways) could be that different distributions will have different configurations - both for security and for preference-based things like desktop environments. This does unfortunately mean that users could find themselves installing less secure distros without realizing it, choosing them for their looks/usage patterns.
Question, how is Linux more insecure out of the box?
I think you’re actually agreeing with me here. I was disputing the claim that software should be made available in “a native package format”, and my counterpoint is that devs shouldn’t be packaging things for distros, and instead providing source code with build instructions, alongside whatever builds they can comfortably provide - primarily flatpak and appimage, in my example.
I don’t use flatpak, and I prefer to use packages with my distro’s package manager, but I definitely can’t expect every package to be available in that format. Flatpak and appimage, to my knowledge, are designed to be distro-agnostic and easily distributed by the software developer, so they’re probably the best options - flatpak better for long-term use, appimage usable for quickly trying out software or one-off utilities.
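To illustrate the difference in workflow between the two formats - the app ID and file name below are placeholders, not real software:

```shell
# Flatpak: installed from a remote (Flathub here) and updated like a package
flatpak install flathub org.example.App   # hypothetical app ID
flatpak update                            # updates every installed flatpak at once

# AppImage: a single self-contained file, nothing to install or track
chmod +x Example.AppImage                 # hypothetical file; just mark it executable
./Example.AppImage
```

That's roughly why flatpak suits long-term use (centralized updates) while an AppImage is handy for a quick one-off.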
As for tar.gz, these days software tends to be made available on GitHub and similar platforms, where you can fetch the source from git by commit, and releases also have autogenerated source downloads. Makefiles/automake isn’t a reasonable expectation these days, with a plethora of languages and build toolchains, but good, clear instructions are definitely something to include.
The responsibility to figure out the dependencies and packaging for distros, and then maintain those going forwards, should not be placed on the developer. If a developer wants to do that, then that’s fine - but if a developer just wants to provide source with solid build instructions, and then provide a flatpak, maybe an appimage, then that’s also perfectly fine.
In a sense, developers shouldn’t even be trusted to manage packaging for distributions - it’s usually not their area of expertise, maintainers of specific distributions will usually know better.
And reinstalling the packages, moving over all the configs, setting up the partitions and moving the data over? (Not in this order, of course)
Cloning a drive would just require you to plug both the old and new drives into the same machine, boot up (preferably from a live image to avoid issues), run a command, and wait until it finishes. Then maybe fix up the fstab and reinstall the bootloader, but those are things you'd need to do when installing the system anyway.
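As a sketch of what that looks like - the device names here are examples, so double-check yours with lsblk first, since dd will happily overwrite the wrong disk:

```shell
# Booted from a live image; /dev/sda = old drive, /dev/sdb = new drive (EXAMPLES - verify!)
lsblk                                    # confirm which device is which before anything else
sudo dd if=/dev/sda of=/dev/sdb bs=4M status=progress conv=fsync
# afterwards: check /etc/fstab on the cloned system (UUIDs may need updating
# if you repartition) and reinstall or re-point the bootloader
```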
I think the reason you'd want to reinstall is to save time, or to get a clean slate without any past config mistakes you've already forgotten about. I've done exactly that for the latter reason, especially since it was still my first, less experienced, install.
In addition to what was said by somebody else about atomic updates, even a simple update via package manager on a regular distro will do all the work up front, and not take extra time on next boot. Before you reboot, most things will continue working fine - and most of the remaining things that might not can be worked around.
“Calling out” gnome for needing extensions for customization seems stupid when those extensions are easy to find, easy to use, and work really well. On the other hand, I have not been able to find a taskbar for plasma that would let me group windows from an application together while also letting me rearrange the windows inside of a group. I know I need to try implementing it myself someday, but I feel like gnome ends up having more options.
I think Arch is meant for people who want to learn the software - so that you can also choose, control, customize, diagnose, and fix the software!
That said, archwiki is still a great resource on other distros for when something does go wrong, or when it’s not obvious how to do something, particularly when messing with experimental or server stuff.
That’s the point of the fediverse and activitypub - posts from one platform are federated to other compatible platforms. I know this also includes kbin, but there are probably other platforms too.
As others mentioned, archwiki is the information source if you want to use Arch, and a great source of information even if using other distributions.
For other distros, I’ve seen people mention Linux Journey.
All that said, you might not be able to drop Windows entirely - if we’re talking CAD software, the Adobe suite, that kind of stuff, you might not be able to find suitable alternatives for Linux. That said, you can always dual boot, or you might even be able to work with a VM.
If you do want to try a dual boot, I strongly recommend setting up the Linux boot partition on a separate physical drive, to minimize the risk of Windows overwriting it… as well as the risk of you accidentally messing up your Windows install. I’d also recommend rEFInd as the bootloader, since it’s very easy to set up and will automatically show a boot option for Windows.
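On Arch, for instance, setting rEFInd up is about two commands - the package name and the refind-install script match the Arch/rEFInd docs, but check your own distro's wiki before running anything:

```shell
# Sketch, assuming Arch with a mounted EFI system partition; requires root
sudo pacman -S refind     # install the package
sudo refind-install       # copies rEFInd to the ESP and registers an EFI boot entry
# at boot, rEFInd auto-detects other EFI loaders (including the Windows one),
# so no manual menu configuration is needed for dual boot
```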
Feel free to ask questions, I’m no expert, but I’ll try to answer when I have time.
I will happily recommend Arch to a new user… If they’re interested in learning Linux, and not dependent on it working reliably, while warning them of the risks and telling them about the advantages.
I wouldn’t recommend it to somebody who wants something that just works, but for tech-inclined people looking for a system they are in control (and responsibility) of, willing to learn how to set it up, I think a manual installation is a good experience.
But they will be warned.
It’s usually combined with a rectangle/window selection, thus clipping a part of your screen into an image/clipboard.
Not tools like screenshots and screen recording, though - because Wayland is inherently different, you couldn’t make those work in XWayland without sacrifices.
I think Wayland is in a good enough state to be a daily driver for most (non-NVidia) users, but there are still big caveats to keep in mind that can be deal breakers.
I’m pretty sure hotfixes are still being released. It’s more so that there are two release streams, stable and unstable, and when there isn’t a new unstable release, the unstable stream is just on the same version as the stable stream.
That’s the thing - none of those would’ve affected you negatively if you were using Nvidia. So if you’re just playing games and not following the news, you’re more likely to just hear people complain about AMD this, AMD that, they broke it… but everything works fine for you.
There’s nothing special about it. Linux distros are one of the options for a desktop system, alongside Windows and macOS.
What does differ is preferences, values, and affordability. Linux is generally free, and takes different approaches to how the system is structured, how software is installed, how much access to the system you have, and how much responsibility for setting it up falls on you.
This will also vary from distro to distro, but generally software is installed from the distribution’s repositories, not downloading files from various websites - and instead of having some different scheme for updating every program on your computer, you use a single command (or button in an app) to update your system and all your software. This is one of the main things I love about Linux - you get to update your stuff when you want, all at once.
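Concretely, the "single command" differs by distro family, but it's always one step - these are the common ones, run with root/sudo:

```shell
sudo pacman -Syu                       # Arch and derivatives
sudo apt update && sudo apt upgrade    # Debian/Ubuntu
sudo dnf upgrade                       # Fedora
flatpak update                         # plus flatpaks, if you use them
```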