Well, it was either that or “I’ve been using Unix for so long that my first text editor was ed”.
Yeah, but when it comes to RAM and Storage, the other golden rule is that the longer you delay your upgrade the cheaper it will be (assuming you’ll even need it) or the more you can get for the same money.
So there are two competing pulls in this.
The other day I got a Mini PC to use as a home server (including as media server with Kodi).
It has 8GB of RAM and came with some Windows (10 or 11); I didn’t even try it, just wiped it out and put Lubuntu on it plus a bunch of services along with Kodi.
Even though it’s running X in order to have Kodi there and Firefox is open and everything, it’s using slightly over 2GB of RAM.
I keep wanting to upgrade it to 16 GB, because, you know, I just like mucking about with hardware and there’s the whole new-toy feeling, but I look at the memory usage and just can’t bring myself to do it for fun, as it would be a completely useless upgrade and not even bright-eyed, ooh-shiny me can convince adult me to waste 60 bucks on something so utterly, completely useless.
Some man pages are just gigantic lists of unintuitive parameters in alphabetical order with no usage examples, and even if you know how to search for text in a man page (forward slash, then the text you want to search for) you’re just stabbing in the dark.
Others are excellent.
The problem with man pages is that you never know if you’re getting the former or the latter.
“We stopped applying antitrust laws because they were hurting trust in corporations”
I love how you just want to do something simple and very, very common and normal with a command but you don’t know the magic flags to get it to do it, and they’re not logical ones (like, say, “-a” for all), so you do a man for it and it has something like 50 flags listed in alphabetical rather than functional order, some of which only make sense in specific combinations (which are never shown together and have to be found by reading the entries for all 50 flags), and there are no examples anywhere to be found of normal usage scenarios for that command.
So that’s when you use some internet search engine and it turns out the most common simplest use of it is something like “doshit --lol --nokidding --verbose=3”.
The UK plugs, on the other hand, are the only shape of plug known to man that, when left on the ground, always ends up with the pins turned up, ready and eager to receive any foot not suitably protected by a hard shoe sole.
Getting X to work required mucking about with a text file where you specified parameters directing the operation of the electron gun inside a CRT monitor, parameters so down to the metal that you could create your own resolution or even blow up your monitor.
Ah, those were the days ;)
Whilst a 100W delta seems unlikely, a 50W delta seems realistic, as the kind of stuff you have in a NAS will use maybe 5W (about the same as a Raspberry Pi, possibly less) whilst the typical desktop PC uses significantly more even outside graphics mode (part of the reason to use Linux in text mode only is exactly to try and save power there). It mainly depends on what the desktop was used for before: a “gaming PC” with a dedicated graphics card from an old enough generation (i.e. with HW from back before the manufacturers of GPUs started competing on power usage) will use significantly more power than integrated graphics even in idle mode.
That said, making it a “home server” as you suggest makes a lot of sense - if that thing is an “All In One” server (media server, NAS, print server, torrent download server and so on) loaded with software of your choice (and hence stuff that respects your privacy and doesn’t shove Ads in your face) it’s probably a superior solution to getting those things as separate standalone devices, especially in the current era of enshittification.
A NAS is basically some software running on a computer, so you can use a desktop as that computer, ideally with a light operating system (for example, Linux in text only mode).
HOWEVER: desktops are designed for far higher computational loads than needed by a NAS, plus things like graphical user interfaces and direct connection of user peripherals such as mice, so even when idle they consume a lot more power than the kind of hardware used in a typical NAS.
Also the hardware in a good NAS will have things like extra, higher-speed connectors for HDDs/SSDs (such as SATA) rather than you having to use slower stuff like USB.
So keep in mind that a desktop used as a NAS will consume significantly more power than a dedicated NAS (as the latter will probably be running on something like an ARM and have a power supply dimensioned for a couple of HDDs, not to run a dedicated graphics card like a desktop has) and probably won’t fit as many disks.
If you’re ok with having most disks be accessed a bit slower and USB3 works for you (and, for example, if your NAS is on 100 Mbit Ethernet, it’s the network that’s the slowest thing, not USB3) then it’s usually better to use an old notebook rather than a desktop, because notebooks were designed for running off batteries and hence consume significantly less power.
Frankly I would advise against using an old desktop as a NAS mainly because in a year or two of continued use you’ll have paid enough in extra electricity costs, versus using a NAS, to pay for a simple but decent dedicated NAS.
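To put a rough number on that claim, here’s a back-of-the-envelope sketch; the ~50W delta and the electricity price are assumptions for illustration, not measurements, so plug in your own numbers:

```python
# Back-of-the-envelope: extra running cost of an old desktop as a NAS
# vs a dedicated low-power NAS. Assumed numbers, adjust for your case.
extra_watts = 50        # assumed extra idle draw of the desktop
price_per_kwh = 0.30    # assumed electricity price in EUR/kWh

hours_per_year = 24 * 365
extra_kwh_per_year = extra_watts * hours_per_year / 1000   # ~438 kWh
extra_cost_per_year = extra_kwh_per_year * price_per_kwh   # ~131 EUR

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")
print(f"Extra cost:   {extra_cost_per_year:.0f} EUR/year")
```

At those assumed numbers the extra draw alone comes to roughly 130 EUR a year, which is indeed in the ballpark of a simple dedicated NAS after a year or two.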
If you’re running a remote-controlled car you want something way down the power scale, like an ESP32 or even an ATtiny plus radio HW.
Mind you, I don’t disagree with your actual point, I just think the example you used wasn’t correct.
Sledgehammer!
Works every time. (1)
(1) might have unpleasant side effects (2)
(2) for definitions of the word “might” where the probability of that outcome is at least 5 nines.
Well, depending on how long one is trying to exit Vim and hence the level of frustration, exiting Vim might involve the use of a sledgehammer…
Speaking for myself, a long time ago, when I was younger and handsomer but dumber ;) some people at a certain stage of their lives have trouble remembering that what’s obvious to oneself, given the context one is in and the information one has, is not obvious to others.
I like to think most of us grow out of it.
The point is that a programmer would first need to think about what needs to be explained or not to the average user and then explain it properly, none of which is considered as interesting as coding.
It’s not by chance that even tools with an actual one line of explanation for each parameter are generally of the badly documented kind (I especially like the ones where the “help” for a command doesn’t say what the bloody command actually does).
I mean, you even see this kind of meaningless “documentation” in API documentation for widely used libraries, where the documentation is generated from comments embedded in the code: “public void doStuff(int height)” => “Does stuff. Parameters - height: the input height”.
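As a sketch of what that ends up looking like (a hypothetical function, Python-flavoured rather than the Java of the example above):

```python
def do_stuff(height: int) -> None:
    """Does stuff.

    :param height: the input height
    """
    # The generated API docs faithfully reproduce the docstring above,
    # which tells the reader nothing: what stuff? height of what? in which unit?
    ...
```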
I might have put it in a humorous way but this is quite a well-known and widespread phenomenon.
Oh, you sweet summer child, there is no level of ease for the average programmer that will make him or her want to document things… ;)
On a more serious note, good documentation for parameters in any tool that’s not stupidly simple tends to be more than a one-liner if one doesn’t assume that the user already knows a ton of context (for example, imagine explaining “chmod” parameters with just one-liners).
Programmers generally detest doing documentation, so when the user help “UI” is all down to a programmer to define, this is often what you get, especially if it’s a small tool.
It’s wonderful how the expression “humble Arch Linux user” manages to pack a contradiction in a mere 4 words.