I mean, bash is code.
Till next time
Use Gentoo and add -telemetry to your USE flags, so all software gets built without it - no spyware.
You can /s me later.
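Not every package actually exposes a telemetry USE flag, but for the ones that do, a minimal sketch of what that looks like in make.conf:

```shell
# /etc/portage/make.conf (sketch - only packages that expose a
# 'telemetry' USE flag are affected; check with `equery uses <pkg>`)
USE="${USE} -telemetry"
```

After changing USE flags, a `emerge --ask --changed-use --deep @world` rebuilds whatever is affected.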
Not anymore I’m not, you are correct.
Also wrote that earlier.
I will never expect Windows to respect any preference. Updates burnt me too many times.
Linux for life.
Reminded me how Windows and Linux interpret the hardware clock differently - Windows assumes it's set to local time, while Linux assumes UTC - so dual-booting kept skewing the clock.
It would make my blood boil; that's when I decided to never boot it again. 100% Linux everywhere, I get it on routers when I can.
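For anyone still dual-booting who hits this: one workaround (a sketch, assuming a systemd-based distro) is to tell Linux to treat the hardware clock as local time, matching Windows' default:

```shell
# Tell systemd the RTC holds local time (Windows' default).
# Note: systemd itself warns that UTC in the RTC is preferred.
timedatectl set-local-rtc 1

# Verify the setting
timedatectl | grep "RTC in local TZ"
```

The cleaner fix is usually the other way around: making Windows use UTC via its `RealTimeIsUniversal` registry value, so Linux can keep its default.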
Remember the South Park movie, when they wanted to invade, and when their visualisation failed, they called for Bill Gates?
In Europe you can return the Windows licence and get your money back… at least I think you should be able to.
Their GPUs have brute force, and are always top recommendations for gaming PCs. I buy all red, but feel like nVidia is still more popular among the gaming community, excluding Linux.
Disclaimer: I only think this is true, correct me if I’m wrong.
GPUs do floating-point math, in their own supported precisions. They have way more compute cores than typical CPUs do, because each core only does FP and is therefore much smaller. They also have their own memory on the board (VRAM), with much faster access than system RAM.
Usually we feed it multiple matrices (multi-dimensional arrays of numbers) and expect some back, while the entire load of math is done on the GPU. It can do millions of computations very fast, in parallel, never touching anything external like RAM or, goodness forbid, disk or network (which are monumentally slower).
Now CUDA is the ‘platform’ that lets you write general-purpose code for the GPU: it utilizes not only the compute cores on the card, but also handles moving data between the GPU, CPU and RAM.
This is a weak example, but back in the day, I mined BTC on my PC. It just runs SHA-256 hashes (not MD5, as I first remembered) until it finds a specific output. SHA-256 is just a mathematical algorithm, and you can run it on both CPU and GPU.
My CPU at the time (a 6-core Phenom II, I think?) could output 12 million hashes per second. My GPU - an AMD Radeon 6990 - after some tweaks, and with a full-size table fan blowing into the open chassis from the side, could get over 800 Mhash/s.
So there are direct incentives to use GPUs for cases other than gaming - machine learning in particular is all about floating-point math. But to do that, you want to write your own software implementing the algorithms, to squeeze every last bit of performance out of the hardware, and that’s what CUDA lets you do.
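To make it concrete, here is roughly what the simplest CUDA program looks like - a sketch assuming an nVidia card and the nvcc toolchain, adding two big vectors with one GPU thread per element:

```cuda
// Minimal CUDA sketch: add two float vectors on the GPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                      // a million floats
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);               // unified memory: visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;   // enough blocks to cover all n elements
    vecAdd<<<blocks, threads>>>(a, b, c, n);    // launch the kernel on the GPU
    cudaDeviceSynchronize();                    // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);                // prints c[0] = 3.000000
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The million additions in that one kernel launch all run in parallel across the GPU's cores, which is exactly why hashing and matrix math get such huge speedups there.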
CUDA is specific to nVidia GPUs. AMD is trying to catch up with ROCm, but it came almost a decade later, so they have a lot of catching up to do. Intel also started their own oneAPI. Both oneAPI and ROCm are open source, while CUDA is closed source, so only nVidia can modify it.
It can go anywhere if you want it to.
Special Fuck You to:
I only use dwm, so no idea how long it takes to compile KDE or GNOME on Gentoo.
Everything else is so quick. Just those four take 20-30 minutes each.
Sure, your experience may be different.
That happened in 2013 with a random laptop they gave me. I kid you not, it took that long; could have been a bug somewhere in the OEM image, never cared enough to find out.
But my experience is just as real as yours.
Yes, because even once is too many.
At a corporate job, I spent an hour and a half every morning waiting for Windows to update. Then my coworker handed me a Fedora DVD and I never looked back.
Don’t normal countries rebate the licence fee when you ask for a PC without an OS?
Many games have a much better experience under Linux.
Excellent write-up, thank you very much. I’m going to invest my time in learning Incus!
How is the development of LXD?
I am a huge fan of LXC, but I hate random daemons running (so no Docker for me). I have been looking at the Linux Containers website, and they mentioned Canonical taking LXD development under its wing, and something about no one participating apart from Canonical devs.
So I’m kind of scared about the future of LXC and Incus. Do you have any more information about that?
Yes, there is/was a setting for that, should be on by default.