Darkassassin07@lemmy.ca 🇨🇦

  • 2 Posts
  • 59 Comments
Joined 1 year ago
Cake day: July 1st, 2023



  • Sure, Cloudflare provides other security benefits, but that’s not what OP was talking about. They just wanted/liked the plug-and-play aspect, which doesn’t need Cloudflare.

    Those ‘benefits’ are also really not necessary for the vast majority of self-hosters. What are you hosting, from your home, that garners that kind of attention?

    The only things I host from home are private services for myself or a very limited group, which, as far as ‘attacks’ go, just get the occasional script kiddie looking for exposed endpoints. Nothing that needs mitigation.




  • I set up Borg around 4 months ago using option 1. I’ve messed around with it a bit, restoring a few backups, and haven’t run into any issues with corrupt/broken databases.

    I just used the example script provided by Borg, modified to include my Docker data and to write info to a log file instead of the console.

    Daily at midnight, a new backup of around 427GB of data is taken. At the moment that takes 2-15 min to complete, depending on how much data has changed since yesterday, though the initial backup was closer to 45 min. Then old backups are trimmed: backups <24hr old are kept, along with 7 dailies, 3 weeklies, and 6 monthlies. Anything outside that scope gets deleted.

    With the compression and de-duplication Borg does, the 15 backups I have so far (5.75TB of data) currently take up 255.74GB of space. 10/10 would recommend on that aspect alone.

    /edit, one note: I’m not backing up Docker volumes directly, though you could do so just fine. Anything I want backed up lives in a regular folder that’s then bind-mounted into a Docker container (including things like paperless-ngx’s database).
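
    For reference, a rough sketch of the shape of that job, based on Borg’s example backup script; the repo path, passphrase handling, source folders, and log path here are placeholders, not my actual values:

    ```bash
    #!/bin/sh
    # Sketch of a nightly Borg job: create a new archive, then prune old ones.
    export BORG_REPO=/mnt/backups/borg-repo   # placeholder repo location
    export BORG_PASSPHRASE='changeme'         # better: read from a root-only file

    LOG=/var/log/borg-backup.log

    {
        echo "=== Backup started: $(date) ==="

        # New archive named host + date; compression and de-duplication
        # are applied per chunk as data is written.
        borg create --stats --compression zlib \
            ::'{hostname}-{now:%Y-%m-%d}' \
            /etc /home /srv/docker-data

        # Keep everything <24h old, plus 7 daily, 3 weekly, and 6 monthly
        # archives; everything outside that scope is deleted.
        borg prune --list \
            --keep-within 24H \
            --keep-daily 7 --keep-weekly 3 --keep-monthly 6

        echo "=== Backup finished: $(date) ==="
    } >> "$LOG" 2>&1
    ```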







  • After reading this thread and a few other similar ones, I tried out BorgBackup and have been massively impressed with its efficiency.

    Data that hasn’t changed, has merely moved to a different location, or is otherwise identical to what’s already stored in the backup repository (whether in the backup currently being created or in any historical backup) isn’t replicated. Only the information required to link that existing data to its doppelgangers is stored.

    The original set of data I’ve got being backed up is around 270GB, and I currently have 13 backups of it. Raw, that’s 3.78TB of data. After just zlib compression, that’s down to 1.56TB. But the incredible bit is after de-duplication (the part described in the above paragraph): the data actually stored on disk for all 13 of those backups is 67.9GB.

    I can mount any one of those 13 backups on the filesystem, or extract any of that 3.78TB of files directly from a backup repository occupying just 67.9GB.
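
    Day to day that looks something like this (the repo path and archive names are placeholders; borg mount needs FUSE support):

    ```bash
    # Repo-wide and per-archive stats, including the de-duplicated size on disk.
    borg info /mnt/backups/borg-repo
    borg list /mnt/backups/borg-repo

    # Mount one archive read-only and browse it like a normal folder.
    borg mount /mnt/backups/borg-repo::myhost-2024-06-01 /mnt/restore
    # ...copy out whatever you need, then:
    borg umount /mnt/restore

    # Or pull a single path out of an archive without mounting anything.
    borg extract /mnt/backups/borg-repo::myhost-2024-06-01 home/me/documents
    ```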



  • Darkassassin07@lemmy.ca to linuxmemes@lemmy.world · “That’s LTT in the bottom” · 4 months ago

    I’d need a Windows system to put it in. The Drobo isn’t upgradeable beyond stuffing more drives into it, and the laptop is an old HP craptop…

    I’ve got a second desktop with USB 3 (the Drobo is USB 3), so that’d probably improve things, just not by a lot (pretty sure the slowdown is in the Samba share, but I need to do more testing to see where exactly the issue is), and I kinda want to keep that system free for other experiments.

    Idk, still thinking on it.
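
    For the testing bit, the rough idea is to time each leg separately (placeholder paths/hostnames; iperf3 has Windows builds, so it can run on the laptop serving Samba too):

    ```bash
    # 1. Raw read speed of the Drobo, on the machine it's plugged into.
    dd if=/srv/drobo/large-file.mkv of=/dev/null bs=1M count=2048 status=progress

    # 2. Pure network throughput, no disks involved.
    iperf3 -s                     # on the laptop serving Samba
    iperf3 -c laptop.lan -t 10    # on the client

    # 3. The same file read over the Samba mount; whichever of 1/2 it
    #    matches (or undershoots) points at the slow link.
    dd if=/mnt/samba/large-file.mkv of=/dev/null bs=1M count=2048 status=progress
    ```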


  • Darkassassin07@lemmy.ca to linuxmemes@lemmy.world · “That’s LTT in the bottom” · 4 months ago

    The only real issue I’ve had with Linux is trying to get my old Drobo 5C to work. (It’s a self-managed, dynamically adjustable/resizable RAID array that just presents itself as a single 70TB USB hard disk. The company that made them dissolved a few years ago.)

    It’s formatted as NTFS and loaded with 25TB+ of data from when I ran Windows primarily.

    It’ll mount and work temporarily, but quickly stops responding, leaving anything that tries to access it frozen, particularly Docker containers.

    Then it’ll drop into some internal data-recovery routine (it’s a ‘black box’ with very little user control; definitely wouldn’t be my choice again, but here we are), refusing to interact with the attached system for half an hour or so. When it finally comes back, Linux refuses to mount it: ‘dirty filesystem’, and ntfsfix won’t touch it either. Off to Windows and chkdsk, then rinse and repeat.
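
    That loop, in command form (the device node is a placeholder, and ntfsfix from the ntfs-3g package only resets the journal/dirty flag; it’s not a real consistency check):

    ```bash
    # Mount the Drobo's NTFS volume via ntfs-3g (placeholder device node).
    sudo mount -t ntfs-3g /dev/sdb1 /mnt/drobo
    # ...works briefly, then the device stalls and comes back 'dirty'.

    # ntfsfix would normally clear the dirty flag, but it's not a full
    # repair (and in my case it refused anyway):
    sudo ntfsfix --clear-dirty /dev/sdb1

    # An actual filesystem repair still means plugging it into Windows:
    #   chkdsk D: /f
    ```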

    I gave up when one of those attempts resulted in corrupt data (a bunch of MKVs that wouldn’t play from the beginning, but would play if you skipped past the first second or two). I can’t back up this data (no alternative storage, or funds to acquire it), so that was enough tempting of fate.

    I ended up attaching it to an old Windows laptop that’s now dedicated to serving it via Samba :(

    Really looking forward to setting up a proper RAID array eventually, but till then I’m stuck with 11Mbps. I’d love to rent storage temporarily so I can move the data and try a different filesystem on the Drobo…





  • Interesting; that I was not aware of. I’ve never run into a scenario where I’ve had to add/edit while offline.

    When using Vaultwarden however, you can be ‘offline’ as long as the client can still reach the server (i.e. they’re on the same LAN, or are the same machine). You’d still be fine to add/edit while your home WAN is out, for example, just not on the go.

    Plus there’s the no-internet package mentioned in that link, but it’s limited to the desktop application.


  • Bitwarden is (primarily) a single database synced between devices via a server. A copy is kept locally on each device you sign in to.

    Changes made to an offline copy will sync to the server and your other devices once back online (with the most recent change to each individual item winning if there are multiple changes across several devices). /edit: the local copy is for offline access to your passwords. Edits must be made with a connection to the server your account resides on, be that Bitwarden’s or your own.

    If you host your own sync server via Vaultwarden, you can easily maintain multiple databases (called vaults), either with multiple accounts or with a single account and the organizations feature (which lets you create vaults separate from your main one and share them with multiple accounts). You can do this with regular Bitwarden as well, but you have to pay for the privilege.

    Using Vaultwarden also gives you all the paid features of Bitwarden for free (as it’s self-hosted instead of using Bitwarden’s public servers).
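
    If you want to try it, spinning up a server is about a one-liner with the official vaultwarden/server image (the domain, port, and data path here are placeholders; you’ll want a TLS reverse proxy in front, since the clients expect HTTPS):

    ```bash
    # Minimal Vaultwarden server (placeholder values; put HTTPS in front).
    docker run -d --name vaultwarden \
      -e DOMAIN=https://vault.example.com \
      -v /srv/vaultwarden/data:/data \
      -p 127.0.0.1:8080:80 \
      --restart unless-stopped \
      vaultwarden/server:latest
    # Then point the Bitwarden apps/extensions at https://vault.example.com
    # (the self-hosted server URL option on the login screen).
    ```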

    I’ve been incredibly happy with it after setting it up ~3 months ago. Worth looking into.